Hi everyone
For quite a long time I've been thinking about the whole "Galaxy S can do 90 Mpolys per second" thing.
It sounds like total bull****.
So, after many, many hours of googling, and some unanswered mails to ImgTec, I'd like to know:
Can ANYONE provide any concrete info about the SGX540?
From one side I see declarations that the SGX540 can do 90 million polygons per second, and from the other side I see claims like "twice the performance of SGX530".
...but twice the performance of SGX530 is EXACTLY what the SGX535 has.
So is the 540 a rebrand of the 535? That can't be, so WHAT THE HELL is going on?
I'm seriously confused, and would be glad if anyone could shed light on the matter.
I asked a Samsung rep what the difference was and this is what I got:
Q: The Samsung Galaxy S uses the SGX540 vs the iPhone using the SGx535. The only data I can find seems like these two GPU's are very similar. Could you please highlight some of the differences between the SGX535 and the SGX540?
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I also tried getting in contact with ImgTec to find out an answer, but I haven't received a reply back. It's been two weeks now.
Also, the chip is obviously faster than Snapdragon with its Adreno 200 GPU. I don't know if Adreno supports TBDR; I just know it's derived from ATI's Xenos line. Also, the Galaxy S uses LPDDR2 RAM, so memory throughput is quite a bit faster, even though it's not *as* necessary with all the memory efficiencies between the Cortex A8 and TBDR on the SGX540.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I think that is the clue: cost saving for Samsung.
Besides, who needs a 2D accelerator with a CPU as fast as this one already is?
The HTC Athena (HTC Advantage) failed miserably at adding an ATI 2D accelerator that no programmers were able to take advantage of; in the end the CPU did all the work.
I'd imagine it's a 535 at 45nm. Just a guess; the CPU is also 45nm.
Having tried a few phones, the speed in games is far better, with much higher fps. There is a problem, though: we might have to wait for games that really test its power, as most are made to run on all phones.
This was the same problem with the Xbox and PS2. The Xbox had more power, but the PS2 was king, so games were made with its hardware in mind, which held the Xbox back; only now and then did an Xbox-only game come out that really made use of its power. Years later they changed places, and the 360 held the PS3 back (don't start on which is better, lol): the PS3 has to make do with 360 ports, but when it has a game made just for it you really get to see what it can do. Anyway, it's nice to know the Galaxy is future-proof game-wise, and I can't wait to see what it can do in future or what someone can port to it.
On a side note, I did read that videos run through the graphics chip, which is causing blocking in dark movies (not HD, lower-quality rips): something about it not reading the difference between shades of black. One guy found a way to turn the chip off and movies were all good; I guess the rest of us have to wait for firmware to sort this.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
smart move sammy
voodoochild2008-
I wouldn't say we'd have to wait that long.
Even today, Snapdragon devices don't do very well in games, since their fillrate is so low (133 Mpixels/s).
Even the Motorola Droid (SGX530 at 110MHz, roughly ~9 Mpolys/s and ~280 Mpixels/s at that frequency) fares MUCH better in games, and actually runs pretty much everything.
So I guess the best hardware is not yet being stretched, but weaker devices should be hitting the limit soon.
bl4ckdr4g00n- Why the hell should we care? I don't see any problem with 2D content and/or videos; everything flies at light speed.
Well, I can live in hope, and I guess Apple's mess (aka the iPhone 4) will help now that firms are heading more towards Android; I did read about one big firm in the USA dropping marketing for Apple and heading to Android, and well, that's what you get when you try to sell old ideas. It always made me laugh that the first iPhone did a 1-megapixel photo when others were on 3 megapixels, then it had no video when most others did, then they hyped it when it moved to a 3-megapixel camera and did video... OMG. OK, I am going to stop, as it makes my blood boil that people buy into Apple. Yes, they started the ball rolling, and good on them for that, but then they just sat back and started counting the money as others moved on. Oh, and when I bought my Galaxy the website did say "able to run games as powerful as the Xbox" (the old one), so is HALO too much to ask for? lol
Wait, so what about the Droid X vs the Galaxy S GPU? I know the Galaxy S is way ahead spec-wise... but the Droid X does have a dedicated GPU. Can anyone explain?
The Droid X still uses the SGX530, but in the Droid X, as opposed to the original Droid, it runs at the stock 200MHz (or at least 180).
At that speed it does 12-14 Mpolys/sec and can push out 400-500 Mpixels/sec.
Not too shabby.
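For what it's worth, those numbers line up with simply scaling the 110MHz Droid figures quoted earlier in the thread linearly with clock speed. A quick sketch (assuming perfectly linear scaling, which real GPUs only approximate, since memory bandwidth doesn't rise with core clock):

```python
# Rough estimate: scale SGX530 throughput linearly with core clock.
# Baseline figures (110 MHz, original Droid) are the ones quoted above;
# linear scaling is an assumption, not a vendor spec.

def scale_throughput(base_value, base_mhz, target_mhz):
    """Scale a throughput figure proportionally with clock frequency."""
    return base_value * target_mhz / base_mhz

base_mpolys = 9     # ~9 Mpolys/s at 110 MHz
base_mpixels = 280  # ~280 Mpixels/s at 110 MHz

print(scale_throughput(base_mpolys, 110, 200))   # ~16 Mpolys/s
print(scale_throughput(base_mpixels, 110, 200))  # ~509 Mpixels/s
```

The pixel figure lands right at the top of the quoted 400-500 range, while the polygon figure overshoots the quoted 12-14 slightly, which fits the usual caveat that vertex throughput depends on more than raw clock.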
The 535 is a downgrade from the 540. The 540 is the latest and greatest from the PowerVR line.
Samsung did not cost-cut; they've in fact spent MORE to get this chip into their Galaxy S line. No one else has the 540 besides Samsung.
Like I said, it's probably just a process shrink, which means our GPU uses less power and is possibly clocked higher.
P.S. Desktop graphics cards have been dropping dedicated 2D acceleration for years; removing it saves transistors for more 3D performance and lower power!
This worries me as well... Seems like it might not be as great as we thought. HOWEVER, again, this is a new device that might be fixed in firmware updates. The hardware is obviously stellar; something is holding it back.
Pika007 said:
The Droid X still uses the SGX530, but in the Droid X, as opposed to the original Droid, it runs at the stock 200MHz (or at least 180).
At that speed it does 12-14 Mpolys/sec and can push out 400-500 Mpixels/sec.
Not too shabby.
http://www.slashgear.com/droid-x-review-0793011/
"We benchmarked the DROID X using Quadrant, which measures processor, memory, I/O and 2D/3D graphics and combines them into a single numerical score. In Battery Saver mode, the DROID X scored 819, in Performance mode it scored 1,204, and in Smart mode it scored 963. In contrast, the Samsung Galaxy S running Android 2.1 – using Samsung’s own 1GHz Hummingbird CPU – scored 874, while a Google Nexus One running Android 2.2 – using Qualcomm’s 1GHz Snapdragon – scored 1,434. "
The N1's performance can be explained by the fact that it's running 2.2...
But the Droid X, even with the "inferior" GPU, outscored the Galaxy S? Why?
gdfnr123 said:
wait so what about the droid x vs the galaxy s gpu?? i know the galaxy s is way advanced in specs wise... the droid x does have a dedicated gpu can anyone explain??
Same here. I want to know which one has the better performance as well.
Besides that, does anyone know which CPU is better between the Droid X and the Galaxy S?
I know the OMAP chip on the original Droid can overclock to 1.2GHz from, what, 550MHz?
How about the CPUs on the Droid X and Galaxy S? Has anyone compared those chips? Which can overclock higher, and which is better overall?
Sorry about the poor English. Hope you guys can understand.
The CPU in the Droid X is a stock Cortex A8 running at 1GHz. The Samsung Hummingbird is a specialized version of the Cortex A8, reworked by Intrinsity, running at 1GHz.
Qualcomm likewise did a complete redesign of the Cortex A8 for its 1GHz Snapdragon CPU. While the original A8 could only be clocked at around 600MHz with a reasonable power drain, the reworked versions could be clocked higher while maintaining better power characteristics.
An untouched Cortex A8 can do more at the same frequency than a specialized stripped-down A8.
If anything the Samsung Galaxy S is better balanced, leveraging the SGX 540 as a video decoder as well. However, the Droid X should be quite snappy in most uses.
At the end of the day, you really shouldn't care too much about obsolescence. I mean, the Qualcomm dual-core Scorpion chip is probably coming out around December.
Smart phones are moving at a blisteringly fast pace.
TexUs-
I wouldn't take it too seriously.
Quadrant isn't too serious a benchmark; plus, I think you can blame it on the fact that 2D acceleration on the SGS is done by the processor, while the Droid X does 2D acceleration on the GPU.
I can assure you there is no way in hell the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo on all devices, let every device get past its growing pains, and test again, with more than one benchmark.
Pika007 said:
TexUs-
I wouldn't take it too seriously.
Quadrant isn't too serious a benchmark; plus, I think you can blame it on the fact that 2D acceleration on the SGS is done by the processor, while the Droid X does 2D acceleration on the GPU.
I can assure you there is no way in hell the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo on all devices, let every device get past its growing pains, and test again, with more than one benchmark.
The SGS might be falling behind in I/O speeds... It is well known that all the app data is stored on a slower internal SD-card partition... Has anyone tried the benchmarks with the lag fix?
Also, if only Android made use of the GPU to help render the UI... It's such a shame that the GPU only gets used in games...
Using the GPU to render the UI would take tons of battery power.
I prefer it being a bit less snappy but a whole lot easier on the battery.
thephawx said:
At the end of the day. You really shouldn't care too much about obsolescence. I mean the Qualcomm Dual-core scorpion chip is probably going to be coming out around December.
Smart phones are moving at a blisteringly fast pace.
Smartphones aren't, but batteries are.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2GHz devices sound nice on paper, but I worry whether battery technology can keep up.
TexUs said:
Smartphones aren't, but batteries are.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2GHz devices sound nice on paper, but I worry whether battery technology can keep up.
I think so. The battery will be the biggest issue for smartphones in the future if it just stays at 1500mAh or less.
A dual-core CPU could be fast, but power hungry as well.
I'm probably the only person on this planet that would ever download a 20.5-meg, 2426-page document titled "S5PC110 RISC Microprocessor User's Manual", but if there are other hardware freaks out there interested, here you go:
http://pdadb.net/index.php?m=repository&id=644&c=samsung_s5pc110_microprocessor_user_manual_1.00
As you may or may not know, the S5PC110, better known as Hummingbird, is the SoC (System on a Chip) that is the brain of your Epic. Now, when you have those moments when you really just gotta know the memory buffer size for your H.264 encoder or are dying to pore over a block diagram of your SGX540 GPU architecture, you can!
( Note: It does get a little bit dry at parts. Unless you're an ARM engineer, I suppose. )
Why aren't you working on porting CM6, or Gingerbread via CM7?? lol
now we can overclock the gpu
/sarcasm
cbusillo said:
Why aren't you working on porting CM6, or Gingerbread via CM7?? lol
Hah, because I know exactly squat about Android development. Hardware is more my thing, though if I find some spare time to play around with the Android SDK maybe that can change.
Sent from my SPH-D700 using XDA App
This actually is really exciting news. RISC architectures in general, and the ARM instruction set especially, are great, and honestly it would do the world a lot of good to kick the chains of x86.
Sent from my Nexus S with a keyboard
Interesting - the complete technical design of the Hummingbird chips.
After reading your blog about how Hummingbird got its extra performance, I still wonder at times: did we make the right choice in getting this phone, the Epic 4G (I bought one for $300 off contract and imported it to Canada), knowing that ARM Cortex A9 CPUs are coming around in just a couple of months? We know that in the real world, Hummingbird is more powerful than Snapdragon and the OMAP 3600 series, while benchmark scores tend not to reflect real-world performance.
Performance-wise: it's known that the out-of-order A9 parts are at least 30% faster clock-for-clock in real-world performance. There will be dual- and maybe quad-core implementations. What's really up in the air is the graphics performance of the A9 parts. There's now the PowerVR SGX 545, the Mali 400, and the Tegra 2.
Edit: There is also the successor, the Mali T-604. I don't expect to see this in a phone in the near future. Nor do I expect the Tegra 3. Maybe close to this time next year though.
sauron0101 said:
Interesting - the complete technical design of the Hummingbird chips.
After reading your blog about how Hummingbird got its extra performance, I still wonder at times: did we make the right choice in getting this phone, the Epic 4G (I bought one for $300 off contract and imported it to Canada), knowing that ARM Cortex A9 CPUs are coming around in just a couple of months? We know that in the real world, Hummingbird is more powerful than Snapdragon and the OMAP 3600 series, while benchmark scores tend not to reflect real-world performance.
Performance-wise: it's known that the out-of-order A9 parts are at least 30% faster clock-for-clock in real-world performance. There will be dual- and maybe quad-core implementations. What's really up in the air is the graphics performance of the A9 parts. There's now the PowerVR SGX 545, the Mali 400, and the Tegra 2.
Edit: There is also the successor, the Mali T-604. I don't expect to see this in a phone in the near future. Nor do I expect the Tegra 3. Maybe close to this time next year though.
You're always going to be playing catchup. I personally think the Epic has great hardware for the time. I mean, on Samsung's roadmap for 2012/13 is their Aquila processor, a quad-core 1.2GHz part; it's going to be endless catchup. Every year there will be something that completely overshadows the rest.
gTen said:
You're always going to be playing catchup. I personally think the Epic has great hardware for the time. I mean, on Samsung's roadmap for 2012/13 is their Aquila processor, a quad-core 1.2GHz part; it's going to be endless catchup. Every year there will be something that completely overshadows the rest.
No, but I mean, if you buy the latest technology when it's released, you'll be set for quite some time.
For example, if you were to buy one of the first Tegra 2 phones, it's unlikely that anything is going to beat it significantly until at least 2012, when the quad-core parts begin to emerge.
It takes a year or so from the time a CPU is announced to the time it gets deployed in a handset. For example, the Snapdragon was announced in late 2008 and the first phones (HD2) arrived about a year later. If you buy an A9 dual-core part early on, you should be set for some time.
Well, I got the Epic knowing Tegra 2 was coming in a few months with next-gen performance. I was badly in need of a new phone and the Epic, while not a Cortex A9, is no slouch.
Sent from my SPH-D700 using XDA App
sauron0101 said:
No, but I mean, if you buy the latest technology when it's released, you'll be set for quite some time.
For example, if you were to buy one of the first Tegra 2 phones, it's unlikely that anything is going to beat it significantly until at least 2012, when the quad-core parts begin to emerge.
That's relative; in terms of GPU performance our Hummingbird doesn't do badly at all. The GPU that TI chose to pair with the dual-core OMAP is effectively a PowerVR SGX540, and the Snapdragon rumored for the dual-cores next summer is also on par with our GPU performance. So yes, we will lose out to newer hardware, which is to be expected, but I wouldn't consider it a slouch either...
It takes a year or so from the time a CPU is announced to the time it gets deployed in a handset. For example, the Snapdragon was announced in late 2008 and the first phones (HD2) arrived about a year later. If you buy an A9 dual-core part early on, you should be set for some time.
The first phone was the TG01. That said, I guarantee you that a year, if not less, after the first Tegra release there will be a better processor out; it's bound to happen.
Edit: Some benchmarks for Tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I am not sure if it's using both cores or not... also, Tegra 2 I think renders to a 16-bit framebuffer, while Hummingbird renders 24-bit.
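That color-depth difference matters for memory traffic as well as image quality. A rough sketch of what each depth costs in framebuffer write bandwidth (the WVGA resolution and 60 Hz refresh are illustrative assumptions, not measured figures for either chip):

```python
# Back-of-the-envelope framebuffer bandwidth at different colour depths.
# Resolution and refresh rate are illustrative (WVGA @ 60 Hz), not measured.

def framebuffer_mb_per_s(width, height, bytes_per_pixel, fps=60):
    """MB/s written for one full-screen update stream at the given depth."""
    return width * height * bytes_per_pixel * fps / 1e6

wvga_16bit = framebuffer_mb_per_s(800, 480, 2)  # RGB565: 2 bytes/pixel
wvga_24bit = framebuffer_mb_per_s(800, 480, 3)  # RGB888: 3 bytes/pixel

print(f"16-bit: {wvga_16bit:.2f} MB/s, 24-bit: {wvga_24bit:.2f} MB/s")
```

So a 24-bit pipeline pays roughly 50% more framebuffer traffic than a 16-bit one at the same resolution, which is one reason benchmark numbers between the two aren't directly comparable.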
gTen said:
That's relative; in terms of GPU performance our Hummingbird doesn't do badly at all. The GPU that TI chose to pair with the dual-core OMAP is effectively a PowerVR SGX540, and the Snapdragon rumored for the dual-cores next summer is also on par with our GPU performance. So yes, we will lose out to newer hardware, which is to be expected, but I wouldn't consider it a slouch either...
The first phone was the TG01. That said, I guarantee you that a year, if not less, after the first Tegra release there will be a better processor out; it's bound to happen.
Edit: Some benchmarks for Tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I am not sure if it's using both cores or not... also, Tegra 2 I think renders to a 16-bit framebuffer, while Hummingbird renders 24-bit.
AFAIK, dual-core support is only fully supported by Honeycomb. But if you feel like buying into NVIDIA's explanation of Tegra 2 performance, check this out: http://www.nvidia.com/content/PDF/t...-Multi-core-CPUs-in-Mobile-Devices_Ver1.2.pdf
Electrofreak said:
AFAIK, dual-core support is only fully supported by Honeycomb. But if you feel like buying into NVIDIA's explanation of Tegra 2 performance, check this out: http://www.nvidia.com/content/PDF/t...-Multi-core-CPUs-in-Mobile-Devices_Ver1.2.pdf
I see; I actually read before that Gingerbread would allow for dual-core support, but I guess that was delayed to Honeycomb...
Either way, this would mean that even if a Tegra-based phone comes out, it won't be able to utilize both cores until at least mid next year.
I can't open PDFs right now, but I read a whitepaper comparing the performance of Hummingbird and Tegra 2, both single-core and dual-core; is that the same one?
One thing, though: Nvidia and ATI are well known for tweaking their graphics cards to perform well on benchmarks... I hope it's not the same with their CPUs :/
gTen said:
Edit: Some benchmarks for Tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I am not sure if it's using both cores or not... also, Tegra 2 I think renders to a 16-bit framebuffer, while Hummingbird renders 24-bit.
Here are some additional benchmarks comparing the Galaxy Tab to the Viewsonic G Tablet:
http://www.anandtech.com/show/4062/samsung-galaxy-tab-the-anandtech-review/5
It's possible that the Tegra 2 isn't optimized yet. Not to mention, Honeycomb will be the release that makes the most of dual cores. Still, the graphics gains look lackluster; most of the improvement seems to come purely from the CPU.
I'm not entirely sure that Neocore is representative of real-world performance either; it may have been optimized for some platforms. Furthermore, I would not be surprised if Neocore gave inflated scores for the Snapdragon and its Adreno graphics platform. Of course, neither is Quadrant representative.
I think real-world games like the Quake III-based titles are the way to go, although until we see more graphics-demanding games, I suppose there's little to test (we're expecting more games for Android next year).
Finally, we've gotten to a point for web browsing where it's the data connection (HSPA+, LTE, or WiMAX) that dictates how fast pages load. It's like upgrading the CPU in a PC: I currently run an overclocked Q6600, and if I were to upgrade to, say, Sandy Bridge when it comes out next year, I wouldn't expect significant improvements in real-world browsing.
Eventually, the smartphone market will face the same problem the PC market does. Apart from us enthusiasts who enjoy benchmarking and overclocking, apart from high-end gaming, and perhaps some specialized operations (like video encoding, which I do a bit of), you really don't need the latest and greatest CPU or 6+ GB of RAM (which many new desktops come with). Same with high-end GPUs. Storage follows the same dilemma: I imagine that as storage grows, I'll be storing FLAC music files instead of AAC, MP3, or OGG, plus more video, and I will also use my cell phone to replace my USB key drive. Otherwise, there's no need for bigger storage.
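To put the FLAC-versus-lossy point in numbers, a quick sketch (the bitrates and library size are illustrative ballpark assumptions, not measurements; FLAC is variable-rate, so its average depends on the material):

```python
# Rough storage estimate for a music library at different encodings.
# Bitrates are assumed averages: ~1000 kbps is a common ballpark for
# FLAC rips of CDs, 256 kbps for high-quality MP3/AAC.

def library_size_gb(tracks, minutes_per_track, kbps):
    """Approximate library size in GB for a given average bitrate."""
    bits = tracks * minutes_per_track * 60 * kbps * 1000
    return bits / 8 / 1e9

mp3 = library_size_gb(2000, 4, 256)    # ~2000 four-minute tracks as MP3
flac = library_size_gb(2000, 4, 1000)  # same library as FLAC

print(f"MP3: {mp3:.1f} GB, FLAC: {flac:.1f} GB")
```

Under those assumptions a 2000-track library grows from roughly 15 GB to 60 GB, so lossless really does need the next generation of storage to be practical on a phone.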
gTen said:
I see; I actually read before that Gingerbread would allow for dual-core support, but I guess that was delayed to Honeycomb...
Either way, this would mean that even if a Tegra-based phone comes out, it won't be able to utilize both cores until at least mid next year.
I can't open PDFs right now, but I read a whitepaper comparing the performance of Hummingbird and Tegra 2, both single-core and dual-core; is that the same one?
One thing, though: Nvidia and ATI are well known for tweaking their graphics cards to perform well on benchmarks... I hope it's not the same with their CPUs :/
Gingerbread doesn't have any dual-core optimizations. It has some JIT improvements in addition to some other minor enhancements, but according to rumor, Honeycomb is where it's at, and it's why the major tablet manufacturers are holding off releasing their Tegra 2 tablets until it's released.
And yeah, that paper shows the performance of several different Cortex A8s (including Hummingbird) compared to Tegra 2, and then goes on to compare Tegra 2 single-core performance vs dual.
Electrofreak said:
Gingerbread doesn't have any dual-core optimizations. It has some JIT improvements in addition to some other minor enhancements, but according to rumor, Honeycomb is where it's at, and it's why the major tablet manufacturers are holding off releasing their Tegra 2 tablets until it's released.
And yeah, that paper shows the performance of several different Cortex A8s (including Hummingbird) compared to Tegra 2, and then goes on to compare Tegra 2 single-core performance vs dual.
I looked at:
http://androidandme.com/2010/11/new...u-will-want-to-buy-a-dual-core-mobile-device/
since I can't access the PDF. Does the whitepaper state which Android version they used for the tests? If they used 2.1 on the SGS and Honeycomb on the Tegra 2, for example, it wouldn't exactly be a fair comparison. Do they also give actual FPS, rather than percentages? We are FPS-capped, for example.
Lastly, in the test does it say whether the Tegra 2 was dithering to 16-bit or rendering 24-bit?
gTen said:
I looked at:
http://androidandme.com/2010/11/new...u-will-want-to-buy-a-dual-core-mobile-device/
I'm one of Taylor's (unofficial) tech consultants, and I spoke with him regarding that article. Though, credit where it's due to Taylor, he's been digging stuff up recently that I don't have a clue about. We've talked about Honeycomb and dual-core tablets, and since Honeycomb will be the first release of Android to support tablets officially, and since Motorola seems to be holding back the release of its Tegra 2 tablet until Honeycomb (quickly checks AndroidAndMe to make sure I haven't said anything Taylor hasn't already said), and rumors say that Honeycomb will have dual-core support, it all makes sense.
But yes, the whitepaper is the one he used to base that article on.
gTen said:
since I can't access the PDF. Does the whitepaper state which Android version they used for the tests? If they used 2.1 on the SGS and Honeycomb on the Tegra 2, for example, it wouldn't exactly be a fair comparison. Do they also give actual FPS, rather than percentages? We are FPS-capped, for example.
Lastly, in the test does it say whether the Tegra 2 was dithering to 16-bit or rendering 24-bit?
Android 2.2 was used in all of their tests, according to the footnotes in the document. While I believe Android 2.2 can use both cores simultaneously, I don't believe it can schedule a single app's threads across them effectively. But that's just my theory; I'm going off what the Gingerbread documentation from Google says, and unfortunately there is no mention of improved multi-core processor support in Gingerbread.
http://developer.android.com/sdk/android-2.3-highlights.html
As for FPS and the dithering, they don't really go there; the whitepaper is clearly focused on CPU performance, so it features benchmark scores and timed results. I take it all with a pinch of salt anyhow; despite the graphs and such, it's still basically an NVIDIA advertisement.
That said, Taylor has been to one of their expos (or whatever you call it), and he's convinced that the Tegra 2 GPU will perform several times better than the SGX 540 in the Galaxy S phones. I'm not so sure I'm convinced... I've seen comparable performance benchmarks come from the LG Tegra 2 phone, but Taylor claims it was an early build and he's seen even better performance. Time will tell, I suppose...
EDIT - As for not being able to access the .pdfs, what are you talking about?! XDA app / browser and Adobe Reader!
Hey folks. Samsung is going back to PowerVR as the graphics force behind its next-gen SoCs (A15 SoCs?). With HD Super AMOLED Plus and a PowerVR 6XX (not too sure if it's 6XX or 5XX) series GPU to power them, it will definitely be another great year for Samsung and Android. I personally can't wait.
What do you guys think?
A similar SoC is the ST-Ericsson Nova A9600, which is also an A15 with a PowerVR Series 6 GPU; read here for discussion on the NOVA.
awesome-member said:
Hey folks. Samsung is going back to PowerVR as the graphics force behind its next-gen SoCs (A15 SoCs?). With HD Super AMOLED Plus and a PowerVR 6XX (not too sure if it's 6XX or 5XX) series GPU to power them, it will definitely be another great year for Samsung and Android. I personally can't wait.
What do you guys think?
Where did ya get this from? I thought it was reported that it would be Cortex A15 + Mali T604?
Logi_Ca1 said:
Where did ya get this from? I thought it was reported that it would be Cortex A15 + Mali T604?
I have the evidence. I'll be more than happy to share it with a mod, but I won't release it to the general public (for obvious reasons).
Source please?
Sent from my GT-I9100 using XDA App
awesome-member said:
I have the evidence. I'll be more than happy to share it with a mod, but I won't release it to the general public (for obvious reasons).
Well ok... Personally I don't care either way, I just hope they go for whatever has the best performance/power consumption ratio.
WagTwo said:
Source please?
Sent from my GT-I9100 using XDA App
I assure you it's legit, and it specifically says Samsung is moving away from Mali.
Logi_Ca1 said:
Well ok... Personally I don't care either way, I just hope they go for whatever has the best performance/power consumption ratio.
Don't care?! Remember when the SGS2 first launched, how many games/apps were incompatible? A significant number of apps I'd bought while I had the SGS did not work on my SGS2. Things are getting better now, but using a GPU similar to the one found in iOS devices and the PS Vita does make a difference, and I as a consumer will have more options; it's not just limited to games but extends to all other apps that use OpenGL.
awesome-member said:
I assure you it's legit, and it specifically says Samsung is moving away from Mali.
It makes sense; I recently read news that Samsung licensed some PowerVR GPUs.
But I do hope it's an SGX 6XX; anything else would be a disappointment (for me).
I also hope that it's going to be a dual-core (1.6GHz) Cortex A15; if it is, it will be way faster than a quad-core Cortex A9, especially when you consider that applications are only starting to support dual-cores right now.
wurzelsepp3 said:
It makes sense; I recently read news that Samsung licensed some PowerVR GPUs.
But I do hope it's an SGX 6XX; anything else would be a disappointment (for me).
I also hope that it's going to be a dual-core (1.6GHz) Cortex A15; if it is, it will be way faster than a quad-core Cortex A9, especially when you consider that applications are only starting to support dual-cores right now.
We might still see newer quad-core A9s with Mali, but by late 2012 we should expect Samsung to come out with an A15 with PowerVR, since the A6 (for the iPad 3) is widely rumored to be A15-based, and we all know who makes the A5 for Apple. We should see the Samsung version of the A15 in 2012.
Also, if you remember, the Exynos/Orion was delayed, and it was reported (not officially, though) that the reason was 'problems with its graphics unit'.
I like PowerVR more than Mali.
Any mods that can confirm the information?
Sent from my GT-I9100 using XDA App
I'd prefer it if it were a PowerVR 6XX; easier support from developers, since iPhones use PowerVR too.
awesome-member said:
Hey folks. Samsung is going back to PowerVR as the graphics force behind its next-gen SoCs (A15 SoCs?). With HD Super AMOLED Plus and a PowerVR 6XX (not too sure if it's 6XX or 5XX) series GPU to power them, it will definitely be another great year for Samsung and Android. I personally can't wait.
What do you guys think?
I'm indifferent either way. The Mali 400 in my S2 can keep up with all the games, and I'm not seeing a trend towards better-quality graphics, simply because the screens on our devices don't support it; that might change with ICS and the 720p screen on the Galaxy Nexus.
For the future it's wait and see. Usually better graphics means more power consumption, and that's a trade-off I'm not willing to make.
They've opted for both the next gen of PowerVR and Mali chips, so it could be either.
OP is not wrong, but he is not right either.
GIR said:
I'm indifferent either way. The Mali 400 in my S2 can keep up with all the games, and I'm not seeing a trend towards better-quality graphics, simply because the screens on our devices don't support it; that might change with ICS and the 720p screen on the Galaxy Nexus.
For the future it's wait and see. Usually better graphics means more power consumption, and that's a trade-off I'm not willing to make.
I'm personally hoping for a usable NDS or PSP emulator in 2012.
Power consumption should be fine, considering that the jump from 45nm (conventional SiO2 gate dielectric) to 32/28nm HKMG is a huge one.
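The power side of that node jump can be roughed out with the usual dynamic-power relation, P ∝ C·V²·f. Here is a toy Python sketch; the capacitance and voltage numbers are illustrative assumptions, not published figures for Samsung's 45nm or 32nm HKMG processes:

```python
# Rough dynamic-power model: P is proportional to C * V^2 * f.
# The capacitance and voltage figures below are illustrative guesses,
# not real process numbers.

def dynamic_power(cap_rel, voltage, freq_ghz):
    """Relative dynamic power for given switched capacitance, supply voltage, clock."""
    return cap_rel * voltage ** 2 * freq_ghz

# Assume the 32nm HKMG node cuts switched capacitance ~30% and lets the
# core run at ~1.0V instead of ~1.1V at the same 1.2GHz clock.
p_45nm = dynamic_power(1.0, 1.1, 1.2)
p_32nm = dynamic_power(0.7, 1.0, 1.2)
print(f"32nm uses ~{p_32nm / p_45nm:.0%} of the 45nm dynamic power")
```

Even with conservative guesses, the lower supply voltage (squared in the formula) buys a large chunk of the savings on its own.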
Rawat said:
They've opted for both the next gen of PowerVR and Mali chips, so it could be either.
OP is not wrong, but he is not right either.
I am aware that Samsung licenses both Mali and PowerVR, but the source specifically says that they are "moving away from Mali". We may see a few versions of Exynos with Mali powering some future devices, but their flagship devices will have PowerVR.
WTF, guys, this is exclusive news to XDA and you rate it 2 stars. Just because you are not getting the news from Engadget? There were reports about Samsung licensing both PowerVR and Mali, but I have never seen one that says which way Samsung was heading graphics-wise.
At best, this is an unsubstantiated rumour. We'll know more about Samsung's SoC plans when they unveil the Galaxy S III at MWC.
tbqh, your news hardly seems reliable, and even if it were, it doesn't really matter. Samsung has used PowerVR for many of its SoCs, and the Mali 400 only for the Exynos 4210.
Samsung has announced and is sampling two newer SoCs: the Exynos 4212 and the Exynos 5250.
Exynos 4210 - dual-core Cortex A9 @ 1.2ghz, Mali 400 MP4 GPU, 45nm process.
Exynos 4212 - dual-core Cortex A9 @ 1.5ghz, Mali 400 MP4 GPU, 32nm process. (GPU Speculated, not officially disclosed)
Exynos 5250 - dual-core Cortex A15 @ 2.0ghz, Mali T604 GPU, 32nm process.(GPU Speculated, not officially disclosed)
Here's the thing about sampling/testing. SOCs typically have to be sampled for 6 months (or more) before they show up in phones. 4212 started sampling in September/October and 5250 started sampling in November. That means that both should be available for the typical Galaxy S launch window (my bet is on 5250).
If Samsung does go back to PowerVR, by the time they start sampling this SOC they would have already missed the Galaxy SIII launch window. So, I find this unsourced information interesting but highly unlikely at this point. Once you can reveal your source(s) we'll be able to judge this more accurately. I appreciate the info and respect your need to conceal your source(s) at this time.
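The ~6-month sampling rule of thumb above can be sketched as a quick date calculation. The sampling-start dates are the ones claimed in the post, and the 6-month figure is the post's own rule of thumb, not an official number:

```python
from datetime import date

# Rule of thumb from the post: an SoC samples for ~6 months before it
# can ship in phones. Sampling-start dates are as claimed above.
SAMPLING_MONTHS = 6

def earliest_phone_launch(sampling_start: date) -> date:
    """Add ~6 months of sampling to get the earliest plausible launch month."""
    month = sampling_start.month - 1 + SAMPLING_MONTHS
    return date(sampling_start.year + month // 12, month % 12 + 1, 1)

print(earliest_phone_launch(date(2011, 10, 1)))  # Exynos 4212 -> 2012-04-01
print(earliest_phone_launch(date(2011, 11, 1)))  # Exynos 5250 -> 2012-05-01
```

Both results land squarely in the spring launch window the post describes, which is why a PowerVR part that has not even started sampling would miss it.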
jaykresge said:
Samsung has announced and is sampling two newer SOCs; Exynos 4212 and Exynos 5250.
Exynos 4210 - dual-core Cortex A9 @ 1.2ghz, Mali 400 MP GPU, 45nm process.
Exynos 4212 - dual-core Cortex A9 @ 1.5ghz, Mali 400 MP GPU, 32nm process.
Exynos 5250 - dual-core Cortex A15 @ 2.0ghz, Mali T604 GPU, 32nm process.
Here's the thing about sampling/testing. SOCs typically have to be sampled for 6 months (or more) before they show up in phones. 4212 started sampling in September/October and 5250 started sampling in November. That means that both should be available for the typical Galaxy S launch window (my bet is on 5250).
If Samsung does go back to PowerVR, by the time they start sampling this SOC they would have already missed the Galaxy SIII launch window. So, I find this unsourced information interesting but highly unlikely at this point. Once you can reveal your source(s) we'll be able to judge this more accurately. I appreciate the info and respect your need to conceal your source(s) at this time.
I agree with your post. It is highly likely that the SGS3 will sport an Exynos 5250. But within the next year I think we will see a high-res 3D tablet, since Samsung is also a TV manufacturer (we all know that 3D TV is the TV for 2012 /s) and it makes sense to lure more people into 3D (a 3D ecosystem?). To achieve that kind of graphical horsepower, PowerVR Series 6 is the only option they have. My bet is we will see a high-res 3D tablet by Q4 2012 sporting an SoC with PowerVR Series 6.
I for one can't wait to hear about Apple's new A6 chip.
AnandTech originally reported that it was an A15-based dual core, which would be a major design win for Apple, since that would be what... 6 months before you'd see any Android phone with an A15 SoC out in any substantial numbers!
But now AnandTech is reporting that Apple made its own CPU, closely modeled on the A15. Sort of a Krait on steroids, if you will.
That choice was apparently the only way they could get close to twice the performance without sacrificing battery life.
The GPU is either the same quad-core SGX543 from the new iPad or a version of that chip with three cores. Either way, this means serious GPU muscle for the iPhone 5. I for one sure was blown away by the graphics in the Real Racing 3 demo!
It's really exciting, because it'll push development on all platforms forward. It's getting boring to always see the same 2-3 SoC combinations on Android phones (all the top phones have the same CPU/GPU inside them these days), and it will mean that Android handsets again have their work cut out for them in terms of catching up to Apple. Three or four GPU cores in a phone is crazy powerful!
What do people think powers the iPhone 5/A6? A higher-clocked dual-core A9? A quad-core A9? Apple's own custom CPU?
vszulc said:
I for one can't wait to hear about Apple's new A6 chip.
AnandTech originally reported that it was an A15-based dual core, which would be a major design win for Apple, since that would be what... 6 months before you'd see any Android phone with an A15 SoC out in any substantial numbers!
But now AnandTech is reporting that Apple made its own CPU, closely modeled on the A15. Sort of a Krait on steroids, if you will.
That choice was apparently the only way they could get close to twice the performance without sacrificing battery life.
The GPU is either the same quad-core SGX543 from the new iPad or a version of that chip with three cores. Either way, this means serious GPU muscle for the iPhone 5. I for one sure was blown away by the graphics in the Real Racing 3 demo!
It's really exciting, because it'll push development on all platforms forward. It's getting boring to always see the same 2-3 SoC combinations on Android phones (all the top phones have the same CPU/GPU inside them these days), and it will mean that Android handsets again have their work cut out for them in terms of catching up to Apple. Three or four GPU cores in a phone is crazy powerful!
What do people think powers the iPhone 5/A6? A higher-clocked dual-core A9? A quad-core A9? Apple's own custom CPU?
A dual-core A15 at 1GHz. But the real answer is: TOTAL CRAP.
Please use our sister site for discussing Apple products.
https://www.iphone-developers.com
Hey everyone.
I'm a bit lost and don't know which one to buy: the I9500 or the I9505.
So far I know that the Adreno 320 is fully OpenGL ES 3.0 compatible, while the PowerVR SGX544MP3 is not.
The Adreno 320 scores 4 FPS more than the PowerVR in the GLBenchmark 2.7.0 T-Rex test.
The PowerVR scores 1-2 FPS more in GLBenchmark 2.5 Egypt.
Both GPUs score the same in the AnTuTu and Quadrant video tests, with the PowerVR slightly better for a few seconds (the Adreno drops to 30 FPS for 1-2 seconds of the test, while the PowerVR stays constant at 50-60).
In AnTuTu's 3rd test (the one with the DNA code), the Adreno 320 stays at 30-40 FPS while the PowerVR scores a constant 60.
Both 3DMark and GLBenchmark show the PowerVR in the S4 as even weaker than the Nexus 4 and some Chinese phones.
What's the deal... what the hell is happening? Is PowerVR that weak in the newer graphics tests even though it scores well in the older ones?
Also, is there any OpenGL ES 3.0 benchmark so we can compare the Adreno 320 (fully OpenGL ES 3.0) with the PowerVR SGX544MP3 (OpenGL ES 2.0, but with some 3.0 features exposed through an extension API), to see what the score and quality are? I really want to see what that 3.0 API can actually do, as Imagination doesn't really say. Will there be games or apps using only OpenGL ES 3.0 that we will have trouble running because of this older GPU?
I'm wondering... if an OpenGL ES 3.0-only game is released in a year, what will happen with the S4 Octa? It will not be able to play it, right? I have no idea how OpenGL versioning works, but I remember that a game requiring DirectX 10 will not work with DirectX 9.
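The DirectX analogy does carry over to OpenGL ES: an app asks the driver for a context of a given version, and an ES 3.0-only title simply won't get one from an ES 2.0 driver. On Android this is ultimately reported through `glGetString(GL_VERSION)`; here is a plain-Python sketch of the version-string check, where the sample driver strings are made up for illustration:

```python
import re

# Sketch of how an app could gate ES 3.0-only code paths by parsing the
# string returned by glGetString(GL_VERSION). The sample strings below
# are illustrative; real drivers append vendor-specific suffixes.

def gles_version(version_string: str) -> tuple:
    """Extract (major, minor) from an 'OpenGL ES N.M ...' version string."""
    m = re.match(r"OpenGL ES(?:-CM)? (\d+)\.(\d+)", version_string)
    if not m:
        raise ValueError(f"unrecognized version string: {version_string!r}")
    return int(m.group(1)), int(m.group(2))

def supports_es3(version_string: str) -> bool:
    return gles_version(version_string) >= (3, 0)

print(supports_es3("OpenGL ES 3.0 V@14.0"))             # Adreno 320-style -> True
print(supports_es3("OpenGL ES 2.0 build 1.9@2139099"))  # SGX544-style -> False
```

So yes: an ES 3.0-only game would refuse to start on an ES 2.0-only GPU rather than run with a lower score.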
PowerVR really sucks. Samsung should have put in the PowerVR Series 6 "Rogue".
My opinion is that the Qualcomm scores very well; even my S3 is enough to play every single game, but the phone is short on RAM, and that's why I'm replacing it now. Buying the Octa will cost me $150 more than the Qualcomm version, and I would need to ship it overseas if I ever have a warranty problem. With those $150 I could buy 2 spare batteries and the Samsung S Band instead. I want the Octa, but does this phone really deserve such attention with that old, rubbish PowerVR GPU? I don't have 4G in my area, so I don't care about it, though it would be nice if I travel somewhere with 4G; for me HSPA+ is fast enough. So the only things that count here are the CPU, the GPU, and battery life. Battery life can be solved with a spare battery, so that leaves the GPU and the CPU. The A15 cores are very fast but can use a lot of energy, so I might get 2 days of battery life texting and calling but only 2 hours playing games and watching 1080p video, while with the A9 I would get something similar to the S3.
Any developer or experienced guy here can answer me to this questions ?
Nobody?
I'm in the same situation. I'm still deciding which version I should buy...
We need a user with the Exynos Galaxy S4 and one with the Snapdragon. They should run the same tests (Linpack, Vellamo, AnTuTu, and so on) and give us the results.
For OpenGL ES 3.0, I think it's better to have native support, not support via APIs. Also, on the Snapdragon we can reach the same performance as the Exynos via overclocking, and more. I find the Snapdragon more tweakable than the Exynos, but the PowerVR is still a good GPU.
Alberto96 said:
I'm in the same situation. I'm still deciding which version I should buy...
We need a user with the Exynos Galaxy S4 and one with the Snapdragon. They should run the same tests (Linpack, Vellamo, AnTuTu, and so on) and give us the results.
For OpenGL ES 3.0, I think it's better to have native support, not support via APIs. Also, on the Snapdragon we can reach the same performance as the Exynos via overclocking, and more. I find the Snapdragon more tweakable than the Exynos, but the PowerVR is still a good GPU.
Totally agree with you. I don't get why people say the PowerVR is better. I see that it scores better than the Adreno in AnTuTu, but in GLBenchmark it's awful. This is my only worry right now: what happens if we put the two GPUs through a full OpenGL ES 3.0 test? Will it throw an error, or pass with a lower score? I don't care about the score so much as its ability to pass the test. If it passes, I'm sold on the Octa.
Also, I found that the Octa supports LPDDR3 at 800MHz, which means 12.8GB/s of bandwidth, while the S600 uses LPDDR3 at only 600MHz or so (around 9.6GB/s).
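Those bandwidth figures follow from the usual peak-bandwidth formula: effective transfer rate (clock x 2 for DDR) times bus width in bytes. A quick sketch, assuming the common 2 x 32-bit (64-bit total) phone memory bus:

```python
# Peak DRAM bandwidth: effective transfers = clock * 2 (DDR), times bus
# width in bytes. A 64-bit bus (2 x 32-bit channels) is assumed here.

def peak_bandwidth_gb_s(clock_mhz: float, bus_bits: int = 64) -> float:
    return clock_mhz * 2 * (bus_bits // 8) / 1000.0

print(peak_bandwidth_gb_s(800))  # LPDDR3 @ 800MHz -> 12.8 GB/s
print(peak_bandwidth_gb_s(600))  # LPDDR3 @ 600MHz -> 9.6 GB/s
```

The 800MHz figure reproduces the 12.8GB/s quoted for the Octa; at 600MHz the same formula gives 9.6GB/s, close to the S600 figure mentioned above.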
Sent from my GT-I9300 using Tapatalk 2
I just read (on an Italian forum) that the Exynos will in the future be able to use all 8 cores together with kernel 3.8.
So... I think I will buy the Exynos. I'm just waiting for a reply from a friend who bought it on Expansys USA. If he receives it and all is good, I will buy it from that site. With Italian taxes (21%) and shipping costs it will come to about 730-740€.
Alberto96 said:
I just read (on an Italian forum) that the Exynos will in the future be able to use all 8 cores together with kernel 3.8.
Why would you need those eight cores working together? How can you be sure Android will dispatch your application's threads among them properly? Just another headache. I also don't believe they will really help save battery; it's pure marketing. But the A15 is a bit more powerful than the Krait in the S600.
I think the PowerVR 544MP3 scores below the Adreno 320 in T-Rex because of the unified shader architecture in the Adreno. That test uses complex shaders on every surface, so the Octa's GPU probably runs out of fragment processors.
If you don't need a new phone right now, wait for the S800 models. I don't think the Mali T65x is good enough either. Looking at the S3's GPU: yes, it's pretty fast in certain tasks such as rendering to texture, but it has some weird bottlenecks that make Horn and T-Rex much slower in FPS than I expected from the raw GFLOPS figures.
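The unified-shader point can be illustrated with a toy throughput model: a fixed vertex/fragment split stalls when one pool is oversubscribed, while a unified pool absorbs whatever mix the frame throws at it. All ALU counts and work units below are hypothetical, not real SGX544MP3 or Adreno 320 specs:

```python
# Toy model of unified vs fixed-split shader ALUs. A workload is given as
# (vertex_work, fragment_work) units per frame; every ALU completes 1 unit
# per cycle. All counts are hypothetical illustrations.

def frame_cycles_fixed(vertex_work, fragment_work, v_alus, f_alus):
    """Fixed split: the vertex and fragment pools cannot help each other."""
    return max(vertex_work / v_alus, fragment_work / f_alus)

def frame_cycles_unified(vertex_work, fragment_work, total_alus):
    """Unified: one pool dynamically shared across both stages."""
    return (vertex_work + fragment_work) / total_alus

# A shader-heavy scene like T-Rex: little vertex work, lots of fragment work.
v, f = 10, 90
print(frame_cycles_fixed(v, f, v_alus=4, f_alus=4))  # 22.5 cycles/frame
print(frame_cycles_unified(v, f, total_alus=8))      # 12.5 cycles/frame
```

With the same total ALU count, the fixed split idles its vertex hardware while the fragment pool becomes the bottleneck, which matches the "runs out of fragment processors" explanation above.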
Phobos Exp-Nord said:
Why would you need those eight cores working together? How can you be sure Android will dispatch your application's threads among them properly? Just another headache. I also don't believe they will really help save battery; it's pure marketing. But the A15 is a bit more powerful than the Krait in the S600.
I think the PowerVR 544MP3 scores below the Adreno 320 in T-Rex because of the unified shader architecture in the Adreno. That test uses complex shaders on every surface, so the Octa's GPU probably runs out of fragment processors.
If you don't need a new phone right now, wait for the S800 models. I don't think the Mali T65x is good enough either. Looking at the S3's GPU: yes, it's pretty fast in certain tasks such as rendering to texture, but it has some weird bottlenecks that make Horn and T-Rex much slower in FPS than I expected from the raw GFLOPS figures.
Well, when you play some heavy games you need all the cores. It's also useful to use all the cores while the phone is charging, without killing the battery.
I need a new phone, because my Galaxy S I9000 is slow with new apps and Android versions. If I buy this one, waiting for an S800 version is pointless for me. The CPU is fast; the GPU maybe not as fast as the Adreno 330, but with overclocking we can boost performance a lot.
Dude, using all eight cores will simply melt your phone in your hands LOL. You will drink an S4 cocktail LOL. Quad-core is enough, but a GPU never is. The same thing is happening with PCs. I don't need huge FPS in T-Rex, but some solid reviews and opinions from people who really know these things... so far only you two have answered (I won't claim yet that this forum is full of noobs LOL).
I want a new phone because of the lack of RAM in the S3, even though it's smooth for me. I was happy to hear about the Octa version, because I wanted to try something new, but I'm kind of lost now.
Alberto96, please let me know when your friend gets that I9500. I want to get it from Expansys too (I think we already talked about this in other threads). If I buy the I9505 I will get it from Amazon Italy, as it's cheaper than other places.
I'm just comparing:
I9500: 1 year of warranty (overseas)
I9505: 2 years of warranty (locally)
I9500 = I9505 + 3 additional S4 batteries with an external charger
That's because:
740€ = 625€ + 35€ x 3 batteries (and I will still have money left over for a Burger King and a Cola)
So... is it really worth the risk? Still nobody has answered me about OpenGL ES 3.0.
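For what it's worth, the euro arithmetic above checks out, using the prices exactly as quoted in the post:

```python
# Checking the euro comparison above (prices as quoted in the post).
i9500_total = 740   # imported I9500 with Italian taxes and shipping
i9505_price = 625   # I9505 bought locally
battery = 35        # one spare S4 battery

spare_batteries = 3
leftover = i9500_total - (i9505_price + spare_batteries * battery)
print(leftover)  # 10 euro left over -> the Burger King and the Cola
```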
The S800 and Adreno 330 will not be in a Samsung device soon (maybe never), and 2.1-2.3GHz looks like too much for a mobile phone. We already have heat issues with the S4 (I even have them on my S3, with the phone getting warmer). Also... my laptop is a dual-core 2.1GHz AMD, for God's sake.
@Alberto96, I beg you, when your friend gets the phone, please test it and let me know what you think.
demlasjr said:
2.1-2.3GHz looks like too much for a mobile phone
Not when playing Hi10P in software.
I do not know the exact internal layout of the Exynos Octa, so it's easy for me to imagine a situation where two threads of a single application get dispatched to two different core domains, making it really expensive to exchange data between them; each domain probably has its own cache subsystem, so performance would drop even more than with the two threads together on the A7 domain.
Phobos Exp-Nord said:
Not when playing Hi10P in software.
I do not know the exact internal layout of the Exynos Octa, so it's easy for me to imagine a situation where two threads of a single application get dispatched to two different core domains, making it really expensive to exchange data between them; each domain probably has its own cache subsystem, so performance would drop even more than with the two threads together on the A7 domain.
Yeah, you're right here. I don't have much knowledge of this profile as I don't watch anime, but in the S4's case it seems to depend more on the GPU than the CPU. I'm really sure the Exynos Octa is able to run it, but I'm not sure about the PowerVR. I've read that Hi10P plays at anywhere from 15-20 FPS (watchable, but still not great) on a Tegra 3 quad-core overclocked to 1.6GHz, so there is still hope.
demlasjr said:
I've read that Hi10P plays at anywhere from 15-20 FPS
That's for 720p. I just asked in another thread about 1080p: the S4 cannot play it smoothly enough with MX Player. It's not a question of resolution; it's the problem of using a file from a 1080p home collection without any additional effort.
We'll see, maybe later there will be an update released for such issues. I think the GPU and the CPU of both variants are capable of playing such videos.
Hey guys,
http://withimagination.imgtec.com/i...or-todays-leading-platforms#comment-880303396
jumping directly from OpenGL ES 2.0 to 3.0 would create a situation where app compatibility would be severely broken across devices. But most people update their devices every two years; by that time, PowerVR Series6 would be the dominant OpenGL ES 3.0 GPU generation shipping in most devices.
It is also important to remember that the PowerVR Series5XT GPU family has been successfully holding its own against recently released competing graphics solutions despite being released almost four years ago, which in itself is an amazing feat.
So... should we trust alexvoica and go forward with the PowerVR SGX544MP3 even though it lacks OpenGL ES 3.0? He said it would be a long while before OpenGL ES 3.0 mattered, but the move to OpenGL ES 2.0 wasn't as slow as he suggests. Now every single game uses OpenGL ES 2.0, and I'm sure there will be OpenGL ES 3.0-only games well before two years are up.
Take a look at this: http://gfxbench.com/compare.jsp?cols=2&D1=Samsung+GT-I9500+Galaxy+S4&D2=Samsung+GT-I9505+Galaxy+S4