Two years ago I bought an Incredible. I could have waited a month for the Fascinate, and I'm glad I didn't, but something began to bug me: the Adreno 200 was very underpowered. Fast forward to now and I'm forced to upgrade to keep my unlimited data. The obvious choice is the upcoming Galaxy S3, so I pre-ordered one. I can't help but wonder if I'm repeating my last phone purchase by buying hardware with a GPU that simply won't hold up in the future.
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed. This upset many QSD-based device owners, especially the Nexus One adopters. I know ICS ROMs are continually improving for QSD-based phones, but they'll always lag. Meanwhile, the Fascinate has received great ICS support due to having the same GPU as the Galaxy Nexus. One month could have changed my future-proofing experience, but inevitably the Fascinate had a bunch of issues I'm glad I didn't have to deal with.
I know hardware becomes obsolete. It happens, I get it. We all want to try and do the best we can though, especially those of us on Verizon with unlimited data; this is our last subsidized upgrade allowing us to retain unlimited data.
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's score in this test is double that of the Qualcomm variants.
Will the US Galaxy S3 withstand the test of time and provide future-proof hardware for a reasonable amount of time? Or will it fall short of the two-year expected lifespan like the QSD Adreno 200 devices did?
I am uncertain and wish Qualcomm would seriously step up its GPU game. Am I alone in this line of thinking?
[I originally posted this in the international forum, but felt it belonged here.]
Div033 said:
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed.
I thought the biggest problem with the Nexus One was the limited space for system files. Other Adreno 200 devices, such as the Evo 4G, have Android 4.0 running on them, and I hear it works really well.
I know that the early Exynos found on the Nexus S also works quite well.
I think any modern chipset easily surpasses the performance required for the type of GPU tasks being implemented at the system level. Games are still a concern, but compatibility is more of an issue there than performance, and the Adreno 225 is popular enough that it should be supported.
But there's always next year for really kick-ass GPUs.
Div033 said:
Will the US Galaxy S3 withstand the test of time and provide future-proof hardware for a reasonable amount of time? Or will it fall short of the two-year expected lifespan like the QSD Adreno 200 devices did? I am uncertain and wish Qualcomm would seriously step up its GPU game.
Well, you also have to look at the resources available and the time constraints. The introduction of LTE in the US probably forced said chip maker to make some concessions. What they lost in state-of-the-art GPU, they gained in the ridiculous profit they made this year, because theirs is the only chip that includes LTE. From their perspective, they've won the war thus far.
I agree with this line of thinking. As an earlier poster said, the Evo 4G had the Adreno 200. I use N64oid all the time. The Evo would struggle with games that the first Galaxy S family had no problem at all with. I have since switched to the Motorola Photon 4G with its Tegra 2 (Nvidia GeForce GPU). It handles both emulators and high-end games so much better than my Evo did. I would have already pre-ordered the S3 if it weren't for this.
real world performance > benchmarks
My Sensation has an Adreno 220 and it plays every game and movie just fine. Sure, it doesn't get the best benchmark numbers, but it more than holds its own when playing any game. I'm sure the Adreno 225 will hold up just fine over the next couple of years. In fact, I still love my Sensation. Side by side, it's still just as fast as most phones out there. You only see a difference when running benchmarks, which isn't a practical measure. I personally don't care if I'm getting 200 fps or 50; it's not like anyone can tell the difference.
I also want to note that the 220 is crazy powerful compared to the 200 and 205. It was the first GPU with which Qualcomm seemed to really take a stab at gaming. I'm fine with the 220 and can't wait to begin using the 225.
bradleyw801 said:
The Evo would struggle with games that the first Galaxy S family had no problem at all with. I have since switched to the Motorola Photon 4G with its Tegra 2 (Nvidia GeForce GPU). It handles both emulators and high-end games so much better than my Evo did.
As nativestranger said in another thread,
"The 225 despite its deceptive naming is 4-6x the performance of the 205 and roughly 1.6-2x the 220."
Also, performance on the Evo cannot be attributed solely to the GPU (Adreno 200); CPU, RAM, resolution, etc. have a ton to do with it as well.
Will the GPU do better in this phone, thanks to the extra RAM, compared to the One series with the same S4/225 combo?
Div033 said:
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's score in this test is double that of the Qualcomm variants.
You should also note that the GS2 only had a 480 x 800 display (384,000 pixels), and even at that much lower resolution its score was only slightly higher in that test, whereas the GS3 is pushing 720 x 1280 (921,600 pixels). That means the GS3 is doing 2.4 times the work of the GS2 and delivers almost the same gaming performance at worst, and better performance elsewhere. That's not bad if you ask me, seeing as how we all thought the GS2 was a powerhouse just 12 months ago.
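For anyone who wants to sanity-check that 2.4x figure, here's a minimal Python sketch of the pixel math (it assumes a purely fill-rate-bound workload, which real games only approximate):

```python
# A minimal sketch of the pixel arithmetic above: how much more work per
# frame the GS3's 720p panel demands versus the GS2's WVGA panel
# (assumes a purely fill-rate-bound load, which games only approximate).

gs2_pixels = 800 * 480    # 384,000
gs3_pixels = 1280 * 720   # 921,600

print(gs3_pixels / gs2_pixels)  # 2.4 -> the GS3 pushes 2.4x the pixels
```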
incubus26jc said:
As nativestranger said in another thread,
"The 225 despite its deceptive naming is 4-6x the performance of the 205 and roughly 1.6-2x the 220."
Also, performance on the Evo cannot be attributed solely to the GPU (Adreno 200); CPU and RAM have a ton to do with it as well.
Agreed. I haven't seen anyone suffering from GPU woes other than benchmark nuts who obsess over the highest score; everyone actually using it for real things says it works great. And honestly, my take is that if you want gaming performance, don't use a phone: plug your 360 into your big screen and kick ass from the couch in HD and surround sound.
I did somehow forget to account for the resolution of the GS3 vs. GS2, which most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two or three versions of Android without being hardware-bottlenecked.
I've used ICS on my Incredible, which is virtually the same as the Evo 4G, but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless, homescreen scrolling remains sluggish and somewhat laggy.
As far as popularity of a chipset goes, it's become evident that this factor does not affect how long manufacturers will support it. The Nexus One's QSD was one of the most popular chipsets around at the time, but it still did not receive ICS. I know they claimed space restrictions were the reason, but I find this highly unlikely considering the other, more limiting factors.
Maybe the 225 will be good enough for future Android versions like Key Lime Pie and Licorice or whatever they call them.
Sent from my Droid Incredible using the XDA app.
Div033 said:
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two or three versions of Android without being hardware-bottlenecked. ... Maybe the 225 will be good enough for future Android versions like Key Lime Pie and Licorice or whatever they call them.
Well, it might interest you to know the Adreno 225 supports DirectX 9.3 and texture compression, where the Mali 400 does not. That's a requirement for Windows 8. Now, you might say "so what," but I for one plan on trying to dual boot, or even run a version of Windows RT, perhaps on a virtual machine. That's something else the S4 Krait/Adreno package supports natively, I do believe, that the Exynos/Mali doesn't.
Sent from my DROIDX using xda premium
Div033 said:
I've used ICS on my Incredible, which is virtually the same as the Evo 4G, but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless, homescreen scrolling remains sluggish and somewhat laggy.
This is almost certainly a RAM issue. With the ridiculous 2GB on this phone, I would hope you wouldn't run into issues until Android at least wraps the alphabet.
Voltage Spike said:
This is almost certainly a RAM issue. With the ridiculous 2GB on this phone, I would hope you wouldn't run into issues until Android at least wraps the alphabet.
+1. I do believe more RAM (the largest among this generation of phones) matters for long-term usage.
The GPU is fine.
Guys, this is a Galaxy S phone. The newest one, at least.
It is GUARANTEED a Jelly Bean update from Samsung (albeit late). It is also most likely getting at least one or two more major Android updates because of XDA.
Remember, ALL OF US have the SAME Galaxy S3. That is a LOT of devs that will be working on it.
Don't worry about that. It will come with time.
Div033 said:
I've used ICS on my Incredible, which is virtually the same as the Evo 4G, but performance is still lacking. ... Maybe the 225 will be good enough for future Android versions like Key Lime Pie and Licorice or whatever they call them.
You simply can't compare the Adreno 200 of first-generation Snapdragon devices with the 225 of current devices. The Adreno 200, even for its time, was woefully weak. The 205 that followed was easily 3x or more its performance; most devices based on that GPU, including the Xperia Play, had no problem playing the latest games. The 220 saw a similarly huge increase. Qualcomm eased off a little with the 225 by using the same architecture, but built on a smaller process and with increased clocks and memory bandwidth, resulting in a 1.6x-2x performance increase. Hence the comments about it being a lame upgrade. But look at the desktop GPUs of the AMD 6000 series: the upgrade was less than 20% over the previous year, and people universally praised the 6970. This is how gullible fanboyism can result in a strongly skewed perception of actual results.
nativestranger said:
You simply can't compare the Adreno 200 of first-generation Snapdragon devices with the 225 of current devices. The Adreno 200, even for its time, was woefully weak.
Fair enough. I suppose you're right: the Adreno 200 was already severely underpowered at launch. The 225 may not be the best, but it's still up among the top-tier GPUs. I guess I have nothing to worry about. The 2GB of RAM is definitely nice too.
Sent from my Droid Incredible using the XDA app.
Just put this here for a reference:
The Nexus One is running the Adreno 200. The HTC One V with the Adreno 205 is over 5x faster. The Rezound has an Adreno 220, over 3x faster than the One V while also running more than 2x the resolution. The GS3 with the Adreno 225 is hard up against the vsync wall in the HoverJet test and about 3x faster than the Rezound in the Egypt test. It's amazing how much Adreno has improved in just two years: from 2 fps on the 200 to never dropping below 60 fps on the 225.
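If you chain those rough per-generation multipliers together (forum estimates from benchmark runs, not official figures), the cumulative jump is striking; a toy Python sketch:

```python
# Toy arithmetic chaining the rough per-generation speedups quoted in
# this post (forum estimates, not official figures), to see the
# cumulative Adreno 200 -> 225 jump.

steps = [
    ("Adreno 200 -> 205", 5.0),  # One V "over 5x faster" than Nexus One
    ("Adreno 205 -> 220", 3.0),  # Rezound ~3x the One V (at >2x the pixels)
    ("Adreno 220 -> 225", 3.0),  # GS3 ~3x the Rezound in the Egypt test
]

total = 1.0
for name, mult in steps:
    total *= mult
    print(f"{name}: x{mult:.0f} (cumulative x{total:.0f})")
# ~45x on paper in two years; consistent with going from ~2 fps on the
# 200 to a vsync-capped 60 fps on the 225 (the cap hides further gains).
```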
Thank you, this helps me make my decision too ^. Also, does having a higher-resolution screen make graphics look better? NOVA 3 on my SGS2 looks awesome; all the effects are there, bullet smoke and whatnot. So will these effects, or the graphics in general, look better on the SGS3 screen?
Thanks!
Your GPU will be just fine. Other posters have already shown that it is perfectly competitive with the GPUs in other top-tier phones of this generation, and most top-tier phones sold in the US since the beginning of the year have run S4 SoCs. It comes down to LTE, something the rest of the world doesn't have (for the most part, and nowhere near our level). I for one would much rather give up an absolutely world-crushing GPU than be stuck on 3G forever.
Also, keep in mind that the Galaxy S3 is on track to be the fastest-selling device ever. The American market is huge and is home to many of the major players in this industry (including, of course, Google themselves), not to mention that Samsung seems to want to treat the two devices (US and international versions) as one and the same. It's not like they'll want all those customers to have a GPU that'll make the phone feel old in 3 months, so I wouldn't worry.
And honestly, I don't really see Android becoming significantly more hardware-intensive anytime soon. The current/upcoming generation of hardware can beat many people's PCs in terms of UI navigation, launching apps, and even running games. Two HUGE things Google talked about with Jelly Bean were the PDK and Project Butter. This shows that they recognize that some of the platform's biggest weaknesses were its slow, inconsistent updates and its perceived lower performance than iOS in the UI. From what I have seen of Jelly Bean in videos, I don't see much further room to speed up the UI; it's already about as fast and smooth as it can get, it seems. I would imagine screen resolution won't be making huge jumps in the immediate future; there'll be some 1080p screens, but I doubt people will demand that in a sub-5" device they hold a foot in front of their face, considering the negative impact on battery life.
What I'm trying to say is that I don't see Android demanding significantly more powerful hardware to keep growing, as it already runs as smooth and fast as possible on hardware we're already calling outdated. Sure, more RAM and more powerful CPUs may let you multitask with more apps without slowdowns, and better GPUs can run newer games even better, but I just don't see the system itself requiring significantly better hardware than we have now for a few generations. It looks like we'll be more focused on multitasking performance, efficiency/battery life, and manufacturing costs, so everyone can enjoy the kind of performance that's currently reserved for us nuts buying new top-tier phones every 8 months.
So again, no, I wouldn't worry. Sure, it's not the best thing out there and will soon be outclassed, but I don't see it handicapping Android anytime soon, especially with 2GB of RAM.
Cruiserdude said:
Your GPU will be just fine. Other posters have already shown that it is perfectly competitive with the GPUs in other top-tier phones of this generation. ... So again, no, I wouldn't worry. Sure, it's not the best thing out there and will soon be outclassed, but I don't see it handicapping Android anytime soon, especially with 2GB of RAM.
I see, lol. Well, that's good; I don't wanna have to buy a new phone every half a year! But will the HD resolution make any of the Gameloft games look better than they do on my Galaxy S2 with the Mali 400 GPU? Thanks!
Sent from my SPH-D710 using XDA Premium App
Related
Hi everyone
For quite a long time I've been thinking about the whole "Galaxy S can do 90 Mpolys per second" thing.
It sounds like total bull****.
So, after many, many hours of googling, and some unanswered mails to ImgTec, I'd like to know:
Can ANYONE provide any concrete info about the SGX540?
On one side I see declarations that the SGX540 can do 90 million polygons per second, and on the other I see stuff like "twice the performance of the SGX530".
...but twice the performance of the SGX530 is EXACTLY what the SGX535 has.
So is the 540 a rebrand of the 535? That can't be, so WHAT THE HELL is going on?
I'm seriously confused, and would be glad if anyone could shed light on the matter.
I asked a Samsung rep what the difference was and this is what I got:
Q: The Samsung Galaxy S uses the SGX540 vs. the iPhone using the SGX535. The only data I can find suggests these two GPUs are very similar. Could you please highlight some of the differences between the SGX535 and the SGX540?
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I also tried getting in contact with ImgTec to find out an answer, but I haven't received a reply back. It's been two weeks now.
Also, the chip is obviously faster than Snapdragon with the Adreno 200 GPU. I don't know if Adreno supports TBDR; I just know it's a modified Xenos core. Also, the Galaxy S uses LPDDR2 RAM, so throughput is quite a bit faster, even though it's not *as* necessary with all the memory efficiencies between the Cortex A8 and TBDR on the SGX540.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I think that's the clue: cost saving for Samsung.
Besides, who needs a 2D accelerator with a CPU as fast as this one already is?
The HTC Athena (HTC Advantage) failed miserably at adding the ATI 2D accelerator, which no programmers were able to take advantage of; in the end the CPU did all the work.
I'd imagine it's a 535 at 45nm. Just a guess; the CPU is also 45nm.
Having tried a few phones, the speed in games is far better, with much better fps, though there is a catch: we might have to wait for games to really test its power, as most are made to run on all phones.
This was the same problem with the Xbox and PS2. The Xbox had more power, but the PS2 was king, so games were made with its hardware in mind, which held back the Xbox; only now and then did an Xbox-only game come out that really made use of its power. Years later they changed places, with the 360 holding the PS3 back (don't start on which is better, lol), and the PS3 has to make do with 360 ports, but when it has a game made just for it you really get to see what it can do. Anyway, it's nice to know the Galaxy is future-proof game-wise, and I cannot wait to see what it can do in the future, or what someone can port onto it.
On a side note, I did read that videos run through the graphics chip, which is causing blocking in dark movies (not HD, lower-quality rips), something about it not reading the difference between shades of black. One guy found a way to turn the chip off and movies were all good; I guess the rest of us have to wait for firmware to sort this.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
Smart move, Sammy.
voodoochild2008 -
I wouldn't say we'll have to wait that long.
Even today, Snapdragon devices don't do very well in games, since their fillrate is so low (133 Mpixels).
Even the Motorola Droid (SGX530 at 110MHz, about ~9 Mpolys and ~280 Mpixels at that frequency) fares MUCH better in games, and actually runs pretty much everything.
So I guess the best hardware is not yet being stressed, but weaker devices should be hitting the limit soon.
bl4ckdr4g00n - Why the hell should we care? I don't see any problem with 2D content and/or videos; everything flies at lightspeed.
Well, I can live in hope, and I guess Apple's mess (aka the iPhone 4) will help now, as firms are heading more towards Android. I did read about one big firm in the USA dropping marketing for Apple and heading to Android, and well, that's what you get when you try to sell old ideas. It always made me laugh that the first iPhone did a 1-megapixel photo when others were on 3 megapixels, then it had no video when most others did, then they hyped it when it moved to a 3-megapixel camera and did video. OMG, OK, I am going to stop, as it makes my blood boil that people buy into Apple. Yes, they started the ball rolling, and good on them for that, but then they just sat back and started to count the money as others moved on. Oh, and when I bought my Galaxy, the website did say "able to run games as powerful as the Xbox" (the old one), so is HALO too much to ask for, lol?
Wait, so what about the Droid X vs. the Galaxy S GPU? I know the Galaxy S is way more advanced spec-wise, but the Droid X does have a dedicated GPU. Can anyone explain?
The Droid X still uses the SGX530, but in the Droid X, as opposed to the original Droid, it comes at the stock 200MHz (or at least 180).
At that speed it does 12-14 Mpolygons/sec and can push out 400-500 Mpixels/sec.
Not too shabby.
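As a back-of-envelope check (my assumption: throughput scales roughly linearly with GPU clock, which ignores memory bandwidth), those numbers line up reasonably with the original Droid's:

```python
# Back-of-envelope sketch (assumes throughput scales roughly linearly
# with GPU clock, ignoring memory bandwidth) checking the Droid X's
# 200 MHz SGX530 against the original Droid's 110 MHz figures above.

base_clock_mhz = 110                 # original Droid SGX530
base_mpolys, base_mpixels = 9, 280
target_clock_mhz = 200               # Droid X SGX530

scale = target_clock_mhz / base_clock_mhz        # ~1.82x
print(f"polys:  ~{base_mpolys * scale:.0f} Mpolys/s  (quoted: 12-14)")
print(f"pixels: ~{base_mpixels * scale:.0f} Mpixels/s (quoted: 400-500)")
# Linear scaling predicts ~16 Mpolys/s and ~509 Mpixels/s, a bit above
# the quoted ranges; real scaling is sub-linear once bandwidth limits bite.
```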
The 535 is a downgrade from the 540. The 540 is the latest and greatest from the PowerVR line.
Samsung did not cost-cut; they've in fact spent MORE to get this chip into their Galaxy S line. No one else has the 540 besides Samsung.
Like I said, it's probably just a process shrink, which means our GPU uses less power and is possibly clocked higher.
P.S. Desktop graphics cards haven't had dedicated 2D acceleration for years; removing it saves transistors for more 3D performance and less power!
This worries me as well... Seems like it might not be as great as what we thought. HOWEVER, again, this is a new device that might be fixed in firmware updates. Because the hardware is obviously stellar, there must be something holding it back.
Pika007 said:
The Droid X still uses the SGX530, but in the Droid X, as opposed to the original Droid, it comes at the stock 200MHz (or at least 180). At that speed it does 12-14 Mpolygons/sec and can push out 400-500 Mpixels/sec.
http://www.slashgear.com/droid-x-review-0793011/
"We benchmarked the DROID X using Quadrant, which measures processor, memory, I/O and 2D/3D graphics and combines them into a single numerical score. In Battery Saver mode, the DROID X scored 819, in Performance mode it scored 1,204, and in Smart mode it scored 963. In contrast, the Samsung Galaxy S running Android 2.1 – using Samsung’s own 1GHz Hummingbird CPU – scored 874, while a Google Nexus One running Android 2.2 – using Qualcomm’s 1GHz Snapdragon – scored 1,434. "
The N1's performance can be explained by the fact it's running 2.2...
But the Droid X, even with the "inferior" GPU, outscored the Galaxy S? Why?
gdfnr123 said:
Wait, so what about the Droid X vs. the Galaxy S GPU? I know the Galaxy S is way more advanced spec-wise, but the Droid X does have a dedicated GPU. Can anyone explain?
Same here. I want to know which one has the better performance as well.
Besides that, does anyone know which CPU is better between the Droid X and the Galaxy S?
I know the OMAP chip on the original Droid could overclock to 1.2GHz from, what, 550MHz?
How about the CPUs on the Droid X and Galaxy S? Did anyone compare those chips? Which can overclock higher, and which one is better overall?
Sorry about the poor English. Hope you guys can understand.
The CPU in the Droid X is a stock Cortex A8 running at 1GHz. The Samsung Hummingbird is a specialized version of the Cortex A8, designed by Intrinsity, running at 1GHz.
Qualcomm likewise did a complete redesign of the Cortex A8 for the Snapdragon CPU at 1GHz. While the original A8 could only be clocked at around 600MHz with a reasonable power drain, the reworked versions of the A8 could be clocked higher while maintaining better power efficiency.
An untouched Cortex A8 can do more at the same frequency than a specialized, stripped-down A8.
If anything, the Samsung Galaxy S is better balanced, leveraging the SGX540 as a video decoder as well. However, the Droid X should be quite snappy in most uses.
At the end of the day, you really shouldn't care too much about obsolescence. I mean, the Qualcomm dual-core Scorpion chip is probably coming out around December.
Smartphones are moving at a blisteringly fast pace.
TexUs -
I wouldn't take it too seriously.
Quadrant isn't too serious a benchmark; plus, I think you can blame it on the fact that 2D acceleration in the SGS is done by the processor, while the Droid X has 2D acceleration on the GPU.
I can assure you: there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo on all devices, let every device shake off its teething problems of any kind, and test again, with more than one benchmark.
Pika007 said:
I wouldn't take it too seriously. Quadrant isn't too serious a benchmark; plus, I think you can blame it on the fact that 2D acceleration in the SGS is done by the processor, while the Droid X has 2D acceleration on the GPU.
The SGS might be falling behind in I/O speeds... It is well known that all the app data is stored on a slower internal SD-card partition... Has anyone tried the benchmarks with the lag fix?
Also, if only Android made use of the GPU to help render the UI... It's such a shame that the GPU only gets used in games...
Using the GPU to render the UI would take tons of battery power.
I prefer it being a bit less snappy but a whole lot easier on the battery.
thephawx said:
At the end of the day, you really shouldn't care too much about obsolescence. I mean, the Qualcomm dual-core Scorpion chip is probably coming out around December. Smartphones are moving at a blisteringly fast pace.
Smartphones are, but batteries aren't.
IMO the only reason we haven't had huge battery issues is because all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2GHz devices sound nice on paper, but I worry whether the battery technology can keep up.
TexUs said:
Smartphones are, but batteries aren't. IMO the only reason we haven't had huge battery issues is because all the other tech (screen, RAM power, CPU usage, etc.) has improved... Dual-core or 2GHz devices sound nice on paper, but I worry whether the battery technology can keep up.
I think so. The battery will be the biggest issue for smartphones in the future if it remains at 1500mAh or even less.
A dual-core CPU could be fast but power-hungry as well.
According to Microsoft, the QSD8250 is the chipset. Now, how bad is it? I see people saying it'd be better than the HD2 since it'll have perfect drivers from MS, but I still wonder how it compares with the phone I'm planning to get, the Captivate, or an iPhone 4.
What prompted MS to choose this over so many newer (and possibly better) options?
rexian said:
What prompted MS to choose this over so many newer (and possibly better) options?
My guess: WP7 has been in development for quite some time, so at the start of development they chose the top processor available. But I think this forum focuses too much on the processor and specifications, because in the end the whole package must be convincing, and that includes the operating system that has been optimized for this processor.
Furthermore, the current specification will be the lowest common denominator for quite some time (perhaps until WP8), and all apps will be optimized to run satisfactorily on it (AFAIK the 20-second start-up rule for apps will be measured against the current specification). Newer processors may speed some things up, but the current hardware will be the target platform...
Development must have started before this chipset was launched, but you are right: this was most likely the target platform.
There are not many 3D games available, though. I'll know whether the basic experience is fluid when I check one at the store in a few days; my worries are about the 3D games that will be launched later. If the experience with those is not as good as on other platforms, MS will be in trouble. Better hardware will fix the issue in the future, but the reputation will be ruined and stuck for a while.
The Captivate is more powerful, mainly due to its GPU being about 4 times more powerful than the QSD8250's Adreno 200. Still, all WP7 devices will have better-looking games, since the Captivate runs Android... and everyone knows Android games look like crap no matter how powerful the hardware is (due to devs having to make their games run on low-end hardware to get more sales).
The iPhone 4 is a better comparison because its hardware and software have been fully engineered to run along with each other, very much like WP7 devices. While it does have a more powerful GPU than the QSD, there wouldn't be much difference; the Adreno 200 pushes about 22 million triangles per second, whereas the SGX535 pushes about 28 million. Whether developers even use all those polygons, I'm not sure.
Epic Citadel on iOS, as well as this upcoming game called Aralon, sure look good, though.
Aralon link: http://www.gizmodo.com.au/2010/10/oh-man-aralon-for-ios-is-gonna-be-good/
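A quick ratio check on those peak triangle rates (rough vendor figures; peak rates rarely translate directly into in-game fps):

```python
# Quick ratio check on the peak triangle rates quoted above (rough
# vendor figures, and peak rates rarely reflect in-game performance).

adreno_200_tris = 22e6   # Adreno 200 (QSD8250)
sgx_535_tris = 28e6      # PowerVR SGX535 (iPhone 4)

print(f"SGX535 / Adreno 200 = {sgx_535_tris / adreno_200_tris:.2f}x")
# ~1.27x -- about 27% more on paper, in line with "not a big difference".
```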
Thanks, Cruzer. Now it makes sense. 22 million vs. 28 million is not a big difference. Were they running at the same clock speed? I hear the A4 processor in the iPhone 4 runs at ~800MHz, so maybe they both perform similarly.
Not sure how much the GPU is affected by the CPU. I think it's more about the speed of the actual GPU, but don't quote me on that, lol.
I have a Captivate and an iPhone 4. I'm getting rid of both of them to get an HD7 or Focus. The iPhone works flawlessly and isn't buggy in the slightest; the Captivate is very choppy, and I couldn't take the lagging after a while, even after I upgraded to Froyo. I would go with WP7 to be different and because it looks fun, even if it uses an older processor. The Hummingbird and A4 are both top of the line, and it's going to be hard to compete, especially with each having a different OS.
Writing this from my iPhone 4
They've unveiled it today
http://www.engadget.com/2013/01/06/nvidia-tegra-4-official/
and apparently it's much more powerful and faster than the Exynos in the Nexus 10, but I don't know that much about this kind of tech. I'm probably finally going to buy the Nexus 10 this week if Samsung doesn't unveil a more powerful tablet, so I was wondering if this Tegra 4 processor is worth waiting for until it's implemented in a tablet.
May TEGRA 3 Rest in Peace ...
Sent from my GT-I9100 using Tapatalk 2
Yes, that thing is packing heat. Best case, the first Tegra 4 device will come out next Christmas. Unless they've been hiding something.
cuguy said:
Yes, that thing is packing heat. Best case, the first Tegra 4 device will come out next Christmas. Unless they've been hiding something.
It will be out somewhere between June and August, maybe.
It will not take that long...
Sent from my GT-I9100 using Tapatalk 2
I think March... mark my words.
Their browser test has the Nexus 10 running Chrome while the Tegra runs the AOSP browser. In my eyes that makes it a 100% unfair comparison.
Between bad experiences with Tegra 2 and 3 (Atrix/TF700) and their requirement that Tegra-optimized games not run on other SoC vendors' chips, without any real reason other than because they can, I can't even consider a mobile Nvidia device. All they're good for is keeping the more reputable chip makers on their toes.
Yes, it's nice.
Would be interesting to see this with both devices running the AOSP browser! From my experience it is much faster than the current Chrome version (which is still version 18 on Android, compared to 23 on desktop). Maybe the Tegra 4 would be faster as well, but not by that much.
Everything on my N10 is extremely fast and fluid, so I wouldn't wait for whenever the first Tegra 4 devices become available. Plus it's a Nexus, so you know what you are buying!
Jotokun said:
Their browser test has the Nexus 10 running Chrome while the Tegra runs the AOSP browser. In my eyes that makes it a 100% unfair comparison.
Agreed; they're making an apples-and-pears comparison that was undoubtedly set up to show the new processor in a good light. It's only to be expected; it is a sales pitch, after all. It will no doubt be a faster chip, though.
Sent from my Nexus 10 using XDA Premium HD app
I would much rather see a couple of benchmark runs myself. A timed web-browsing comparison is no way to test the power of a new chipset.
Still, I would expect Tegra 4 to be WAY WAY WAY more powerful than the Exynos 5250. Both use the A15 architecture, and Tegra 4 has twice as many CPU cores as we have; this alone is already a big boost in multithreaded apps. Then look at the GPU, where you can't compare the two at all except by end-result numbers: they are just far too different. We have 4 GPU cores; Tegra 4 has 72. But those cores are designed far differently and are not nearly as powerful per core; it is all about the company's definition of what a GPU "core" is. And then you have a smaller process node as well, which by itself already promises to use less power than the larger process node the Exynos 5250 uses.
I would honestly expect the Tegra 4 chipset to completely destroy our tablet in terms of performance, but a much fairer comparison would be Tegra 4 against the Exynos 5 Quad. Those two are actually designed to compete with each other.
If you want to compare Exynos and Tegra 4, then wait for the Exynos 5450 (quad A15), which should come with the Galaxy S4. The number of cores makes a difference here (the T4 is quad), but early GL benchmarks show that the A6X and Exynos 5250 have a better GPU.
First Tegra 4 tablet running stock Android 4.2:
http://www.theverge.com/2013/1/7/3845608/vizio-10-inch-tablet-combines-tegra-4-android-thin-body
Let's hope and pray Vizio managed to add microSD capability to it. Free of Nexus restraints, it should be doable, but since it is running stock JB, the storage would have to be mounted as USB (?). So far this Vizio 10" was the most exciting Android development out of CES. We have a few more press events scheduled in Vegas, and then it will be all over.
rashid11 said:
Let's hope and pray Vizio managed to add microSD capability to it. Free of Nexus restraints, it should be doable, but since it is running stock JB, the storage would have to be mounted as USB (?).
Don't expect the Nexus advantage of up-to-date software or timely updates.
EniGmA1987 said:
I would much rather see a couple of benchmark runs myself. ... I would honestly expect the Tegra 4 chipset to completely destroy our tablet in terms of performance, but a much fairer comparison would be Tegra 4 against the Exynos 5 Quad.
Look at the new iPhone: only 2 cores (different architecture), beating the higher-clocked, double-cored Galaxy S3 in some disciplines.
This presentation is scientifically SO irrelevant, especially because they use different software. I LOL SO HARD at people thinking this is anywhere near comparable.
schnip said:
Look at the new iPhone: only 2 cores (different architecture), beating the higher-clocked, double-cored Galaxy S3 in some disciplines. This presentation is scientifically SO irrelevant, especially because they use different software.
The new iPhone 5 doesn't use the same ARM architecture as the S3, though; it is a custom design. So those can be compared against each other fine to see which architecture is better, and if a slower-clocked CPU gets better scores, then we know it is a superior design. This is the basis of all benchmarking. If we were only allowed to compare identical architectures, we wouldn't learn anything.
Tegra 4 uses a (probably slightly modified) A15 core, and the Exynos 5xxx uses a fairly stock A15 core. So a higher-clocked A15 should beat a lower-clocked A15 in a direct comparison no matter what, and when you throw 2 additional cores on top, it should always win multithreaded benchmarks too. Seems pretty common sense to me.
The main difference will be on the graphics side, where Nvidia has its own GPU design, compared to Samsung's use of Mali GPUs.
You can still compare them just fine; it just needs to be both of them on the same browser if a browser comparison is being done. In this PR release, Nvidia skewed the results like all companies do, so we can't really see the difference between the two from those pictures and need to wait for third-party review sites to do proper testing. Yet we can still estimate performance plenty fine, since we already have a baseline for the architecture with this tablet.
"Tegra 4 more powerful than Nexus 10"... well duh! It's a new chip just unveiled by nvidia that won't show up in any on sale devices for at least a couple of months. Tablet and smartphone tech is moving very quickly at the moment, nvidia will hold the android performance crown for a couple of months and then someone (probably samsung or qualcomm) will come along with something even more powerful. Such is the nature of the tablet/smartphone market. People that hold off on buying because there is something better on the horizon will be waiting forever because there will always be a better device just a few months down the line!
EniGmA1987 said:
The new iPhone 5 doesn't use the same ARM architecture as the S3, though; it is a custom design. So those can be compared against each other fine to see which architecture is better. ... Yet we can still estimate performance plenty fine, since we already have a baseline for the architecture with this tablet.
That was kind of his point.
I don't think anyone is denying that the Tegra will be faster. What's being disputed here is just how much faster it is. Personally, I don't think it'll be enough to notice in everyday use. Twice the cores does not automatically a faster CPU make: you need software that can properly take advantage of them, and even then it's not a huge plus in everyday tasks. Also, in the past Nvidia has made pretty crappy chips due to compromise, a good example being how the Tegra 2 lacked NEON support. The only concrete advantages I see are more cores and a higher clock rate.
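To put some rough numbers on the "twice the cores" point, here's a minimal Amdahl's-law sketch; the parallel fractions are illustrative assumptions, not measurements of any real Android workload:

```python
# A minimal Amdahl's-law sketch of why "twice the cores" rarely means
# twice the speed. The parallel fractions below are illustrative
# assumptions, not measurements of any real Android workload.

def speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: the serial part stays serial, the parallel part splits."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for p in (0.5, 0.8, 0.95):
    ratio = speedup(p, 4) / speedup(p, 2)
    print(f"parallel={p:.0%}: quad-core is {ratio:.2f}x a dual-core")
# 50% parallel: 1.20x; 80%: 1.50x; even 95% parallel only yields ~1.83x.
```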
Based on the hype-to-performance ratio of both Tegra 2 and 3, I wouldn't have high hopes until I see legit benchmark results.
What does seem promising, though, is that they are making more significant changes than from T2 to T3, such as dual-channel memory (finally, after 1-2 years of all other SoCs having it) and different GPU cores.
Still, the GPU has always been the weakest point of Tegra, so I don't think it can beat an overclocked T-604 by much, even though this time around they will not be the first ones to debut a next-gen SoC. Given the A15 architecture, they can't really screw up the CPU even if they wanted to, so that should be significantly faster than the Exynos 5 Dual.
I've also just read an article on Anandtech about power consumption: the SoC in the Nexus 10 consumes several times as much power as other tablet chipsets, which makes me wonder how Nvidia plans to solve the battery-life issue with twice as many cores and a (seemingly) beefier GPU, not even mentioning implementation in phones.
freshlysqueezed said:
First Tegra 4 Tablet running stock android 4.2:
http://www.theverge.com/2013/1/7/3845608/vizio-10-inch-tablet-combines-tegra-4-android-thin-body
Apparently this is the tablet that comes to take the Nexus 10's spot: a Vizio 10-inch tablet with Tegra 4, a 2560 x 1600 display, 32GB of storage, and Android 4.2. It should be coming out Q1 2013. This one makes me want to wait and hear more about it before I buy the Nexus 10, although to be honest the brand is a bit of a letdown for me.
Edit: for the 10-inch model, key specs (aside from Tegra 4) include a 2,560 x 1,600 display, 32GB of on-board memory, NFC, and dual 5MP / 1.3MP cameras.
http://www.engadget.com/2013/01/07/vizio-10-inch-tegra-4-tablet-hands-on/
All HTC M9+ owners!
We're a handful yet on XDA, but getting more and more as the device rolls out to more locations, and people who look for an elegant, high-end device with premium build quality and extra features like a fingerprint scanner, a 2K display, and the Duo camera settle on this fantastic device. It's unfortunately not perfect at everything, and not the best gaming phone out there, but in my experience it's a very well-performing device in terms of call quality, reception, wifi strength, multimedia, battery, display panel, and overall UX feel and smoothness. High-end games from this year, 2015, might stutter a bit here and there or drop some important shaders to compensate, but it is nonetheless good for older titles (2014-and-before 3D gaming and all kinds of 2D games). Let's gather the experience and benchmarks of this unique device, which was maybe the first MTK device in an alu body.
Let's discuss performance-related experience, real user feel, and benchmarks, free of whining, facing the truth that in some respects it's not a top-notch device, but letting the curious ones who are considering this device know what to expect if they choose this elegant business-class phone.
I'll start with some of my benchmark result screenshots in separate posts.
UPDATE:
Here's my short game testing video on M9+
https://www.youtube.com/watch?v=QmLGCoI4NLw
Antutu on stock Europe base 1.61
Vellamo tests base 1.61
Futuremark test base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
My real-life experience is that in everyday chores the phone is very snappy, with some small lags when loading new apps into memory. Task switching is quite good.
There's unfortunately a memory-leak issue with Android 5.0.x which kicks in after a while (on most 5.0.x devices as well), but overall the UX smoothness is quite good. Occasionally some apps bring up popup windows a bit stuttery; for example, Facebook's comment animation tends to stutter.
The Sense Home/BlinkFeed experience is just perfect. In normal operation, when no big application updates are happening in the background, I never faced any lags in the Sense Home UI.
As for games, titles from 2014 and before run perfectly. New ones might get some shaders removed or run with reduced polygon counts, so don't expect a jaw-dropping 3D experience. If you are OK with 2014-and-before 3D game quality, the M9+ is quite good, but the latest games will most probably run in a dumbed-down mode to suit the PowerVR GPU in the M9+'s MTK chipset. The phone mostly keeps a good temperature, except when charging the battery while playing 3D-heavy games (but that's expected from most devices, especially with an alu body).
The screen quality is quite good; I got a perfect panel with the first unit, and the 2K display refreshes at 60Hz, smooth, with a lovely DPI and brightness.
The benchmarks show generally good performance in CPU-bound operations. Where the MTK chip comes up short is the GPU: all tests show 3D performance far below 2014's flagships. The PowerVR G6200 is the same GPU that was built into the iPhone 5s, which, let's face it, is mediocre for a 2K display. If you face this fact and accept that gaming won't come with the highest-quality textures and shaders, you probably won't be disappointed.
I'm curious what others think...
Played with it for a few hours. First impression is that it's clearly slower than the M9. The fingerprint sensor is almost perfect, with a few hits and misses. I'll be back after a few days of using it with more adequate feedback.
Sent from my HTC_M9pw using Tapatalk
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light-years ahead on the M9.
DeadPotato said:
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light-years ahead on the M9.
Yeah, though the AnTuTu 3D score is less than half: 3D is 9k on the M9+ and 21k on the M9, so after all it tells you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5s era, when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone, and gaming is mediocre in high-end games.
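One hedged way to read those two scores is to normalize by screen pixels, since the on-screen test renders at native resolution (the scores are from this thread; the fill-rate-bound assumption is mine, not AnTuTu's):

```python
# Sketch normalizing the AnTuTu 3D scores quoted above by screen pixels,
# since the on-screen test renders at native resolution (scores from
# this thread; the fill-rate-bound assumption is an approximation).

scores = {
    "M9+ (PowerVR G6200, 2560x1440)": (9_000, 2560 * 1440),
    "M9  (Adreno 430,    1920x1080)": (21_000, 1920 * 1080),
}

baseline = 1920 * 1080
for name, (score, pixels) in scores.items():
    print(f"{name}: {score * pixels / baseline:,.0f} normalized")
# Even credited with its 1.78x pixel handicap (~16,000 normalized), the
# M9+ still trails the M9's 21,000: the GPU gap is real, not just the 2K panel.
```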
tbalden said:
Yeah, though the AnTuTu 3D score is less than half: 3D is 9k on the M9+ and 21k on the M9, so after all it tells you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5s era, when display resolutions were much smaller.
I see, I hadn't noticed that. I don't understand MediaTek though; why did they put such an outdated GPU on their 'flagship' processor?
DeadPotato said:
I see, I hadn't noticed that. I don't understand MediaTek though; why did they put such an outdated GPU on their 'flagship' processor?
Click to expand...
Click to collapse
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties in incorporating one.
Either way, the CPU itself is a promising high-end part, and unfortunately we can't help with the GPU side. Maybe there will be some possibility to tweak it at the kernel level. That remains to be seen, but I'm not holding my breath; no wonders can be done.
tbalden said:
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties in incorporating one.
Either way, the CPU itself is a promising high-end part, and unfortunately we can't help with the GPU side. Maybe there will be some possibility to tweak it at the kernel level. That remains to be seen, but I'm not holding my breath; no wonders can be done.
Click to expand...
Click to collapse
Yeah, I agree. The sad thing is that CPU-wise MediaTek is doing a very good job of breaking into the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the Helio X20 to actually be considered a valid option for flagship devices.
DeadPotato said:
Yeah, I agree. The sad thing is that CPU-wise MediaTek is doing a very good job of breaking into the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the Helio X20 to actually be considered a valid option for flagship devices.
Click to expand...
Click to collapse
Seconded.
The X10 is MTK's first high-end SoC. They should have equipped it with a high-end GPU to make a name for it, not a two-year-old part from the iPhone 5s era. Most casual gaming is OK, but that doesn't befit a 'flagship' device.
It would be excellent if there were some possibility to tweak it at the kernel level. But the Helio X20 really needs to do better on the GPU side.
As someone who is not a heavy gamer, the UX is pretty good for me. (The camera is another topic.)
I even measured CPU speed with a chess game, and the result reached the top tier.
tbalden said:
Antutu on stock Europe base 1.61
Vellamo tests, base 1.61
Futuremark test base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
Click to expand...
Click to collapse
FYI.
Mine: stock, Taiwanese base 1.08, rooted device.
Sigh... the result was maxed out for my device on Ice Storm "Extreme".
Very similar result to mine, except Ice Storm Unlimited. I think it might be related to temperature throttling; I'll test it again later.
Yeah, second run on base 1.61
Hello guys,
I live in France and I've really been looking forward to the HTC M9+; I think it's what the M9 should have been, but I don't understand HTC's sh*tty 2015 choices across the board.
What I would like to know about is battery life; can you guys tell me what it's like?
I bought a Galaxy S6 Edge a few days ago and it's dreadful: I have no battery left by the end of the afternoon, and I don't play games, just a few photos, browsing, and Messenger sometimes. And anyway, I miss BoomSound and the metal body.
I'm so disappointed. I should never have sold my old One M8 or Xperia Z3.
My only hope now rests on the Lenovo Vibe X3, but there has been no news since March...
Thanks, guys!
After installing more and more synced software, like Dropbox and a few others, my battery life dropped a notch, from 23+ hours to 22+, with average screen-on time around 4 hours. All in all, not a miracle, but about what I used to get from my previous phone, the OPO, with its larger battery and lower screen resolution. A bit less, though, admittedly. So it gets me through the day nicely, but I'm not gaming.
It's average, I think. Compared to the amazing Sony Z3 Compact it's pathetic; my friend's Z3C gets 3+ days with 5 hours of screen-on time. And I think that's achieved with a great stock kernel tuned by Sony engineers and the Sony stock ROM... wish other phones could do that.
Not sure why, but I can't choose "no lock screen" or "lock screen without security" under security settings. Both options are greyed out, and it says they're disabled by an administrator, encryption policies, or other apps.
I only have Prey and Lookout Security installed, which could affect this, but this was totally fine on my M7.
Does anyone have any idea? Thank you.
They really should've gone with the 801, YES THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change. Really shameful of HTC to pick a MTK SoC (**** or crap).
I am very disappointed.
JellyKitkatLoli said:
They really should've gone with the 801, YES THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change. Really shameful of HTC to pick a MTK SoC (**** or crap).
I am very disappointed.
Click to expand...
Click to collapse
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag down the result, right?
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag down the result, right?
Click to expand...
Click to collapse
Yes, and it's fairly accurate.
I got 11871 at 2K; I couldn't screenshot it, as the phone doesn't know how to (screenshots don't work above 1080p without dedicated software).
M8 at 1080p.
Also, now that I look at it, your CPU integer score is out of proportion because of all the cores the MTK SoC has, lol; too bad that has no real-life use, and as you can see, single-core performance is much better on the 801 :/
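For what it's worth, the resolution drop alone explains a big chunk of that: a fill-rate-bound GPU test scales roughly with the inverse of the pixel count, so going from 2K down to 1080p can inflate the on-screen score by up to ~1.8x. A rough sanity check (assuming standard 2560x1440 and 1920x1080 render targets):
Code:
# Pixel counts of the two render resolutions
qhd = 2560 * 1440   # 3,686,400 px
fhd = 1920 * 1080   # 2,073,600 px
print("pixel ratio: %.2fx" % (qhd / fhd))  # ~1.78x

# That factor is an upper bound: real tests are partly geometry- and
# CPU-bound, so the observed gain from dropping resolution is smaller.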
JellyKitkatLoli said:
Also, now that I look at it, your CPU integer score is out of proportion because of all the cores the MTK SoC has, lol; too bad that has no real-life use, and as you can see, single-core performance is much better on the 801 :/
Click to expand...
Click to collapse
Hmmm...
The efficiency of software decoding shows up vividly across multiple cores in real life.
There have been many comparisons between the MT6795 (not the "T") and the SD801 on this site:
browser speed, RAR compression, real gaming, power consumption...
http://tieba.baidu.com/p/3825199511?see_lz=1#69839534914l
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag down the result, right?
Click to expand...
Click to collapse
Are those results after reducing the resolution to 1080p? Did I read that correctly? Because if so, it sadly confirms the limits of the 3D hardware, as others stated before. If you look at the M8 result above (which I find surprisingly higher than what I experienced on my own M8, anyway), you'll notice that even last year's M8 has a better 3D score, not to mention the regular M9, which produces almost double that score. Multi-core-wise the Helio X10 is a beast, but graphics-wise it looks like it's not so much.
The rumoured SoCs don't hold up to the competition (and by competition I obviously mean Samsung and Apple).
How are they going to make the SD810 look good at their conference? (Or will they not talk about hardware performance at all?) Either way, if it's an SD810, it's likely to get destroyed in reviewers' benchmarks.
The SD810 is much slower than the latest Exynos, and far, far slower than Apple's new A9 chip (it's probably even worse than Apple's year-old A8).
Many of us were hoping for either the SD820 or the Kirin 950... but there are so many people confirming the SD810...
I'm not super happy with my Nexus 6, but the SD810 doesn't seem like much of an upgrade.
Personally, I would rather they just keep the price down than engage in the ever-ridiculous spec war. An 810 would be more than enough for a high-end phone. I'm on an HTC M7 with an SD600 and it's still quite fast.
NikAmi said:
Personally, I would rather they just keep the price down than engage in the ever-ridiculous spec war. An 810 would be more than enough for a high-end phone. I'm on an HTC M7 with an SD600 and it's still quite fast.
Click to expand...
Click to collapse
Absolutely. I, too, am still rocking this M7. It's no quitter by any stretch of the imagination! I just want stock Android and guaranteed timely updates, which this phone will definitely provide. Additionally, it looks like the M7 as well! As long as there isn't a ridiculous camera bump and it's just an area of the phone made of different materials (for the radios and other components whose signals can't pass through aluminium), I'd be sound as a pound. Besides, with the incredible performance rumors marching across the internet (it's apparently FOUR TIMES faster than last year's Nexus 6), I think it's safe to say this phone will be in my pocket for many years to come.
I remember an AnandTech article about SoC pricing: they said a high-end SoC costs less than $30, and low-end ones are around $10.
I don't think Google chose the SD810 because it was cheap. They chose it because there are very few options.
Apple doesn't sell its SoCs. Samsung doesn't sell much, certainly not to real competition.
Nvidia can't do a SoC at low power. That leaves Intel, QCOM, and some of the Chinese brands.
The Chinese brands may not have been chosen because a Nexus needs quick approval from lots of carriers, and US carriers are quicker to approve QCOM.
I would happily pay an extra $10-20 for a top-of-the-line SoC.
NikAmi said:
Personally, I would rather they just keep the price down than engage in the ever-ridiculous spec war. An 810 would be more than enough for a high-end phone. I'm on an HTC M7 with an SD600 and it's still quite fast.
Click to expand...
Click to collapse
The 808 is also very quick, so what's important is optimization (as well as I/O, RAM type, and LTE/WiFi speed). I imagine even the SD600/S4 Pro will continue to be useful for a few more years, depending on how much of a burden future Android releases become.
Specs are good, but they're not very useful if the software isn't on par. There's a reason even phones like the G4, OP2, or S6 can show lag.
Sent from my LG-H950
Ace42 said:
The 808 is also very quick, so what's important is optimization (as well as I/O, RAM type, and LTE/WiFi speed). I imagine even the SD600/S4 Pro will continue to be useful for a few more years, depending on how much of a burden future Android releases become.
Specs are good, but they're not very useful if the software isn't on par. There's a reason even phones like the G4, OP2, or S6 can show lag.
Sent from my LG-H950
Click to expand...
Click to collapse
Totally agree. I just want both. We had both with the N4, N5, and N6: all three of them technically had the very best SoC available at the time (at least if you don't count Apple).
The SD810 was not the 'best' even when it launched six months ago, which is rare for Qualcomm. I'm surprised they estimate the SD820 won't arrive until next year, because that means the whole of 2015 has been a QCOM disaster.
If the rumours are true, these Nexuses will be a bit of a letdown for me in the SoC department.
Just look at 2015: Samsung dropped them, and their Exynos 7420 was far superior to the SD810. And now Amazon has just announced that their new Fire TV has dropped QCOM and is using a top-end MediaTek with the new A72 cores!
SyXbiT said:
Totally agree. I just want both. We had both with the N4, N5, and N6: all three of them technically had the very best SoC available at the time (at least if you don't count Apple).
The SD810 was not the 'best' even when it launched six months ago, which is rare for Qualcomm. I'm surprised they estimate the SD820 won't arrive until next year, because that means the whole of 2015 has been a QCOM disaster.
If the rumours are true, these Nexuses will be a bit of a letdown for me in the SoC department.
Just look at 2015: Samsung dropped them, and their Exynos 7420 was far superior to the SD810. And now Amazon has just announced that their new Fire TV has dropped QCOM and is using a top-end MediaTek with the new A72 cores!
Click to expand...
Click to collapse
Unlike previous years, Qualcomm didn't have their own custom architecture (Kryo or whatever) ready for 2015, so the SD810 felt like more of a placeholder. The thermal issues are likely a side effect of using standard A57/A53 cores; they usually rely on a custom architecture, as Apple does.
The SD820, according to QC, has a bunch of improvements; however, I'm unsure whether it can beat the next Exynos or the A9X.
I haven't checked out the new Amazon devices, but if they're using A72s, that's pretty good considering their HDX used the SD800.
Sent from my LG-H950
No, they won't talk about performance. They know the 810 is a bad chip. They've most likely already throttled it, or will do so soon, and it'll still overheat, just like all the others.
TransportedMan said:
No, they won't talk about performance. They know the 810 is a bad chip. They've most likely already throttled it, or will do so soon, and it'll still overheat, just like all the others.
Click to expand...
Click to collapse
New benchmarks showed up yesterday on Geekbench, and they were typical 810 numbers... something in the range of 1300 single-core, 4400 multi-core... weak.
Sent from my SM-N920V using Tapatalk
2swizzle said:
New benchmarks showed up yesterday on Geekbench, and they were typical 810 numbers... something in the range of 1300 single-core, 4400 multi-core... weak.
Sent from my SM-N920V using Tapatalk
Click to expand...
Click to collapse
Well, those scores are still higher than any other Snapdragon at the moment. It's no E7420, but it's the next best thing behind it. We also can't forget that if this is the revised SD810, it has been throttled to deal with its heating issues, so the scores could potentially be higher. I don't care what the scores say on paper; all I want to know is whether the heating issues are fixed, because some devices with the new 810 are still overheating.
2swizzle said:
New benchmarks showed up yesterday on Geekbench, and they were typical 810 numbers... something in the range of 1300 single-core, 4400 multi-core... weak.
Sent from my SM-N920V using Tapatalk
Click to expand...
Click to collapse
A single benchmark run doesn't matter; you have to look at sustained performance. I could take a phone out of a fridge to run a benchmark and I can guarantee you it'll be amazing, but if I run the same benchmark several times continuously, the score will drop significantly. The key here is whether Google/Huawei can do something to maintain continuous performance. Let's say I run the benchmark 5 times in a row: how much deviation will there be between the first and the last run? That's the important thing here. A typical SD810's peak performance isn't bad; it's the throttling that everyone hates. A quick way to check is sketched below.
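A minimal sketch of that 5-run check (Python; run_benchmark() is a hypothetical hook you'd replace with whatever launches your benchmark and parses its score, and the stand-in numbers are shaped like a throttling 810 purely for illustration):
Code:
import statistics

def run_benchmark(i):
    # Hypothetical hook: swap in a real benchmark launch + score parse.
    # The stand-in scores below only illustrate a throttling pattern.
    simulated = [52000, 49600, 45300, 41800, 39400]
    return simulated[i]

scores = [run_benchmark(i) for i in range(5)]
drop_pct = (scores[0] - scores[-1]) / scores[0] * 100

print("runs:", scores)
print("mean %.0f, stdev %.0f" % (statistics.mean(scores), statistics.stdev(scores)))
print("first-to-last drop: %.1f%%" % drop_pct)  # a big drop means heavy throttling
A small first-to-last drop over repeated runs means the chip sustains its performance; the complaint about the 810 is that this number tends to be large.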