I posted this in the EPIC forum, but I think it's useful here too. I've been a long-time phone hacker and recently came over to the HTC EVO, which I'll be returning soon.
My research tells me the Droid X is the fastest phone coming out in the next 30 days. But for those who can't get to Verizon, like me, here's my breakdown of all the Galaxy phones and the pros and cons of the carriers.
I have a formatted file, but unfortunately the forum rules seem to forbid HTML, so here is my text conversion.
<edit> I will use your comments to update this file. </edit>
=-=-=-=-T-Mobile Vibrant:=-=-=-=-=
PROS
----
-1st to release (15th?)
-Cheapest to own outright. Buy the contract phone for $199, opt out, and pay the $200 ETF. For $200 more than the contract price you'll own it outright and can join T-Mobile's contract-free plan for $59 (all inclusive). For the heaviest data and text use, this is the plan to get.
-16 GB of built-in storage
-High speed internet capable (4G speeds)
-ETF of just $200
-Most similar to stock (may reduce update times) - Pika007
CONS
----
-no flash camera
-no front camera
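The Vibrant cost argument works out like this (a quick sketch; the $59/month figure comes from the list above, and the 24-month comparison window is my assumption, chosen to match a typical contract length):

```python
# Total cost of owning the Vibrant outright on T-Mobile's no-contract plan,
# using the figures from the list above.
phone_on_contract = 199      # upfront price with contract
etf = 200                    # early-termination fee to opt out
outright_cost = phone_on_contract + etf   # phone owned free and clear

monthly_no_contract = 59     # all-inclusive contract-free plan
months = 24                  # typical contract length, for comparison

total = outright_cost + monthly_no_contract * months
print(outright_cost)  # 399
print(total)          # 1815
```

So roughly $1,815 over two years, before taxes and fees, which is the baseline the other carriers below get compared against.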
=-=-=-=-AT&T Captivate:=-=-=-=-=
PROS
----
-AT&T's plan is the most flexible. You can pick and choose: a $15 data plan if necessary, or $5 for 200 texts if you don't text a lot. If heavy use isn't critical, this brings costs in line with T-Mobile, or even makes it the cheapest overall if you have a corporate discount, which AT&T gives regularly.
-same hardware pros and cons as tmobile vibrant version
CONS
-----
--Same negatives as the T-Mobile version.
--Can be expensive if you need the heaviest usage.
--AT&T's network is pretty bad. I've been an iPhone customer for years and it does suck.
--ETF of $325
=-=-=-=-Verizon Fascinate:=-=-=-=-=
PROS
----
-Verizon typically has high corporate discounts (I have 25% off). With a discount it can become one of the cheaper plans.
-Best network overall in quality of calls.
-Hardware : Has flash camera, 1 gig more internal storage than similar EPIC.
-No keyboard can be a positive if size and weight are an issue.
CONS
-----
-Verizon plan can be the most expensive
-Droid X possibly has better hardware. It has equal video and most likely faster TI processor. Bigger screen too if you liked the EVO.
-Family plans suck.
-ETF of $350 (premium phone)
=-=-=-=-Sprint EPIC:=-=-=-=-=
PROS
----
-Sprint typically has corporate discounts (25% here)
-EPIC is the most feature rich of the samsung family.
-Keyboard
-Sprint plans usually include Sprint TV, which is nice.
-ETF of $200
-Best family plans on cost, if you want all smartphones.
-Very fast data from what I can tell on my EVO. But I came from AT&T.
-30 day guarantee
-Only phone to have the LED for notification and charging -mccune
CONS
-----
-If you don't have a discount, the plans will be the most expensive. Sprint charges $10 a month for having a 4G phone even if you don't live in a 4G area.
-Sprint coverage is so-so in many cities
-Big phone; thicker and heavier than the rest.
-Smallest amount of built-in storage (1 GB)
I think the Vibrant is the most cost-effective, and being the most similar to the Korean M110S and the worldwide i9000 is a plus for "cooked" content.
Also, i have to comment-
"-Droid X possibly has better hardware. It has equal video and most likely faster TI processor. Bigger screen too if you liked the EVO."
Well, no.
The Droid X uses a TI OMAP3640 (series 3) processor clocked at 1 GHz.
From a processor standpoint, it's head-to-head with Samsung's Hummingbird.
From a graphics standpoint, the TI OMAP3xxx series uses a PowerVR SGX530, just like the original Motorola Droid, only this time clocked at 200 MHz instead of 118.
The Hummingbird in the Galaxy S uses a PowerVR SGX540, which is more than twice as strong as the 530.
TI implemented the SGX540 in its upcoming OMAP4xxx series.
The only thing the Droid X has that the Galaxy S doesn't is an ISP, which, apart from benchmarks or apps running several thousand animations at once (I'd like to see one of those), mostly just saves some battery in 2D apps, since it doesn't fire up the more power-hungry GPU.
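The battery argument in that last point can be pictured as simple dispatch logic. This is a toy sketch, not anyone's actual driver; the function name and the relative power costs are invented for illustration:

```python
# Toy model of the claim: with a dedicated 2D block ("ISP" in the post),
# 2D work avoids waking the more power-hungry 3D GPU.
POWER = {"2d_block": 1, "gpu": 4}   # hypothetical relative power costs

def render_cost(workload, has_2d_block):
    """Return a made-up relative power cost for rendering one frame."""
    if workload == "2d" and has_2d_block:
        return POWER["2d_block"]    # the 2D accelerator handles it
    return POWER["gpu"]             # otherwise the GPU spins up

print(render_cost("2d", has_2d_block=True))   # 1  (Droid X style)
print(render_cost("2d", has_2d_block=False))  # 4  (Galaxy S style)
print(render_cost("3d", has_2d_block=True))   # 4  (GPU either way)
```

The point of the sketch: the 2D block only changes the picture for 2D workloads; for games and other 3D work the two designs behave the same.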
Thanks for taking the time to post this info. However, I just had to comment on the $10.00 Sprint charge. That charge gives you unlimited data on not only 4G but 3G as well. Data is not truly unlimited on Sprint unless you have a plan (Evo currently) with that required $10.00 fee.
Also I have an Evo that I'm returning and I'm just wondering why you're returning yours?
I'm returning the EVO because it's already outdated. After the Sammys and the Droid X, it will be fifth among Android phones, all in less than a month.
But that's not all: I hate the battery life. I have to drag the power cable with me everywhere. If I don't charge it overnight, it will be at 50% by morning and will die at some point during my work hours.
The interface is slow, and I'm not crazy about the heavy HTC Sense. The iPhone spoiled me. All the video games on it are horrible: Asphalt, Guitar Hero, and Moto X Mayhem are nearly unplayable.
Quadrant again?
People really ought to stop using that as a general "who is better" test and just pick specific details from it.
It takes too many factors and mushes them into a single "final score".
One of those factors is independent 2D acceleration, in which you get a BIG ROUND ZERO if your device doesn't have an ISP. It greatly impacts the final result, while in real life it doesn't really matter.
Also, I really don't like that article ("Hummingbird vs. Snapdragon"). The writer isn't basing it on anything and is making a mountain of speculation.
Bottom line:
The TI OMAP3640 CPU is an ARM Cortex-A8 @ 1 GHz with tweaks by TI, made at 45 nm.
The Hummingbird CPU is an ARM Cortex-A8 @ 1 GHz with tweaks by Samsung, also 45 nm.
The difference between the two is almost nil.
The Droid X uses the SGX530 at stock frequency (200 MHz).
The Galaxy S uses the SGX540 at stock frequency, or higher. Not yet known.
Both devices use 512 MB of LPDDR2.
You can see the differences clearly, and until we have specific benchmarks for every part of those systems, we won't be able to compare properly.
About the CPU, the best current way to compare is to run Linpack on both devices. It measures pure raw processing power (output in MFLOPS).
Edit- Oh, hey, there are already linpack results.
http://www.greenecomputing.com/apps/linpack/linpack-by-device/
You can see some real-world Droid X results, averaging ~8.35 MFLOPS.
I just ran Linpack on my SGS, averaging ~8.58 MFLOPS.
Like I said, the difference is nil.
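For anyone curious what those numbers actually mean: LINPACK times a dense linear solve and converts it to MFLOPS using the standard 2/3·n³ + 2·n² flop count. A minimal sketch with NumPy (the phone app is a Java port; this just shows the idea):

```python
import time
import numpy as np

def linpack_mflops(n=500):
    """Time solving Ax = b and report MFLOPS, LINPACK-style."""
    rng = np.random.default_rng(0)
    a = rng.random((n, n))
    b = rng.random(n)
    start = time.perf_counter()
    np.linalg.solve(a, b)
    elapsed = time.perf_counter() - start
    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2   # standard LINPACK flop count
    return flops / elapsed / 1e6

# A desktop will score in the thousands; the ~8.4-8.6 reported above is
# what a 1 GHz Cortex-A8 manages through Dalvik.
print(linpack_mflops())
```

Since both phones run the same benchmark through the same VM, the near-identical scores really do point at near-identical CPUs.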
Pika007 said:
THE TRUTH
Click to expand...
Click to collapse
Thank you!
It's painful seeing those silly (and outdated) benchmarks.
Droid X and SGS are both great devices that will not disappoint (except for certain SGS users who are hell-bent on making their SGS seem like trash).
Pika007 said:
Quadrant again?
People really ought to stop using that as a general "who is better" test and just pick specific details from it.
It takes too many factors and mushes them into a single "final score".
One of those factors is independent 2D acceleration, in which you get a BIG ROUND ZERO if your device doesn't have an ISP. It greatly impacts the final result, while in real life it doesn't really matter.
Also, I really don't like that article ("Hummingbird vs. Snapdragon"). The writer isn't basing it on anything and is making a mountain of speculation.
Bottom line:
The TI OMAP3640 CPU is an ARM Cortex-A8 @ 1 GHz with tweaks by TI, made at 45 nm.
The Hummingbird CPU is an ARM Cortex-A8 @ 1 GHz with tweaks by Samsung, also 45 nm.
The difference between the two is almost nil.
The Droid X uses the SGX530 at stock frequency (200 MHz).
The Galaxy S uses the SGX540 at stock frequency, or higher. Not yet known.
Both devices use 512 MB of LPDDR2.
You can see the differences clearly, and until we have specific benchmarks for every part of those systems, we won't be able to compare properly.
About the CPU, the best current way to compare is to run Linpack on both devices. It measures pure raw processing power (output in MFLOPS).
Click to expand...
Click to collapse
Alright, I'll remove the references to those articles here. I want to keep this post focused on which carrier options benefit you the most and least.
Hey, thanks for the great post and info. I'd like to thank the other contributors as well. This thread helped loads.
Thanks, great info!
It seems like the SGS i9000 is the better choice (on tech specs); I didn't like the shell design of the Motorola Droid X.
The Droid X's 8 MP camera and 4G vs. the SGS's 5 MP and 3G is not that much of a setback.
still the fastest!
http://androidandme.com/2010/07/news/galaxy-s-lineup-leads-the-pack-in-android-gpu-benchmarks/
This is actually a very subjective comparison. Most pros and cons are about the network or price.
About the hardware, you could add that the Samsung Epic (4G) is actually the only device that seems to have a notification LED (also used for charging).
mccune said:
This is actually a very subjective comparison. Most pros and cons are about the network or price.
About the hardware, you could add that the Samsung Epic (4G) is actually the only device that seems to have a notification LED (also used for charging).
Click to expand...
Click to collapse
I added your point. I'm not really sure which phone I'm getting. Some of the chatter here is pushing me away from the Droid X, but I'm still open to it, the iPhone 4, or one of the Sammys.
I think one of the bigger weights in the formula that I didn't include is how each network's reception performs in your area (at work, at home, and in between). But if that's all equal, as it is for me, then I'm open to anything. I'm leaning a little towards the Epic, but when T-Mobile releases its phone, that $59 no-contract plan will be hard to resist, especially if Sprint hasn't announced a date yet.
Keep in mind there's still the Bell version, which runs on AT&T's 3G frequencies, for those wondering.
http://galaxys.bell.ca/
It looks pretty much exactly like the Euro version.
sundip said:
still the fastest!
http://androidandme.com/2010/07/news/galaxy-s-lineup-leads-the-pack-in-android-gpu-benchmarks/
Click to expand...
Click to collapse
I like that; our phone is the overall better performer.
The Moto Droid X might be faster in HD, but it's slower at other tasks.
Ziostilon said:
Keep in mind there's still the Bell version, which runs on AT&T's 3G frequencies, for those wondering.
http://galaxys.bell.ca/
looks pretty much exactly like the Euro version
Click to expand...
Click to collapse
Don't go for Bell; it's a CDMA network, so you won't be able to use it much when traveling to areas without CDMA service.
AllGamer said:
I like that; our phone is the overall better performer.
The Moto Droid X might be faster in HD, but it's slower at other tasks.
Click to expand...
Click to collapse
Actually, GLBenchmark tests graphics only, plus CPU performance in graphics-bound operations.
GLBenchmark 1.1 is divided into Pro and HD because 1.1 came out when no HD-display phones were around.
GLBenchmark 2.0 operates in "HD" as well.
Related
Looks like HTC has done it again and delivered a phone that should run crazy fast on paper BUT the actual performance is sub-par compared to other phones:
HTC Nexus One (FAILphone):
http://www.youtube.com/watch?v=hvzxZ8tOBcQ
HTC Magic and HTC Liquid Benchmark:
http://www.youtube.com/watch?v=O36LA6EhZg4
I don't think that Neocore benchmarks the entire system, maybe more on the graphics chip. I don't know any specifics on the N1's graphics capabilities, but the 1 ghz snapdragon cpu is a definite boost from its predecessors.
Do you work for Apple?
How does it do on PiBenchmark? That would provide more relevant results with its Snapdragon processor.
andythefan said:
I don't think that Neocore benchmarks the entire system, maybe more on the graphics chip. I don't know any specifics on the N1's graphics capabilities, but the 1 ghz snapdragon cpu is a definite boost from its predecessors.
Click to expand...
Click to collapse
Doesn't the Liquid come with an underclocked Snapdragon?
I have a Magic that is rooted and tweaked to all hell, and I've played with the Nexus. There is no doubt that the Google phone outperforms any other phone HTC has released. I've seen it first hand. It's very fast and can handle so many things going on at the same time it makes my tummy tickle.
You are an idiot. Get your panties out of a bunch because you are pissed at the price and that it has no AT&T 3G. Should we all be pissed that the Droid only works on Verizon? Should we all be pissed that the iPhone only has AT&T 3G? The Nexus One is designed to be on T-Mobile. Sure, it will technically work on any GSM provider, but that isn't what it was intended to do. Google must have some deal with T-Mobile, since they offer the most Android phones.
And about the performance: that only shows video performance, and we don't know for sure what the N1 and A1 have in terms of a GPU.
staulkor said:
And about the performance: that only shows video performance, and we don't know for sure what the N1 and A1 have in terms of a GPU.
Click to expand...
Click to collapse
I thought Neocore tested the graphics chip with 3D benchmarking?
andythefan said:
I don't think that Neocore benchmarks the entire system, maybe more on the graphics chip. I don't know any specifics on the N1's graphics capabilities, but the 1 ghz snapdragon cpu is a definite boost from its predecessors.
Click to expand...
Click to collapse
It's called a system on a chip.
And the telling comparison is the Acer Liquid with its ~750 MHz Snapdragon CPU (underclocked) vs. the Nexus One with its 1 GHz Snapdragon CPU.
Looks like HTC screwed up again.
Ohhh. The other posters are pissed because their Messiah phone is a big FAIL?
What are you, 15 years old? Get off of mommy's computer and stop *****ing because you can't use the N1 on your network and get 3G.
I'm guessing the benchmark isn't accurate. It goes beyond common sense that the FPS are the same as the Magic's.
Maedhros said:
I'm guessing the benchmark isn't accurate. It goes beyond common sense that the FPS are the same as the Magic's.
Click to expand...
Click to collapse
Actually ... it goes nicely with HTC's track record of under-performing hardware.
We have too many variables that make comparing these results difficult. The HTC Magic and Liquid are running 1.6, while the Nexus is running 2.1. There are dramatically different levels of overhead on different Android versions; there could be far more overhead on 2.1 than on 1.6. Additionally, you forgot to mention that the Nexus One runs at a resolution 2.5 times that of the HTC Magic.
Just because you're not going to buy the Nexus (because you recently purchased another handset and are trying to justify your purchase, or because it doesn't support your carrier's 3G frequencies, or otherwise) doesn't mean you are obliged to spam these forums with "OMG THIS PHONE IS FAIL"
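The 2.5x resolution figure is easy to verify, assuming the commonly cited panels (WVGA 800x480 on the Nexus One, HVGA 320x480 on the Magic):

```python
# Pixel counts behind the "2.5 times the resolution" claim.
nexus_one = 800 * 480   # WVGA panel
magic = 320 * 480       # HVGA panel

ratio = nexus_one / magic
print(ratio)  # 2.5
```

So at the same frame rate, the N1 is pushing two and a half times as many pixels per frame as the Magic.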
Remember, the resolution used on the N1 is far higher than on the older devices.
coolVariable said:
It's called a system on a chip.
And the telling comparison is the Acer Liquid with its ~750 MHz Snapdragon CPU (underclocked) vs. the Nexus One with its 1 GHz Snapdragon CPU.
Looks like HTC screwed up again.
Ohhh. The other posters are pissed because their Messiah phone is a big FAIL?
Click to expand...
Click to collapse
The only FAIL here is your posts. You sound like a Droid owner, pissed that your phone is about to lose top-dog status. Just crawl back into your parents' basement, fire up your Xbox, and shoot some 12-year-olds. It will help you get over the fact that you are a huge FAIL.
lol @semantics now thats funny man
I have had this phone for three weeks now, and one thing it's not is SLOW. It's way faster than my 3GS and my myTouch.
I got 27.4 FPS on my G1.
I'm pretty sure the N1 isn't slower than the G1. That would be stupid.
I don't give a damn, I'm buying this joint day 1!! LOL
my theory:
1. Neocore is designed to work with Android 1.6 and OpenGL ES 1.1.
2. The Liquid A1 has the same processor (albeit underclocked) and the same screen resolution as the N1, so you would expect them to perform similarly. They don't perform the same, so you must look at the differences between the phones. The biggest to me is that the Liquid A1 runs Android 1.6 and OpenGL ES 1.1, the sweet spot for Neocore.
3. The N1 has Android 2.1 and OpenGL ES 2.0, specs that are not supported by Neocore. How can Neocore accurately test the N1 when it does not support its specifications? The slowness is not due to poor hardware; rather, it is due to old software trying to run on the latest hardware.
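The theory above boils down to a version check: a benchmark built against one OpenGL ES level can't meaningfully score a device exposing another. A hypothetical sketch of that gating logic (the names are invented for illustration, not Neocore's actual code):

```python
# Hypothetical sketch: only compare scores on the benchmark's own GL ES level.
SUPPORTED_GLES = "1.1"   # what Neocore targets, per the post

def comparable(device_gles: str) -> bool:
    """Scores are only apples-to-apples on the GL ES level the benchmark targets."""
    return device_gles == SUPPORTED_GLES

print(comparable("1.1"))  # True  -> Liquid A1, the "sweet spot"
print(comparable("2.0"))  # False -> Nexus One, so its score is suspect
```

Under this view, a low N1 score says more about the benchmark's compatibility path than about the phone's hardware.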
I have heard that the Galaxy S GPU can do 90M triangles/sec. Is that true? Some sources claim it only does 28M tri/sec (http://en.wikipedia.org/wiki/PowerVR), and the higher-end SGX545 does 40M, so how does the SGX540 do 90M?
hoss_n2 said:
I have heard that the Galaxy S GPU can do 90M triangles/sec. Is that true? Some sources claim it only does 28M tri/sec (http://en.wikipedia.org/wiki/PowerVR), and the higher-end SGX545 does 40M, so how does the SGX540 do 90M?
Click to expand...
Click to collapse
I don't think the number listed on Wikipedia is 'triangles' per second... it just says polys, so it could be a different shape that's harder to render?
Just my guess.
Besides, if the claimed 90M is actually 28 million, don't worry, because the same goes for the iPhone's GPU (the 535): it claims around 22M and wiki lists it as 14.
And if you're worried about the GPU, take comfort that no 3D benchmark I've seen has even slowed it down so far, and you can see tons of YouTube videos of Galaxy S phones face-rolling every other Android device in gaming FPS benchmarks. Even if it isn't as amazing as the claimed numbers, there is no doubt it's the best on the market at the moment, and by quite a lot.
I'm not going to pretend I read the comment thoroughly, but I've read a similar question. The person who seemed to know what they were talking about said that essentially the 90M is a "theoretical number" and that about half of it is what the phone will potentially do... but this is how all manufacturers report their graphics capabilities (at least in smartphones, though I'll assume the same holds for desktop/laptop graphics cards).
So while the number is definitely overstated, it's within the standard reporting convention, and relative to other numbers it's still accurate (2x as many triangles is 2x as many, whether everything gets cut in half or by a factor of 3).
*I'll remove my fuzzy language when someone better informed than me responds.*
I also read a good article (don't know where it is now, sorry) about how the GPU relies heavily on the memory and the bus between them; for example, two phones running the same GPU could perform very differently because one uses less, or slower, memory. Apparently our SGS has done pretty well in all departments.
To untangle the confusion:
Triangles = "polys" (polygons).
The SGS does nothing near 90M, but on the other hand, none of the other phones do what their manufacturers claim either.
Plus, the Wikipedia article is FAR from reliable; it's been edited more than 5 times over the past 2 months with ever-changing results. No official specs are available from ImgTec.
One thing I CAN tell you is that the GPU in the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
As a reference, take the Compal NAZ10, which uses the ever-glorified Nvidia Tegra 2, and the iPhone 4 (SGX535).
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Apparently someone over at Sammy did something right.
Extremely right.
Pika007 said:
...
One thing I CAN tell you is that the GPU in the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
As a reference, take the Compal NAZ10, which uses the ever-glorified Nvidia Tegra 2, and the iPhone 4 (SGX535).
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Apparently someone over at Sammy did something right.
Extremely right.
Click to expand...
Click to collapse
Well, one important fact is the pixel count in the GLBenchmark link you sent. The iPhone 4 and iPad share the same GPU. The difference in pixels is about 20%, hence the difference between the two.
Let me make one ugly calculation to map the SGS's score to the iPhone 4's. The pixel-count difference between the i4 and the SGS is a factor of 0.625. That would make the SGS score 1146 at the iPhone's resolution (or 1723 for the i4 at 800x480). Of course there are more factors involved, but this is the best estimate I can make at the moment.
The difference turns out not to be that great after all.
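The scaling in that post, spelled out. Note the raw Galaxy S score here is back-computed from the 0.625 factor and the post's 1146 figure; it is my assumption, not a number taken from GLBenchmark directly:

```python
# Pixel-count scaling between iPhone 4 (960x640) and Galaxy S (800x480).
iphone4_pixels = 960 * 640    # 614400
sgs_pixels = 800 * 480        # 384000
factor = sgs_pixels / iphone4_pixels
print(factor)  # 0.625

sgs_raw = 1834                # assumed Galaxy S score at its native 800x480
print(round(sgs_raw * factor))  # 1146, the post's "SGS at iPhone resolution"
```

This assumes score scales linearly with pixel count, which, as the next reply points out, is not really how benchmarks behave.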
I knew this argument would pop up soon enough, so I'll add one VERY important factor:
Score doesn't decrease proportionally to an increase in resolution.
For example, doubling the resolution won't halve the score; it's more like 70% of it.
Try running 3DMark on your PC at different resolutions; you'll see some interesting results.
Personally, GLBenchmark 1.1 for me is just a very crude example, for general demonstrations. It's not close to accurate.
I'm waiting for GLBenchmark 2.0, which should be a great tool for comparing the devices properly.
Who cares if the phone is powerful when there are no great games that take advantage of the power and the OS lags all the damn time, despite Quadrant giving me 2100+? Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.
Pika007 said:
To untangle the confusion:
Triangles = "polys" (polygons).
The SGS does nothing near 90M, but on the other hand, none of the other phones do what their manufacturers claim either.
Plus, the Wikipedia article is FAR from reliable; it's been edited more than 5 times over the past 2 months with ever-changing results. No official specs are available from ImgTec.
One thing I CAN tell you is that the GPU in the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
As a reference, take the Compal NAZ10, which uses the ever-glorified Nvidia Tegra 2, and the iPhone 4 (SGX535).
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Apparently someone over at Sammy did something right.
Extremely right.
Click to expand...
Click to collapse
Yes, it has been edited more than 5 times, but there is an official source saying the SGX545 does only 40M polygons, so how does the SGX540 do 90M? I know numbers aren't important if there's nothing to use them on, but I just wanted to know.
I think it's due to the fact that the older chip has 2D acceleration too, while the 540 is pure 3D and we use the CPU for 2D. That's why it's faster.
It is important to note that PowerVR does not do 3D rendering using the traditional polygon-based pipeline like Nvidia and ATI cards do. It uses a unique tile-based rendering engine. This approach is more efficient and uses less memory bandwidth as well as raw horsepower. IIRC, the original PowerVR 3D PC card was a PCI card that could compete head to head with AGP-based cards from 3dfx and ATI at the time. Unfortunately, its unique rendering engine does not fit well with Direct3D and OpenGL, which favor traditional polygon-based pipelines.
So the 90M figure could well be the equivalent performance number for a traditional 3D rendering pipeline, compared to the tile-based PowerVR setup.
PowerVR does indeed use the traditional polygon-based pipeline.
Tile-based rendering is in addition to it, not instead of it.
Do note that not all games (far from it, actually) use TBR properly, if at all.
Read the release notes and press releases; they have enough detail.
hoss_n2 said:
Yes, it has been edited more than 5 times, but there is an official source saying the SGX545 does only 40M polygons, so how does the SGX540 do 90M? I know numbers aren't important if there's nothing to use them on, but I just wanted to know.
Click to expand...
Click to collapse
All the given numbers for "official" specs of PowerVR GPUs are for a frequency of 200 MHz.
Those chips can do well above 400 MHz, so, for example, if an SGX530 does 14M polygons and 500 Mpixels per second @ 200 MHz, clocking it up to 400 gives 28M polys / 1 Gpixel.
Though I extremely doubt Samsung has the SGX540 clocked at 600 MHz in the SGS...
A practical example that shows off the power of the Galaxy S is Gameloft's Real Football 2010. The game has no frame lock, so it's playable on the Desire and Nexus One. Since pictures tell a thousand words and videos even more so, here's a YouTube link: http://www.youtube.com/watch?v=S0DxP0sk5s0
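That linear-with-clock rule of thumb in numbers (idealized; as a later reply notes, real throughput is also capped by memory bandwidth):

```python
# Idealized linear scaling of GPU throughput with clock, using the
# SGX530 figures from the post (14M polys, 500 Mpixels at 200 MHz).
base_mhz = 200
base_polys = 14e6       # polygons/sec at 200 MHz
base_pixels = 500e6     # pixels/sec at 200 MHz

def scaled(value, mhz):
    """Scale a throughput figure proportionally with clock frequency."""
    return value * mhz / base_mhz

print(scaled(base_polys, 400) / 1e6)   # 28.0 -> 28M polys/sec
print(scaled(base_pixels, 400) / 1e9)  # 1.0  -> 1 Gpixel/sec
```

By the same proportion, a 90M-poly SGX540 at these ratios would imply an implausibly high clock, which is the point being made about the SGS.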
Pika007 said:
All the given numbers for "official" specs of PowerVR GPUs are for a frequency of 200 MHz.
Those chips can do well above 400 MHz, so, for example, if an SGX530 does 14M polygons and 500 Mpixels per second @ 200 MHz, clocking it up to 400 gives 28M polys / 1 Gpixel.
Though I extremely doubt Samsung has the SGX540 clocked at 600 MHz in the SGS...
Click to expand...
Click to collapse
This is true, but overclocking the GPU to those numbers is silly, because the memory and memory bus can't support that much data throughput anyway. I don't even think there's enough to support the standard clock rate. There is a lot more to consider than just the GPU when it comes to graphics.
You're taking that article way too seriously.
Plus, we have no idea what the bandwidth limit of the Galaxy S is; we don't know what kind of memory is used, how much, or at what frequency, etc.
WiseDuck said:
Who cares if the phone is powerful when there are no great games that take advantage of the power and the OS lags all the damn time, despite Quadrant giving me 2100+? Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.
Click to expand...
Click to collapse
+1
Re: lag, I wasn't doing badly until I installed one of the fixes. Now I've officially entered crazy-town.
If I had to guess, it has to do with S5PC110 optimizations. When rendering polygons there are many things that contribute aside from the GPU. Think of it as maybe similar to Hybrid SLI... (but this is just a guess).
If you want to look at it in more detail, someone posted the official documentation and spec sheet for the S5PC110 a while back. I didn't get a chance to look at it, but my guess is the clock speeds and other details are in there.
WiseDuck said:
Who cares if the phone is powerful when there are no great games that take advantage of the power and the OS lags all the damn time, despite Quadrant giving me 2100+? Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.
Click to expand...
Click to collapse
Well, I don't have any lag whatsoever after the lag fix. Something else must be troubling your phone. An auto memory manager is needed, though, if you want to keep it really snappy.
Sent from my GT-I9000 using XDA App
I know this might be a touchy topic to start, but I was seriously considering the Milestone 2 before the G2 was announced. When I heard the G2 wasn't even 1 GHz, I was pretty set on the Milestone 2. Then the benchmarks came out: http://www.gizmodo.com.au/2010/09/graphics-benchmark-for-g2-shows-it-should-be-blazin/
So now I am super confused.
Milestone 2 - Pros:
1 GHz processor (if the Milestone can be OCed to 1 GHz, who knows what the Milestone 2 could be OCed to?!)
Umm... that's all, I think.
Milestone 2 - Cons:
Locked bootloader (what does this mean exactly? No custom ROMs?)
MotoBLOATware (means slower Android updates)
Same camera as Milestone 1 (seriously...)
Motorola
G-2 - Pros:
HTC
Stock Android
Faster Android updates
Supposed to be good for Android Gingerbread as well
No locked bootloader (more custom ROMs?)
Not Motorola
G-2 - Cons:
800MHz Processor
Didn't perform as well as Droid 2 in stock benchmark tests
Less internal memory (4 GB vs 8 GB); not a big deal, I guess
Hinge design probably means it's easier to break
Does anybody think the development community for the G2 will be larger than the Milestone 2's? That would be a huge pro for the G2. G1 development has lasted for years and the user base is huge, so I'll assume it would be similar for the G2.
I'm not going to mention aesthetics, because that's very subjective, so there's no point arguing on that front.
So here's my question: Which one should I get?!
Hmm, tough one really.
HTC's cameras are not much better than the ones Motorola sticks in, to be honest! The Droid has crapware on it and the HTC comes with stock Sense.
It seems the processor performance is near identical, at least in real-world application use, so that's a non-issue; i.e., when one becomes outdated, so will the other.
In terms of mod community, the HTC phone is likely to get much more support on this website, simply because the majority of people here are HTC users or past HTC users.
The Milestone seems a bit more manly and rough, and the HTC looks more refined, IMHO.
Personally, I'd go the G2 if I wanted as much modding as possible. HTC has a much more open policy on the topic and no locked bootloaders (efuse etc).
At the end of the day, its your decision
Have a play with both phones and pick the one you feel looks better. A good question to ask is:
I know they both look nice now, but which one will look worse for wear (paint peeling, scratching, etc.)?
Yeah, I was leaning a little more towards the G2 due to the massive potential for modding/custom ROMs. I'm also quite sure the G2 will become the new platform for development, much like the G1 was.
The 800 MHz still bugs me, though, considering Qualcomm has new chipsets that are supposed to go up to 1.2/1.3 GHz while also running a GPU chip. I wouldn't wait for dual cores, because apparently that won't happen until next year (probably late next year).
HTC is supposed to make a pretty major announcement in London on September 15th (so we'll hear about it on the 16th), so hopefully that might shed some light. Of course, if the G2 is the only QWERTY option, that would limit our choices.
...if only they had made it a 1 GHz processor...
I wouldn't rate the 1 GHz feature that highly. Perhaps 800 MHz is an underclock to enhance battery life. After all, the benchmarks say performance is on par with the top phones.
99.9999% guaranteed you'll be able to overclock the G2 to at least 1 GHz.
IMO the only thing the M2 has over the G2 is how far you can OC the processor. Those A8s can go far.
I dunno; clock speed is overrated. You don't know what performs better in the real world until you see some benchmarks.
That's true. Apparently I read that an 800 MHz Droid 1 performs on par with an N1. The A8 can be OCed to all hell.
I'm definitely a fan of TI chips. They are mighty strong, and it always seemed they had a leg up on Qualcomm. I hope Qualcomm stomps everyone with some massive processor that'll make the Hummingbird cry.
The G2 is underclocked, according to the press release.
Mylenthes said:
The G2 is underclocked, according to the press release.
Click to expand...
Click to collapse
I heard. At the end of the day, an underclocked processor that runs as fast as a Snapdragon and wastes less power is a win in my book!
I have read that ARM procs are better with GPUs and for overclocking; just look at the Milestone 1. The Droid 2 was still benchmarked as the best (but only just over the G2), and its potential for OCing is what swayed me that way.
But now that these new Qualcomm chipsets have a separate GPU that seems to compete with the best in benchmarks, it's hard to say.
If the G2 can be OCed, that would be awesome... but then, the Milestone 2 is also about 99% sure to be OCed, as I'm sure previous Milestone 1 owners will be screaming for it as soon as they get their hands on the new one.
skulk3r said:
I have read that the ARM procs are better with GPUs and for overclocking; just look at the Milestone 1. The Droid 2 was still benchmarked as the best (but only just over the G-2), and its potential for OCing was what swayed me that way.
But now that these new Qualcomm chipsets have a separate GPU that seems able to compete with the best in benchmarking, it's hard to say.
If the G-2 can be OCed, then that would be awesome...but then, the Milestone 2 is also about 99% sure to be OCed...as I'm sure previous Milestone 1 owners will be screaming for it as soon as they get their hands on the new one.
I'm confused. I thought the Nexus had a separate GPU?
sheek360 said:
I'm confused. I thought the Nexus had a separate GPU?
As far as I know it does, but either the GPU isn't fully utilized by Android or it's just underpowered. See the benchmarking results I posted on the previous page: the Milestone 2 and G-2 smoke the N1 (which is expected, since the N1 is much older).
skulk3r said:
I know this might be a touchy topic to start, but I was seriously considering the Milestone 2 before the G-2 was announced. And when I heard that the G2 wasn't even 1GHz, I was pretty set on the Milestone 2. Then the benchmarks came out: http://www.gizmodo.com.au/2010/09/graphics-benchmark-for-g2-shows-it-should-be-blazin/
so now I am super confused.
Milestone 2 - Pros:
1GHz processor (if the Milestone can be OCed to 1GHz, who knows what the Milestone 2 could be OCed to?!)
Umm....that's all I think
Milestone 2 - Cons:
Locked bootloader (what does this mean exactly? no custom ROMs?)
MotoBLOATware (means slower Android updates)
Same camera as Milestone 1 (seriously...)
Motorola
G-2 - Pros:
HTC
Stock Android
Faster Android updates
Supposed to be good for Android Gingerbread as well
No locked bootloader (more custom ROMs?)
Not Motorola
G-2 - Cons:
800MHz Processor
Didn't perform as well as Droid 2 in stock benchmark tests
Less internal memory (4GB vs 8GB) - not a big deal I guess
Hinge design probably means it's easier to break
Does anybody think that the development for the G-2 would be larger than the Milestone 2? Because that would be a huge Pro for the G-2. The G-1 development has lasted for years, and the user base is huge...so I am just going to assume that it would be similar for the G-2.
I'm not going to mention aesthetics, because that's very subjective, so no point arguing on that front.
So here's my question: Which one should I get?!
Well If I made the list I would make it more like this:
Droid 2
Pro's:
-Higher-res screen (854*480 vs 800*480, both on a 3.7" screen)
-8GB internal storage (4GB more than the G2)
-1GHz TI 45nm processor (likely to overclock fairly well)
-1 oz lighter
Con's:
-MotoCRAP software (look at the scrolling on any phone with Blur, from the Cliq to the Droid X, and tell me it doesn't feel slow)
-keyboard lacking compared to the G2's
-locked bootloader (harder to develop ROMs for, root still possible though)
-CDMA (yeah it sucks... that's why no one outside the US uses it...)
G2
Pro's:
-Better screen (S-TFT LCD vs regular LCD on the Droid/2; brighter, more contrast, better power consumption)
-720p camcorder at 30FPS
-Adreno 205 GPU (4x better graphics than previous Snapdragons)
-800MHz MSM7230 45nm (this suggests it was designed to run at 1GHz but underclocked to save battery, similar to the MSM7201A on the G1, which was designed for 528MHz but only runs at 385MHz unmodified; should be overclockable to at least 1.13GHz)
-better keyboard (WWW.\.COM BUTTON? HELL YEAH!)
-HSPA+ (who doesn't want 14Mbps down?)
-GSM (GSM is always better....)
-likely to receive updates rapidly (no promises though; look at the 3 months it took Motorola to update their "Google Experience" device to 2.1)
-New sturdy Z-hinge design
Con's:
-ummm.......
Yeah, so what if the Droid 2 gets an extra 3 frames (58 vs 61)... your eye can only detect about 50FPS unless you're a combat pilot or a sniper with trained eyesight. Besides that, both phones have pretty identical specs, including 512MB RAM, 2.2, 10 hrs talk time, etc. My vote is definitely towards the G2, but don't get me wrong, both are beasts of a phone. Also, I wouldn't expect to see any dual cores until maybe next summer. Think about it: Qualcomm released the original Snapdragon in Nov 2008, but it wasn't until Dec 2009 that a phone implemented it (LG Expo). The dual-core Qualcomm chips come out next month, so it should be 6 months to a year before they show up in phones.
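A couple of the numbers in the list above are easy to sanity-check. Here's a quick sketch (plain Python, figures taken straight from the posts in this thread) of the resolution/PPI difference and the "designed for 1GHz" overclock-headroom guess:

```python
def ppi(width, height, diag_inches):
    """Pixels per inch from resolution and diagonal screen size."""
    return (width ** 2 + height ** 2) ** 0.5 / diag_inches

# Droid 2 vs G2, both quoted as 3.7" screens
print(854 * 480)                  # 409920 pixels
print(800 * 480)                  # 384000 pixels
print(round(ppi(854, 480, 3.7)))  # 265 PPI
print(round(ppi(800, 480, 3.7)))  # 252 PPI

# If the MSM7230 has the same headroom the MSM7201A had
# (rated 528MHz but shipped at 385MHz), then:
headroom = 528 / 385
print(round(800 * headroom))      # 1097 MHz, in line with the 1.13GHz claim
```

The resolution gap works out to about 7% more pixels and ~13 PPI, which squares with the "not a big deal" reading of the spec sheets.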
mejorguille said:
Well If I made the list I would make it more like this:
Droid 2
Pro's:
-Higher-res screen (854*480 vs 800*480, both on a 3.7" screen)
-8GB internal storage (4GB more than the G2)
-1GHz TI 45nm processor (likely to overclock fairly well)
-1 oz lighter
Con's:
-MotoCRAP software (look at the scrolling on any phone with Blur, from the Cliq to the Droid X, and tell me it doesn't feel slow)
-keyboard lacking compared to the G2's
-locked bootloader (harder to develop ROMs for, root still possible though)
-CDMA (yeah it sucks... that's why no one outside the US uses it...)
G2
Pro's:
-Better screen (S-TFT LCD vs regular LCD on the Droid/2; brighter, more contrast, better power consumption)
-720p camcorder at 30FPS
-Adreno 205 GPU (4x better graphics than previous Snapdragons)
-800MHz MSM7230 45nm (this suggests it was designed to run at 1GHz but underclocked to save battery, similar to the MSM7201A on the G1, which was designed for 528MHz but only runs at 385MHz unmodified; should be overclockable to at least 1.13GHz)
-better keyboard (WWW.\.COM BUTTON? HELL YEAH!)
-HSPA+ (who doesn't want 14Mbps down?)
-GSM (GSM is always better....)
-likely to receive updates rapidly (no promises though; look at the 3 months it took Motorola to update their "Google Experience" device to 2.1)
-New sturdy Z-hinge design
Con's:
-ummm.......
Yeah, so what if the Droid 2 gets an extra 3 frames (58 vs 61)... your eye can only detect about 50FPS unless you're a combat pilot or a sniper with trained eyesight. Besides that, both phones have pretty identical specs, including 512MB RAM, 2.2, 10 hrs talk time, etc. My vote is definitely towards the G2, but don't get me wrong, both are beasts of a phone. Also, I wouldn't expect to see any dual cores until maybe next summer. Think about it: Qualcomm released the original Snapdragon in Nov 2008, but it wasn't until Dec 2009 that a phone implemented it (LG Expo). The dual-core Qualcomm chips come out next month, so it should be 6 months to a year before they show up in phones.
Well technically the Milestone 2 is a GSM device, not CDMA, but the other points are quite good.
However, don't forget that the Milestone 2 can also OC...so the two are comparable in that sense.
Didn't I read somewhere that Android doesn't yet support hardware acceleration for its UI? Meaning that spiffy new GPU will only get used in games/video, etc.? I'm no expert on this; perhaps someone could chip in and explain...
I too am facing the same decision soon. I love my Desire, but I will never buy a keyboardless phone again, and will chop it in for one of these two as soon as I can.
I was leaning more towards the Milestone 2 as it looks badass in my opinion, and there is no major difference in the specs. That was until I read up about locked bootloaders, and the fact that Milestone 1 owners are still on Android 2.1 and Motorola just doesn't give a sh*t about the numerous bugs the phone has.
Just a quick search around forums/Facebook Motorola Europe page etc, shows how unhappy Milestone 1 owners are with Motorola. Page after page of people saying "I will NEVER buy Motorola again" and literally begging Motorola to unlock the bootloader before abandoning the phone (all met with a deafening silence from Motorola) does turn me off of the Milestone 2. Motorola won't fix the phone and won't give their customers the tools to fix it themselves, so as far as I'm concerned I'm not going to spend £400-500 to put myself in that same position with the Milestone 2.
On the HTC side of things, they do make attempts to stop people modding their phones, but have not yet gone as far as locking the bootloader, and every HTC phone has been compromised. I fully expect this to be the case with the G2/Desire Z (Desire Z - what a sh*t name!), and the phone WILL get a lot of dev support, no question.
Ultimately, I believe there is no choice for me: pain with Motorola, or fun with HTC/XDA devs!
I'm still not sure about the hinge action, nor do I like the looks of the G2 particularly, I think it's going to turn out to be a bit of a fat chunker! Still, I go for personality and functionality in my phone rather than looks, otherwise I'd have an iPhone! It's also a crying shame that HTC went with a 4 row rather than 5 row keyboard.... And one final request please HTC - make sure that screen is full multitouch please! Oh, and I have heard this bad-boy is going to have stock Android with HTC Sense widgets - that's all well and good, but what I'm interested in is the Sense Dialer. And the Sense browser text selection please - I want that available everywhere on the phone, please!
Uh oh - just read the G2 has just a 1300mAh battery.
HTC, what is this twattery? I don't want to go backwards with battery size! Yes I know, more efficient processor, blah blah, lower clock speed, yadda yadda - but I don't give a monkey's!
A smaller battery in a bigger handset than the original Desire is not good enough in my mind. The G2 is going to be a brick anyway - why not add 5mm to the length and give us an extra 200-300mAh? Or perhaps if you'd used a more standard slider action then you could've fitted a beefier battery in there :-(
setspeed said:
Didn't I read somewhere that Android doesn't yet support hardware acceleration for its UI? Meaning that spiffy new GPU will only get used in games/video, etc.? I'm no expert on this; perhaps someone could chip in and explain...
It's true, but the next Android version, Gingerbread, adds it, which is where handsets with weaker GPUs will begin to struggle. Running AOSP means it will get updates at the same speed as the Nexus One.
However, the Clove spec for the Desire Z has it as a Sense device. This is backed up by the earlier screenshot of the Desire Z with the default Sense wallpaper, and by the name (why would they give it Desire branding if it didn't have Sense?). So there's a chance they'll put in the 8X55, which is the 1GHz version of the same processor but doesn't support HSPA+ (which isn't in the UK anyway). So we'd gain 200MHz in return for Sense and an uglier handset colour scheme.
UK HTC event is tomorrow, so we should find out then.
Adreno 205 GPU, 4x better graphics than previous Snapdragons.
I think it's the only one that has hardware-accelerated Adobe Flash support.
I remember seeing reviews of the Milestone 2 and other Android phones where reviewers said the phone would crawl when browsing Flash-enabled websites.
So maybe that feature will make a big difference?
I think I've decided: G2
setspeed said:
Didn't I read somewhere that Android doesn't yet support hardware acceleration for its UI? Meaning that spiffy new GPU will only get used in games/video, etc.? I'm no expert on this; perhaps someone could chip in and explain...
I too am facing the same decision soon. I love my Desire, but I will never buy a keyboardless phone again, and will chop it in for one of these two as soon as I can.
I was leaning more towards the Milestone 2 as it looks badass in my opinion, and there is no major difference in the specs. That was until I read up about locked bootloaders, and the fact that Milestone 1 owners are still on Android 2.1 and Motorola just doesn't give a sh*t about the numerous bugs the phone has.
Just a quick search around forums/Facebook Motorola Europe page etc, shows how unhappy Milestone 1 owners are with Motorola. Page after page of people saying "I will NEVER buy Motorola again" and literally begging Motorola to unlock the bootloader before abandoning the phone (all met with a deafening silence from Motorola) does turn me off of the Milestone 2. Motorola won't fix the phone and won't give their customers the tools to fix it themselves, so as far as I'm concerned I'm not going to spend £400-500 to put myself in that same position with the Milestone 2.
On the HTC side of things, they do make attempts to stop people modding their phones, but have not yet gone as far as locking the bootloader, and every HTC phone has been compromised. I fully expect this to be the case with the G2/Desire Z (Desire Z - what a sh*t name!), and the phone WILL get a lot of dev support, no question.
Ultimately, I believe there is no choice for me: pain with Motorola, or fun with HTC/XDA devs!
I'm still not sure about the hinge action, nor do I like the looks of the G2 particularly, I think it's going to turn out to be a bit of a fat chunker! Still, I go for personality and functionality in my phone rather than looks, otherwise I'd have an iPhone! It's also a crying shame that HTC went with a 4 row rather than 5 row keyboard.... And one final request please HTC - make sure that screen is full multitouch please! Oh, and I have heard this bad-boy is going to have stock Android with HTC Sense widgets - that's all well and good, but what I'm interested in is the Sense Dialer. And the Sense browser text selection please - I want that available everywhere on the phone, please!
Waiting to see what new things HTC has "dreamt" up for their announcement on the 15th (in London)... but I don't expect things to be any different from what we know now.
Yes, the battery is a crying shame, but tbh I've charged my phone every night ever since I got my Blackstone... and I also have a car charger... but it would still be nice to have a phone that lasts more than 2 days.
I'm not sure I fully understand the bootloader stuff... I've seen YouTube videos of people running something called a "Bugless Beast" ROM on an OCed Milestone 1... but I agree, Motorola are pretty bad with customer service and post-sales support. HTC, on the other hand, are pretty happy to turn a blind eye to the modding community - as all Android manufacturers should, since Android is technically an open-source platform.
Oh..also...I don't really care about the name Desire Z, lol.....just a name. They could call it "The Loser Phone" and I'd still probably get it
I have to go and pay my bill up to date tomorrow. I am very seriously thinking about the evo shift for obvious reasons. Does anyone have any thoughts on this subject or actually bought it? I'm interested in what you have to say.
herbthehammer said:
I have to go and pay my bill up to date tomorrow. I am very seriously thinking about the evo shift for obvious reasons. Does anyone have any thoughts on this subject or actually bought it? I'm interested in what you have to say.
I'm not really sure what those obvious reasons are. The EVO Shift 4G has a slower processor, worse GPU, smaller screen, LCD instead of SAMOLED, and on all other points save Android 2.2 just about comes even with the Epic 4G. It's an attractive phone, and it probably has reasonable build quality (haven't had one in my hands yet), but I fail to see why it would be worth switching from an Epic 4G.
Trade the best phone on sprint for a midrange phone? GREAT IDEA. /s
Electrofreak said:
I'm not really sure what those obvious reasons are. The EVO Shift 4G has a slower processor, worse GPU, smaller screen, LCD instead of SAMOLED, and on all other points save Android 2.2 just about comes even with the Epic 4G. It's an attractive phone, and it probably has reasonable build quality (haven't had one in my hands yet), but I fail to see why it would be worth switching from an Epic 4G.
I don't believe the processor is slower. Just because it has a slower clock speed doesn't make it slower.
Sent from my SPH-D700 using XDA App
True... but it's slower. Hummingbird is the fastest mobile processor until the dual cores come out.
Sent from my SPH-D700 using XDA App
Electrofreak said:
I'm not really sure what those obvious reasons are. The EVO Shift 4G has a slower processor, worse GPU, smaller screen, LCD instead of SAMOLED, and on all other points save Android 2.2 just about comes even with the Epic 4G. It's an attractive phone, and it probably has reasonable build quality (haven't had one in my hands yet), but I fail to see why it would be worth switching from an Epic 4G.
The only really good thing about the sliding keyboard is that it has no spring to kick it out/back in, so there's less chance of snapping/breaking anything. But it's annoying to slide. The Quadrant score on my store demo was 1298, beating our demo EVO by 200, so it's actually not too bad for speed.
I could swap phones with my old lady. She's got my evo. She doesn't really care about bigger badder better. Stuff I didn't like with the evo was its battery life, the radios were kinda deaf, and it wasn't very tolerant to temperature in the summer. Taking samsung out of the picture really narrows down the choices.
Yes, I'm impatient waiting for the real Samsung update. Those are my obvious reasons. Plus, there's a lot more community development for HTC compared to the Epic. Which will come first, official Froyo or final (not beta or RC) Cyanogen? That is the $164,000 question. No 4G love either yet. I know, I'm ungrateful and too picky...
My wife traded her Transform for one today; it pulled a 1634 in Quadrant and over 34 in Linpack before we left the parking lot... maxed at 59FPS too. 100% stock, obviously.
Sent from my HERO200 using XDA App
I'm due for an upgrade in Feb... it was going to be an EVO, but damn, that phone flies!
Sent from my HERO200 using XDA App
Nope, it'll take something better than the Epic for that.
I'd love an Epic with no 4G.
It seems largely based off the G2 to me... so it should be similar?
And even though it's 800MHz, it's still better than the EVO's at 1GHz.
Also, it should be a whole lot more dev-friendly, since HTC likes to reuse the same stuff, so it should be easy to OC it to a stable 1.6GHz like the G2, or an unstable 2GHz bull.
But I think that'd still be a major downgrade from the Epic.
The only upside in my opinion is the better development it's guaranteed to have, or at least easier development, since it's basically already been worked on so much as the Desire Z.
Sent from my SPH-D700 using XDA App
muyoso said:
Trade the best phone on sprint for a midrange phone? GREAT IDEA. /s
I'm going to trade my Epic in tomorrow for a Sanyo M1, lol.
Sent from my SPH-D700 using XDA App
No, but I am jumping ship for the atrix!
Boo
raylusk said:
I don't believe the processor is slower. Just because it has a slower clock speed doesn't make it slower.
Sent from my SPH-D700 using XDA App
Trust me, it's slower. I'm a tech blogger and I've written about Hummingbird and Snapdragon. Google "Hummingbird vs. Snapdragon" and click the first link.
The Hummingbird beats Snapdragon MHz for MHz in processing power due to some heavy tweaking by the engineers at Intrinsity (which is now owned by Apple, the bastards!)
The MSM7630 in the HTC Knight is Qualcomm's 2nd-gen "low-end" Snapdragon. It's not really a Snapdragon per se (Qualcomm chose to omit that brand from the MSM line of SoCs), but it's got the Scorpion CPU inside, which is the backbone of the Snapdragon platform. It features a better GPU than its Snapdragon predecessors, an Adreno 205 instead of an Adreno 200, which just about doubles graphics performance from what I've seen. However, it still doesn't come close to the PowerVR SGX540 our Epic 4G is rocking, which is still nearly twice as powerful as the Adreno 205. One other improvement is that the MSM7630 is manufactured at a 45nm feature size, whereas first-gen Snapdragons like the one in the EVO 4G run on a 65nm process and are significantly less power efficient. The Epic 4G, however, is already on the 45nm process and achieves about the same level of power efficiency.
From a hardware standpoint, going from an Epic 4G to an HTC EVO Shift 4G is a downgrade in every way.
Electrofreak said:
Trust me, it's slower. I'm a tech blogger and I've written about Hummingbird and Snapdragon. Google "Hummingbird vs. Snapdragon" and click the first link.
The Hummingbird beats Snapdragon MHz for MHz in processing power due to some heavy tweaking by the engineers at Intrinsity (which is now owned by Apple, the bastards!)
The MSM7630 in the HTC Knight is Qualcomm's 2nd-gen "low-end" Snapdragon. It's not really a Snapdragon per se (Qualcomm chose to omit that brand from the MSM line of SoCs), but it's got the Scorpion CPU inside, which is the backbone of the Snapdragon platform. It features a better GPU than its Snapdragon predecessors, an Adreno 205 instead of an Adreno 200, which just about doubles graphics performance from what I've seen. However, it still doesn't come close to the PowerVR SGX540 our Epic 4G is rocking, which is still nearly twice as powerful as the Adreno 205. One other improvement is that the MSM7630 is manufactured at a 45nm feature size, whereas first-gen Snapdragons like the one in the EVO 4G run on a 65nm process and are significantly less power efficient. The Epic 4G, however, is already on the 45nm process and achieves about the same level of power efficiency.
From a hardware standpoint, going from an Epic 4G to an HTC EVO Shift 4G is a downgrade in every way.
Do you think it would be possible for you to do the same type of review on the camera, comparing the Epic against the EVO and the iPhone 4? I read your last blog on the processors, and I must say, dude, I'm pretty technical and you blew me away with your analysis!
boominz28 said:
Do you think it would be possible for you to do the same type of review on the camera, comparing the Epic against the EVO and the iPhone 4? I read your last blog on the processors, and I must say, dude, I'm pretty technical and you blew me away with your analysis!
Eh, I'm not a photo guy, I don't do cameras. I don't know jack about all that stuff and I'm afraid I wouldn't do a very good job. :-\
I have been getting bugged to do a review of Cortex-A9 (specifically Tegra 2, Orion, OMAP 4400, etc.) as well as the 2nd/3rd-gen Snapdragons, but I'm working on my CCNA and MCITP certifications right now and I'm trying not to let myself get too distracted; once I start researching and writing, all of my spare time gets flushed down the crapper!
flawlessbmxr said:
the only really good thing with the sliding keyboard is that it has no spring for it to kick out/back in. so less of a chance of it snaping/breaking anything. but its annoying to slide it. quadrant score on my store demo got 1298. beat our demo Evo by 200. so its not too bad actually for speed
the5ifty said:
My wife traded her Transform for one today; it pulled a 1634 in Quadrant and over 34 in Linpack before we left the parking lot... maxed at 59FPS too. 100% stock, obviously.
Sent from my HERO200 using XDA App
The problem is that you guys are using Quadrant and Linpack. Snapdragon CPUs always perform better in Quadrant because it isn't a well-built benchmark, and Linpack is fooled by Snapdragon's VFP (Vector Floating Point) extension.
I'd like to see the scores it gets on Smartbench 2010, a new benchmark tool that I'm still not sure I trust completely but definitely seems to be more accurate than Quadrant.
herbthehammer said:
I could swap phones with my old lady. She's got my evo. She doesn't really care about bigger badder better. Stuff I didn't like with the evo was its battery life, the radios were kinda deaf, and it wasn't very tolerant to temperature in the summer. Taking samsung out of the picture really narrows down the choices.
Yes, I'm impatient waiting for the real Samsung update. Those are my obvious reasons. Plus, there's a lot more community development for HTC compared to the Epic. Which will come first, official Froyo or final (not beta or RC) Cyanogen? That is the $164,000 question. No 4G love either yet. I know, I'm ungrateful and too picky...
Have you guys read any posts from the senior members/devs? QUADRANT IS A JOKE. It caters to Snapdragon SoCs; Quadrant means nothing for real-life speed.
sent from my brain telepathically :-D
herbthehammer said:
I have to go and pay my bill up to date tomorrow. I am very seriously thinking about the evo shift for obvious reasons. Does anyone have any thoughts on this subject or actually bought it? I'm interested in what you have to say.
If you're gonna trade in, trade in for an upgrade like a Tegra 2 chipset phone, not a downgrade to an HTC EVO Shift.
FYI, Sprint has no Tegra 2 phones in their stables right now.
Go with the EVO 4G, or something other than a Samsung phone. I just swapped mine for my wife's EVO and I don't regret it! Support is a lot better (meaning custom ROMs, etc.). The only thing I miss is the SAMOLED, that's it. The EVO Shift feels very smooth, just like a G2, and once someone ports that Desire Z ROM it's going to be very nice!
For me, the most important downgrade would be the lack of a front-facing camera on the Shift. When I am deployed, it would be much nicer to be able to video-chat with my wife when I can find a wifi spot with my Epic.
Two years ago I bought an Incredible. I could have waited a month for the Fascinate, and I'm glad I didn't, but something began to bug me: the Adreno 200 was very underpowered. Fast forward to now, and I'm forced to upgrade to keep my unlimited data. The obvious choice is the upcoming Galaxy S3, so I pre-ordered one. I can't help but wonder if I'm repeating my last phone purchase by buying hardware with a GPU that simply won't hold up in the future.
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed. This upset many QSD-based device owners, especially Nexus One adopters. I know ICS ROMs are continually improving for QSD-based phones, but they'll always lag. Meanwhile, the Fascinate has received great ICS support due to having the same GPU as the Galaxy Nexus. One month could have changed my future-proofing experience, but inevitably the Fascinate had a bunch of issues I'm glad I didn't have to deal with.
I know hardware becomes obsolete. It happens, I get it. We all want to try and do the best we can though, especially those of us on Verizon with unlimited data; this is our last subsidized upgrade allowing us to retain unlimited data.
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's performance in this test zooms ahead of the Qualcomm variants by double.
Will the US Galaxy S3 withstand the test of time and provide future proof hardware for a reasonable amount of time? Or will it fall short of the two year expected lifespan like the QSD Adreno 200 devices did?
I am uncertain, and I wish Qualcomm would seriously step up its GPU game. Am I alone in this line of thinking?
[I originally posted this in the international forum, but felt it belonged here.]
Div033 said:
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed.
I thought the biggest problem with the Nexus One was the limited space for system files. Other Adreno 200 devices, such as the Evo 4G, have Android 4.0 running on them, and I hear it works really well.
I know that the early Exynos found on the Nexus S also works quite well.
I think any modern chipset easily surpasses the performance required for the type of GPU tasks being implemented at the system level. Games are still a concern, but compatibility is more of an issue there than performance, and the Adreno 225 is popular enough that it should be supported.
But there's always next year for really kick-ass GPUs.
Div033 said:
Two years ago I bought an Incredible. I could have waited a month for the Fascinate, and I'm glad I didn't, but something began to bug me: the Adreno 200 was very underpowered. Fast forward to now, and I'm forced to upgrade to keep my unlimited data. The obvious choice is the upcoming Galaxy S3, so I pre-ordered one. I can't help but wonder if I'm repeating my last phone purchase by buying hardware with a GPU that simply won't hold up in the future.
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed. This upset many QSD-based device owners, especially Nexus One adopters. I know ICS ROMs are continually improving for QSD-based phones, but they'll always lag. Meanwhile, the Fascinate has received great ICS support due to having the same GPU as the Galaxy Nexus. One month could have changed my future-proofing experience, but inevitably the Fascinate had a bunch of issues I'm glad I didn't have to deal with.
I know hardware becomes obsolete. It happens, I get it. We all want to try and do the best we can though, especially those of us on Verizon with unlimited data; this is our last subsidized upgrade allowing us to retain unlimited data.
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's performance in this test zooms ahead of the Qualcomm variants by double.
Will the US Galaxy S3 withstand the test of time and provide future proof hardware for a reasonable amount of time? Or will it fall short of the two year expected lifespan like the QSD Adreno 200 devices did?
I am uncertain, and I wish Qualcomm would seriously step up its GPU game. Am I alone in this line of thinking?
[I originally posted this in the international forum, but felt it belonged here.]
Well, you also have to look at the resources available and the time constraints. The introduction of LTE in the US probably forced said chip maker to make some concessions. What they lost in state-of-the-art GPU they gained in the ridiculous profit they made this year, because theirs is the only chipset that includes LTE. From their perspective, they've won the war thus far.
I agree with this line of thinking. As an earlier poster said, the Evo 4G had the Adreno 200. I use N64oid all the time. The Evo would struggle with games that the first Galaxy S family had no problem at all with. I have since switched to the Motorola Photon 4G, and the Tegra 2 (Nvidia GeForce GPU). It handles both emulators, as well as high end games so much better than my Evo did. I would have already pre-ordered the S3 if it weren't for this.
real world performance > benchmarks
My Sensation has an Adreno 220 and it plays every game and movie just fine. Sure, it doesn't get the best benchmark numbers, but it more than holds its own in any game. I'm sure the Adreno 225 will hold up just fine over the next couple of years. In fact, I still love my Sensation. Side by side, it's still just as fast as most phones out there. You only see a difference when running benchmarks, which isn't practical. I personally don't care if I'm getting 200FPS or 50; it's not like anyone can tell the difference.
I also want to note that the 220 is crazy powerful compared to the 200 and 205. It was the first GPU that Qualcomm seemed to really take a stab at gaming with. I'm fine with the 220 and can't wait to begin using the 225.
bradleyw801 said:
I agree with this line of thinking. As an earlier poster said, the Evo 4G had the Adreno 200. I use N64oid all the time, and the Evo would struggle with games that the first Galaxy S family had no problem at all with. I have since switched to the Motorola Photon 4G with the Tegra 2 (Nvidia GeForce GPU). It handles both emulators and high-end games so much better than my Evo did. I would have already pre-ordered the S3 if it weren't for this.
As nativestranger said in another thread,
"The 225 despite its deceptive naming is 4-6x the performance of the 205 and roughly 1.6-2x the 220."
Also, performance on the Evo cannot be judged solely on the GPU (Adreno 200). CPU, RAM, resolution, etc. have a ton to do with it as well.
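For what it's worth, the two quoted ranges imply a third number. Here's a tiny sketch; the midpoints are my own reading of the quoted ranges, not official Qualcomm figures:

```python
# Forum-quoted speedup estimates (ballpark midpoints, not official numbers).
speedup_225_over_205 = 5.0   # midpoint of the quoted "4-6x"
speedup_225_over_220 = 1.8   # midpoint of the quoted "1.6-2x"

# Implied speedup of the 220 over the 205, per these estimates:
implied_220_over_205 = speedup_225_over_205 / speedup_225_over_220
print(round(implied_220_over_205, 2))  # ~2.78
```

So the quoted ranges also imply the 220 was itself a roughly 3x jump over the 205, which is consistent with how posters here describe that generation.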
Will the GPU do better in this phone, due to the extra RAM, compared to the One series with the same S4/225 combo?
Div033 said:
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's score in this test is roughly double that of the Qualcomm variants.
[I originally posted this in the international forum, but felt it belonged here.]
You should also note that the GS2 only had a 480 x 800 resolution (384,000 total pixels), and even at that much lower resolution its score was only slightly higher in that test, whereas the GS3 is pushing 720 x 1280 (921,600 total pixels). That means the GS3 is working 2.4 times harder than the GS2, and it delivers almost the same gaming performance at worst and better performance in other tests. That's not bad if you ask me, seeing as how we all thought the GS2 was a powerhouse just 12 months ago.
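The pixel arithmetic there is easy to verify; a quick check, with the resolutions taken straight from the post:

```python
# Total pixels behind the "working 2.4 times harder" claim.
gs2_pixels = 480 * 800     # Galaxy S2: WVGA
gs3_pixels = 720 * 1280    # Galaxy S3: 720p
workload_ratio = gs3_pixels / gs2_pixels
print(gs2_pixels, gs3_pixels, workload_ratio)  # 384000 921600 2.4
```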
incubus26jc said:
As nativestranger said in another thread,
"The 225 despite its deceptive naming is 4-6x the performance of the 205 and roughly 1.6-2x the 220."
Also, performance on the Evo cannot be judged solely on the GPU (Adreno 200). CPU and RAM have a ton to do with it as well.
Agreed. I haven't seen anyone suffering from GPU woes other than benchmark nuts who obsess over the highest score... everyone actually using it for real things says it works great. And honestly, my take is that if you want gaming performance, don't use a phone; plug your 360 into your big screen and kick ass from the couch in HD and surround sound.
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two-three versions of android without being hardware bottlenecked.
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
As far as popularity of a chipset goes, it's become evident that this factor does not affect how long manufacturers will support it. The Nexus One's QSD chip was one of the most popular chipsets around at the time, but the phone still did not receive ICS. I know they claimed space restrictions were the reason, but I find this highly unlikely considering the other, more limiting factors.
Maybe the 225 will be good enough for future android versions like key lime pie and licorice or whatever they call it.
Sent from my Droid Incredible using the XDA app.
Div033 said:
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two-three versions of android without being hardware bottlenecked.
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
As far as popularity of a chipset goes, it's become evident that this factor does not affect how long manufacturers will support it. The Nexus One's QSD chip was one of the most popular chipsets around at the time, but the phone still did not receive ICS. I know they claimed space restrictions were the reason, but I find this highly unlikely considering the other, more limiting factors.
Maybe the 225 will be good enough for future android versions like key lime pie and licorice or whatever they call it.
Sent from my Droid Incredible using the XDA app.
Well, it might interest you to know that the Adreno 225 supports Direct3D feature level 9_3 and texture compression, where the Mali 400 does not. It's a requirement for Windows 8. Now, you might say "so what," but I for one plan on trying to dual boot, or even run a version of Windows RT, perhaps on a virtual machine. That's something else I do believe the S4 Krait/Adreno package supports natively that the Exynos/Mali doesn't.
Sent from my DROIDX using xda premium
Div033 said:
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
This is almost certainly a RAM issue. With the ridiculous 2GB on this phone, I would hope you wouldn't run into issues until Android at least wraps the alphabet.
Voltage Spike said:
This is almost certainly a RAM issue. With the ridiculous 2GB on this phone, I would hope you wouldn't run into issues until Android at least wraps the alphabet.
+1. I do believe more RAM (the largest among this generation of phones) matters for long-term usage.
The GPU is fine.
Guys, this is a Galaxy S phone. The newest one at least.
It is GUARANTEED to get a Jelly Bean update from Samsung (albeit late). It is also most likely getting at least 1 or 2 more major Android updates because of XDA.
Remember, ALL OF US have the SAME Galaxy S3. That is a LOT of devs who will be working on it.
Don't worry about that. It will come with time.
Div033 said:
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two-three versions of android without being hardware bottlenecked.
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
As far as popularity of a chipset goes, its become evident that this factor does not affect how long manufacturers will support it. The Nexus One had a QSD chip and was one of the most popular chipsets around at the time but it still did not receive ICS. I know they claimed space restrictions were the reason but I find this highly unlikely considering the other more limiting factors.
Maybe the 225 will be good enough for future android versions like key lime pie and licorice or whatever they call it.
Sent from my Droid Incredible using the XDA app.
You simply can't compare the Adreno 200 of first-generation Snapdragon devices with the 225 of current devices. The Adreno 200, even for its time, was woefully weak. The 205 that followed was easily 3x the performance or more; most devices based on that GPU, including the Xperia Play, actually had no problem playing the latest games. The 220 saw a similarly huge increase. Qualcomm eased off a little with the 225 by using the same architecture but building it on a smaller process and increasing the clocks/memory bandwidth, resulting in a 1.6x-2x performance increase. Hence the comments about it being a lame upgrade. However, look at AMD's desktop 6000 series: the upgrade was less than 20% over the previous year, and people universally praised the 6970. This is how gullible fanboyism can result in a strongly skewed perception of actual results.
nativestranger said:
You simply can't compare the Adreno 200 of first-generation Snapdragon devices with the 225 of current devices. The Adreno 200, even for its time, was woefully weak. The 205 that followed was easily 3x the performance or more; most devices based on that GPU, including the Xperia Play, actually had no problem playing the latest games. The 220 saw a similarly huge increase. Qualcomm eased off a little with the 225 by using the same architecture but building it on a smaller process and increasing the clocks/memory bandwidth, resulting in a 1.6x-2x performance increase. Hence the comments about it being a lame upgrade. However, look at AMD's desktop 6000 series: the upgrade was less than 20% over the previous year, and people universally praised the 6970. This is how gullible fanboyism can result in a strongly skewed perception of actual results.
Fair enough. I suppose you're right; the Adreno 200 was already severely underpowered at launch. The 225 may not be the best, but it's still up among the top-tier GPUs. I guess I have nothing to worry about. The 2GB of RAM is definitely nice too.
Sent from my Droid Incredible using the XDA app.
Just put this here for a reference: [attached benchmark comparison screenshot]
The Nexus One is running the Adreno 200. The HTC One V with the Adreno 205 is over 5x faster. The Rezound has an Adreno 220, over 3x faster than the One V while also running more than 2x the resolution. The GS3 with the Adreno 225 is hard up against the vsync wall in the hoverjet test and about 3x faster than the Rezound in the Egypt test. It's amazing how much Adreno has improved in just 2 years: from 2 fps on the 200 to never dropping below 60 fps on the 225.
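Chaining those rough per-generation figures (the poster's ballpark readings of benchmark screenshots, not official numbers) gives a feel for the cumulative jump:

```python
# Cumulative speedup implied by the rough per-generation estimates above.
# All multipliers are ballpark figures read off benchmark results.
generation_speedups = [
    ("Adreno 205 vs 200", 5.0),  # "over 5x faster"
    ("Adreno 220 vs 205", 3.0),  # "over 3x faster" (ignoring resolution)
    ("Adreno 225 vs 220", 3.0),  # "about 3x faster" in the Egypt test
]

cumulative = 1.0
for _, speedup in generation_speedups:
    cumulative *= speedup
print(cumulative)  # 45.0 -> roughly a 45x jump in about two years
```

That order of magnitude loosely lines up with the 2 fps-to-60 fps observation, especially since the later chips are also pushing far more pixels and the 60 fps figure is capped by vsync.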
Thank you, this helps me make my decision too. ^ Also, does having a higher-resolution screen make graphics look better? NOVA 3 on my SGS2 looks awesome; all the effects are there, bullet smoke and whatnot. So will those effects, or the graphics in general, look better on the SGS3's screen?
Thanks!
Your GPU will be just fine. Other posters have already shown that it is perfectly competitive with the GPUs in other top-tier phones of this generation, and most top-tier phones sold in the US since the beginning of the year have run S4 SoCs. It comes down to LTE, something the rest of the world doesn't have (for the most part, and nowhere near our level). I for one would much rather give up an absolutely world-crushing GPU than be stuck on 3G forever.
Also, keep in mind that the Galaxy S3 is on track to be the fastest-selling device ever. The American market is huge and is home to many of the major players in this industry (including, of course, Google themselves), not to mention that Samsung seems to want to treat the two devices (US and international versions) as one and the same. It's not like they'll want all those customers to have a GPU that'll make the phone feel old in 3 months, so I wouldn't worry.
And honestly, I don't really see Android becoming significantly more hardware-intensive anytime soon. The current/upcoming generation of hardware can beat many people's PCs in terms of UI navigation, launching apps, and even running games. Two HUGE things Google talked about with Jelly Bean were introducing the PDK and Project Butter. This shows that they recognize that some of the platform's biggest weaknesses were its slow, inconsistent updates and its perceived lower performance compared to iOS due to the UI. From what I have seen of Jelly Bean in videos, I don't see much further room to speed up the UI; it's already about as fast and smooth as it can get, it seems. I would imagine screen resolution won't be making huge jumps in the immediate future; there'll be some 1080p screens, but I doubt people will demand that in a sub-5" device they hold a foot in front of their face, considering the negative impact on battery life.
What I'm trying to say is that I don't see Android really demanding significantly more powerful hardware to keep growing, as it already runs about as smooth and fast as possible on hardware we're already calling outdated. Sure, more RAM and more powerful CPUs may let you multitask with more apps without slowdowns, and better GPUs can run newer games even better, but I just don't see the system itself requiring significantly better hardware than we have now for a few generations. It looks like we'll be more focused on multitasking performance, efficiency/battery life, and manufacturing costs, so everyone can enjoy the kind of performance that's currently reserved for us nuts buying new top-tier phones every 8 months.
So again, no, I wouldn't worry. Sure, it's not the best thing out there and will soon be outclassed, but I don't see it handicapping Android anytime soon, especially with 2GB of RAM.
Cruiserdude said:
Your GPU will be just fine. Other posters have already shown that it is perfectly competitive with the GPUs in other top-tier phones of this generation, and most top-tier phones sold in the US since the beginning of the year have run S4 SoCs. It comes down to LTE, something the rest of the world doesn't have (for the most part, and nowhere near our level). I for one would much rather give up an absolutely world-crushing GPU than be stuck on 3G forever.
Also, keep in mind that the Galaxy S3 is on track to be the fastest-selling device ever. The American market is huge and is home to many of the major players in this industry (including, of course, Google themselves), not to mention that Samsung seems to want to treat the two devices (US and international versions) as one and the same. It's not like they'll want all those customers to have a GPU that'll make the phone feel old in 3 months, so I wouldn't worry.
And honestly, I don't really see Android becoming significantly more hardware-intensive anytime soon. The current/upcoming generation of hardware can beat many people's PCs in terms of UI navigation, launching apps, and even running games. Two HUGE things Google talked about with Jelly Bean were introducing the PDK and Project Butter. This shows that they recognize that some of the platform's biggest weaknesses were its slow, inconsistent updates and its perceived lower performance compared to iOS due to the UI. From what I have seen of Jelly Bean in videos, I don't see much further room to speed up the UI; it's already about as fast and smooth as it can get, it seems. I would imagine screen resolution won't be making huge jumps in the immediate future; there'll be some 1080p screens, but I doubt people will demand that in a sub-5" device they hold a foot in front of their face, considering the negative impact on battery life.
What I'm trying to say is that I don't see Android really demanding significantly more powerful hardware to keep growing, as it already runs about as smooth and fast as possible on hardware we're already calling outdated. Sure, more RAM and more powerful CPUs may let you multitask with more apps without slowdowns, and better GPUs can run newer games even better, but I just don't see the system itself requiring significantly better hardware than we have now for a few generations. It looks like we'll be more focused on multitasking performance, efficiency/battery life, and manufacturing costs, so everyone can enjoy the kind of performance that's currently reserved for us nuts buying new top-tier phones every 8 months.
So again, no, I wouldn't worry. Sure, it's not the best thing out there and will soon be outclassed, but I don't see it handicapping Android anytime soon, especially with 2GB of RAM.
I see, lol. Well, that's good; I don't wanna have to buy a new phone every half a year! But will the HD resolution make any of the Gameloft games look better than they do on my Galaxy S2 with the Mali 400 GPU? Thanks!
Sent from my SPH-D710 using XDA Premium App