First the G2, now the Lexicon:
http://phandroid.com/2010/09/20/htc-lexikon-looks-to-be-next-verizon-droid/
Sure the clock speed is lower, but reports are saying that the processor is actually faster. And the battery usage will probably be a lot better too.
I'm a sucker for performance and have always said I'd stick with the N1 until the next CPUs come out. Finally... Has the next era in mobile CPUs finally begun?
Next era? No. First 1.5GHz+ single cores, then dual core.
Paul22000 said:
First the G2, now the Lexicon:
http://phandroid.com/2010/09/20/htc-lexikon-looks-to-be-next-verizon-droid/
Sure the clock speed is lower, but reports are saying that the processor is actually faster. And the battery usage will probably be a lot better too.
I'm a sucker for performance and have always said I'd stick with the N1 until the next CPUs come out. Finally... Has the next era in mobile CPUs finally begun?
It's faster because the GPU is a logically separate device. I expect Linpack scores to be somewhat slower, but Quadrant scores to be faster.
"Next era"? No. 7x30 isn't a direct successor to 8x50, having the same CPU but different GPU and some other internal differences (for example, LPDDR2 support appears on Github). Just read Qualcomm's own product description:
http://www.qualcomm.com/products_services/chipsets/snapdragon.html
It's called "second generation" because of HSPA+, much better GPU, 45nm process, additional video codecs support, newer GPS, and some other bits and pieces. It's an overall better device. But if you count only the CPU area - it loses to Nexus. Same CPU, clocked lower. 8x55 is equal in CPU power.
If you're looking for the real next generation in power - look for 3rd generation devices, with dual core CPUs.
Jack_R1 said:
"Next era"? No. 7x30 isn't a direct successor to 8x50, having the same CPU but different GPU and some other internal differences (for example, LPDDR2 support appears on Github). Just read Qualcomm's own product description:
http://www.qualcomm.com/products_services/chipsets/snapdragon.html
It's called "second generation" because of HSPA+, much better GPU, 45nm process, additional video codecs support, newer GPS, and some other bits and pieces. It's an overall better device. But if you count only the CPU area - it loses to Nexus. Same CPU, clocked lower. 8x55 is equal in CPU power.
If you're looking for the real next generation in power - look for 3rd generation devices, with dual core CPUs.
+1. I concur 100% with what he said.
Keep in mind that pure clock speed does not mean something is faster... the 45nm die shrink also means they increased efficiency in a lot of areas and allowed for more cache on the die...
Think of it this way: I built a dual-core PC back in 2006 that ran at 2.8GHz on roughly 90nm tech. If I bought a new dual core today at the same speed but on 45nm, it would blow the old processor out of the water.
I really doubt dual-core processors in phones will make the huge leap everyone is expecting. I mean, how often do you run 4-5 apps simultaneously that are all very stressful on the CPU? The two most stressful things you probably do on your phone are watching a movie (decoding video is stressful) and playing a game on your PSX emulator. Do you ever watch a movie and play a game at the same time? Stupid question, right? Basic everyday performance is not going to see the huge improvements everyone expects.
If they want to improve phones, they should stick to single core and have a dedicated GPU, or go dual and dedicate one of the cores to graphics processing.
Oh, I forgot to mention: the only way you will see strong software performance improvements from dual core is if Google rewrites much of Android to make use of multiple cores. So while your phone might be dual core, your OS won't care, since it virtually cannot use it correctly. Better pray the manufacturer updates the OS for you, because the N1 is single core, and guess who's getting all the updates for the next year or so?
Pure clock speed on exactly the same CPU is directly correlated with CPU speed. Yes, there are some things that affect benchmarks, like memory bandwidth, but we're not talking about those - and even if we were, the difference still wouldn't close the gap. 65nm vs 45nm means NOTHING by itself - what matters is not the process the CPU was built on, but how it functions. We're talking about EXACTLY THE SAME CPU, can you keep that in mind, please? Thanks. CPU cache almost doesn't matter, since L1 is limited anyway and L2 is already big enough; the increases add a bare couple of percent to CPU speed, which is nothing compared to a 20% speed loss from the lower clock.
Thanks for your smart suggestions on "improving phones". I guess you might be one of the VPs at Qualcomm. Or maybe you aren't. I'll skip your even smarter comments about a "dedicated GPU" etc. - you probably need to google the word "SoC" first and see what it means.
And you should probably educate yourself about multi-threaded applications, and also remember that the Linux kernel (which Android runs on) is built to support multiple cores, and the Dalvik VM (which runs the apps) might very well be multi-threaded too.
Adding a second core under a load-balancing OS yields roughly a 35-40% performance increase (it depends on the workload). And ironically, when you compare "your old 90nm core" with "newer 45nm cores" and say the newer ones clocked similarly "would blow the old out of the water", you're actually comparing multi-core vs single-core CPUs (with some internal speed-ups too, but the most significant performance boost comes from the additional cores).
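The ~35-40% figure is roughly what Amdahl's law predicts if a bit over half of a typical workload parallelizes. A quick sketch (the 55% parallel fraction is an assumed number for illustration, not a measurement):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = parallel fraction of the workload, n = number of cores.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Assumed: ~55% of a phone workload can use the second core.
gain = (speedup(0.55, 2) - 1) * 100
print(f"second core gains about {gain:.0f}%")  # prints "second core gains about 38%"
```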
Jack_R1 said:
65nm vs 45nm means NOTHING
Correct me if I'm wrong, but won't the 45nm process at least have better efficiency due to smaller gates?
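For intuition on why smaller gates help: CMOS dynamic power scales roughly as P = C·V²·f, and a die shrink lowers both the switched capacitance C and the supply voltage V. The numbers below are purely illustrative assumptions, not real 65nm/45nm figures:

```python
# Dynamic power P = C * V^2 * f, in relative units.
def dyn_power(c, v, f):
    return c * v * v * f

old = dyn_power(c=1.0, v=1.2, f=1000)  # hypothetical 65nm part at 1 GHz
new = dyn_power(c=0.7, v=1.1, f=1000)  # hypothetical 45nm shrink, same clock
print(f"45nm part draws {new / old:.0%} of the 65nm power at the same clock")
```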
Related
Hi everyone
For quite a long time I've been thinking about the whole "Galaxy S can do 90 Mpolys per second" thing.
It sounds like total bull****.
So, after many, many hours of googling, and some unanswered mails to ImgTec, I'd like to know:
Can ANYONE provide any concrete info about the SGX540?
On one side I see declarations that the SGX540 can do 90 million polygons per second, and on the other I see stuff like "twice the performance of SGX530".
...but twice the performance of the SGX530 is EXACTLY what the SGX535 has.
So is the 540 a rebrand of the 535? That can't be, so WHAT THE HELL is going on?
I'm seriously confused, and would be glad if anyone could shed light on the matter.
I asked a Samsung rep what the difference was and this is what I got:
Q: The Samsung Galaxy S uses the SGX540 vs the iPhone using the SGX535. The only data I can find suggests these two GPUs are very similar. Could you please highlight some of the differences between the SGX535 and the SGX540?
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I also tried getting in contact with ImgTec to find out an answer, but I haven't received a reply back. It's been two weeks now.
Also, the chip is obviously faster than Snapdragon with its Adreno 200 GPU. I don't know if Adreno supports TBDR; I just know it's a modified Xenos core. Also, the Galaxy S uses LPDDR2 RAM, so throughput is quite a bit faster, even though it's not *as* necessary with all the memory efficiencies between the Cortex A8 and TBDR on the SGX540.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I think that is the clue: cost saving for Samsung.
Besides, who will need a 2D accelerator with a CPU as fast as this one already is?
The HTC Athena (HTC Advantage) failed miserably at adding the ATI 2D accelerator, which no programmers were able to take advantage of; in the end the CPU did all the work.
I'd imagine it's a 535 at 45nm. Just a guess; the CPU is also 45nm.
Having tried a few phones, the speed in games is far better - much better FPS - though there is a problem: we might have to wait for games that really test its power, as most are made to run on all phones.
This was the same problem with the Xbox and PS2. The Xbox had more power, but the PS2 was king, so games were made with its hardware in mind, which held back the Xbox; only now and then did an Xbox-only game come out that really made use of its power. Years later the positions swapped, and the 360 held the PS3 back (don't start on which is better, lol): the PS3 has to make do with 360 ports, but when it has a game made just for it you really get to see what it can do. Anyway, it's nice to know the Galaxy is future-proof game-wise, and I cannot wait to see what it can do in future or what someone can port onto it.
On a side note, I did read that videos run through the graphics chip, which is causing blocking in dark movies (not HD - lower-quality rips); something about it not reading the difference between shades of black. One guy found a way to turn the chip off and movies were all good; I guess the rest of us have to wait for firmware to sort this.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
smart move sammy
voodoochild2008-
I wouldn't say we'd have to wait so much.
Even today, Snapdragon devices don't do very well in games, since their fillrate is so low (133 Mpixels/sec).
Even the Motorola Droid (SGX530 at 110MHz - about 9 Mpolys/sec and 280 Mpixels/sec at that frequency) fares MUCH better in games, and actually runs pretty much everything.
So I guess the best hardware isn't being stressed yet, but weaker devices should be hitting the limit soon.
bl4ckdr4g00n - why the hell should we care? I don't see any problem with 2D content and/or videos; everything flies at light speed.
Well, I can live in hope, and I guess Apple's mess (aka the iPhone 4) will help, now that firms are heading more towards Android - I did read about one big firm in the USA dropping marketing for Apple and heading to Android. That's what you get when you try to sell old ideas. It always made me laugh that the first iPhone took 1MP photos when others were at 3MP, and had no video when most others did, and then they hyped it when it moved to a 3MP camera and did video... OK, I am going to stop, as it makes my blood boil that people buy into Apple. Yes, they started the ball rolling, and good on them for that, but then they just sat back and started to count the money as others moved on. Oh, and when I bought my Galaxy, the website did say "able to run games as powerful as the Xbox" (the old one), so is Halo too much to ask for? lol
Wait, so what about the Droid X vs the Galaxy S GPU? I know the Galaxy S is way more advanced spec-wise, but the Droid X does have a dedicated GPU - can anyone explain?
The Droid X still uses the SGX530, but unlike the original Droid it comes at the stock 200MHz (or at least 180).
At that clock it does 12-14 Mpolygons/sec and can push out 400-500 Mpixels/sec.
Not too shabby.
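Those figures are in the ballpark of simple linear clock scaling from the original Droid's numbers quoted earlier in the thread. A sketch (a first-order assumption only - real throughput also depends on memory bandwidth):

```python
# Scale the SGX530 figures quoted above (110 MHz: ~9 Mpoly/s, ~280 Mpix/s)
# linearly up to the Droid X's stock clock.
base_mhz, base_mpoly, base_mpix = 110, 9.0, 280.0
for target_mhz in (180, 200):
    s = target_mhz / base_mhz
    print(f"{target_mhz} MHz: ~{base_mpoly * s:.0f} Mpoly/s, ~{base_mpix * s:.0f} Mpix/s")
```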
The 535 is a downgrade from the 540; the 540 is the latest and greatest of the PowerVR line.
Samsung did not cost-cut - they in fact spent MORE to get this chip into their Galaxy S line. No one else has the 540 besides Samsung.
Like I said, it's probably just a process shrink, which means our GPU uses less power and is possibly clocked higher.
P.S. Desktop graphics cards haven't had dedicated 2D acceleration for years - removing it saves transistors for more 3D power!
This worries me as well... Seems like it might not be as great as we thought. HOWEVER, again, this is a new device, and it might be fixed in firmware updates. The hardware is obviously stellar; something is holding it back.
Pika007 said:
The Droid X still uses the SGX530, but unlike the original Droid it comes at the stock 200MHz (or at least 180).
At that clock it does 12-14 Mpolygons/sec and can push out 400-500 Mpixels/sec.
Not too shabby.
http://www.slashgear.com/droid-x-review-0793011/
"We benchmarked the DROID X using Quadrant, which measures processor, memory, I/O and 2D/3D graphics and combines them into a single numerical score. In Battery Saver mode, the DROID X scored 819, in Performance mode it scored 1,204, and in Smart mode it scored 963. In contrast, the Samsung Galaxy S running Android 2.1 – using Samsung’s own 1GHz Hummingbird CPU – scored 874, while a Google Nexus One running Android 2.2 – using Qualcomm’s 1GHz Snapdragon – scored 1,434. "
The N1's score can be explained by the fact that it's running 2.2...
But the Droid X, even with its "inferior" GPU, outscored the Galaxy S? Why?
gdfnr123 said:
Wait, so what about the Droid X vs the Galaxy S GPU? I know the Galaxy S is way more advanced spec-wise, but the Droid X does have a dedicated GPU - can anyone explain?
Same here. I want to know which one has the better performance as well.
Besides that, does anyone know which CPU is better between the Droid X and Galaxy S?
I know the OMAP chip in the original Droid could overclock to 1.2GHz from, what, 550MHz?
How about the CPUs in the Droid X and Galaxy S? Did anyone compare those chips? Which can overclock higher, and which is better overall?
Sorry about the poor English. Hope you guys can understand.
The CPU in the Droid X is a stock Cortex A8 running at 1GHz. The Samsung Hummingbird is a specialized version of the Cortex A8, reworked by Intrinsity, running at 1GHz.
Qualcomm went even further with a complete redesign of the Cortex A8 in the Snapdragon's CPU at 1GHz. While the original A8 could only be clocked at 600MHz with a reasonable power drain, the reworked versions of the A8 could be clocked higher while maintaining better power.
An untouched Cortex A8 can do more at the same frequency than a specialized, stripped-down A8.
If anything the Samsung Galaxy S is better balanced, leveraging the SGX 540 as a video decoder as well. However, the Droid X should be quite snappy in most uses.
At the end of the day. You really shouldn't care too much about obsolescence. I mean the Qualcomm Dual-core scorpion chip is probably going to be coming out around December.
Smart phones are moving at a blisteringly fast pace.
TexUs-
I wouldn't take it too seriously.
Quadrant isn't a very rigorous benchmark; plus, I think you can blame the fact that 2D acceleration on the SGS is done by the CPU, while the Droid X does 2D acceleration on the GPU.
I can assure you: there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say: let's wait for Froyo on all devices, let every device shake off its birth pains, and test again - with more than one benchmark.
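"More than one benchmark" combines naturally via a geometric mean of normalized scores, which keeps one outlier suite (say, an inflated Quadrant run) from dominating the result. A minimal sketch with made-up score ratios:

```python
import math

# Geometric mean of scores normalized against a baseline device.
def geo_mean(scores):
    return math.exp(sum(math.log(s) for s in scores) / len(scores))

# Hypothetical ratios vs. a baseline: one suite flatters the device 3x,
# another says it is 10% slower, a third says 20% faster.
print(round(geo_mean([3.0, 0.9, 1.2]), 2))  # prints 1.48
```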
Pika007 said:
TexUs-
I wouldn't take it too seriously.
Quadrant isn't a very rigorous benchmark; plus, I think you can blame the fact that 2D acceleration on the SGS is done by the CPU, while the Droid X does 2D acceleration on the GPU.
I can assure you: there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say: let's wait for Froyo on all devices, let every device shake off its birth pains, and test again - with more than one benchmark.
The SGS might be falling behind in I/O speeds... It is well known that all the app data is stored on a slower internal SD-card partition... Has anyone tried the benchmarks with the lag fix?
Also, if only Android made use of the GPU to help render the UI... It's such a shame that the GPU only gets used in games...
Using the GPU to render the UI would take tons of battery power.
I prefer it being a bit less snappy but a whole lot easier on the battery.
thephawx said:
At the end of the day. You really shouldn't care too much about obsolescence. I mean the Qualcomm Dual-core scorpion chip is probably going to be coming out around December.
Smart phones are moving at a blisteringly fast pace.
Smart phones aren't but batteries are.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2GHz devices sound nice on paper, but I worry whether battery technology can keep up.
TexUs said:
Smart phones aren't but batteries are.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2GHz devices sound nice on paper, but I worry whether battery technology can keep up.
I think so. The battery will be the biggest issue for smartphones in the future if it remains at 1500mAh or even less.
A dual-core CPU could be fast, but power-hungry as well.
I'm probably the only person on this planet that would ever download a 20.5-meg, 2426-page document titled "S5PC110 RISC Microprocessor User's Manual", but if there are other hardware freaks out there interested, here you go:
http://pdadb.net/index.php?m=repository&id=644&c=samsung_s5pc110_microprocessor_user_manual_1.00
As you may or may not know, the S5PC110, better known as Hummingbird, is the SoC (System on a Chip) that is the brain of your Epic. Now, when you have those moments when you really just gotta know the memory buffer size for your H.264 encoder or are dying to pore over a block diagram of your SGX540 GPU architecture, you can!
( Note: It does get a little bit dry at parts. Unless you're an ARM engineer, I suppose. )
Why aren't you working on porting CM6, or Gingerbread via CM7?? lol
now we can overclock the gpu
/sarcasm
cbusillo said:
Why aren't you working on porting CM6, or Gingerbread via CM7?? lol
Hah, because I know exactly squat about Android development. Hardware is more my thing, though if I find some spare time to play around with the Android SDK maybe that can change.
Sent from my SPH-D700 using XDA App
This actually is really exciting news. RISC architectures in general, and the ARM instruction set especially, are great, and honestly it would do the world a lot of good to kick the chains of x86.
Sent from my Nexus S with a keyboard
Interesting - the complete technical design of the Hummingbird chips.
After reading your blog about how Hummingbird got its extra performance, I still wonder at times: did we make the right choice in getting this phone, the Epic 4G (I bought one for $300 off contract and imported it to Canada), knowing that ARM Cortex A9 CPUs are coming in just a couple of months? We know that in the real world Hummingbird is more powerful than Snapdragon and the OMAP 3600 series, while benchmark scores tend not to reflect real-world performance.
Performance-wise: it's known that the out-of-order A9 parts are at least 30% faster clock-for-clock in real-world performance. There will be dual- and maybe quad-core implementations. What's really up in the air is the graphics performance of the A9 parts. There's now the PowerVR SGX545, the Mali 400, and the Tegra 2.
Edit: There is also the successor, the Mali T-604. I don't expect to see it in a phone in the near future, nor do I expect the Tegra 3. Maybe close to this time next year, though.
sauron0101 said:
Interesting - the complete technical design of the Hummingbird chips.
After reading your blog about how Hummingbird got its extra performance, I still wonder at times: did we make the right choice in getting this phone, the Epic 4G (I bought one for $300 off contract and imported it to Canada), knowing that ARM Cortex A9 CPUs are coming in just a couple of months? We know that in the real world Hummingbird is more powerful than Snapdragon and the OMAP 3600 series, while benchmark scores tend not to reflect real-world performance.
Performance-wise: it's known that the out-of-order A9 parts are at least 30% faster clock-for-clock in real-world performance. There will be dual- and maybe quad-core implementations. What's really up in the air is the graphics performance of the A9 parts. There's now the PowerVR SGX545, the Mali 400, and the Tegra 2.
Edit: There is also the successor, the Mali T-604. I don't expect to see it in a phone in the near future, nor do I expect the Tegra 3. Maybe close to this time next year, though.
You're always going to be playing catch-up. I personally think the Epic has great hardware for the time. I mean, on Samsung's roadmap for 2012/13 is their Aquila processor, a quad-core 1.2GHz part. It's going to be endless catch-up: every year there will be something that completely overshadows the rest.
gTen said:
You're always going to be playing catch-up. I personally think the Epic has great hardware for the time. I mean, on Samsung's roadmap for 2012/13 is their Aquila processor, a quad-core 1.2GHz part. It's going to be endless catch-up: every year there will be something that completely overshadows the rest.
No, but I mean, if you buy the latest technology when it's released, you'll be set for quite some time.
For example, if you were to buy one of the first Tegra 2 phones, it's unlikely that anything will beat it significantly until at least 2012, when the quad-core parts begin to emerge.
It takes a year or so from the time a CPU is announced until it gets deployed in a handset. For example, the Snapdragon was announced in late 2008 and the first phones (HD2) came about a year later. If you buy an A9 dual-core part early on, you should be set for some time.
Well, I got the Epic knowing Tegra 2 was coming in a few months with next-gen performance. I was badly in need of a new phone and the Epic, while not a Cortex A9, is no slouch.
Sent from my SPH-D700 using XDA App
sauron0101 said:
No, but I mean, if you buy the latest technology when it's released, you'll be set for quite some time.
For example, if you were to buy one of the first Tegra 2 phones, it's unlikely that anything will beat it significantly until at least 2012, when the quad-core parts begin to emerge.
That's relative. In terms of GPU performance our Hummingbird doesn't do so badly: the GPU TI chose to pair with the dual-core OMAP is effectively a PowerVR SGX540, and the Snapdragon rumored to be in the dual cores next summer is also on par with our GPU performance. So yes, we will lose out to newer hardware, which is to be expected, but I wouldn't consider it a slouch either.
It takes a year or so from the time a CPU is announced until it gets deployed in a handset. For example, the Snapdragon was announced in late 2008 and the first phones (HD2) came about a year later. If you buy an A9 dual-core part early on, you should be set for some time.
The first phone was the TG01. That said, I guarantee you that within a year, if not less, of the first Tegra release there will be a better processor out; it's bound to happen.
Edit: Some benchmarks for Tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I am not sure if it's using both cores or not... Also, Tegra 2 I think buffers at 16-bit, while Hummingbird buffers at 24-bit...
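The 16-bit vs 24-bit point is easy to put rough numbers on. A sketch, assuming a WVGA screen and the common convention of storing 24-bit color in 32 bits:

```python
w, h = 800, 480                     # assumed WVGA resolution
for bits in (16, 32):               # 24-bit color is usually stored in 32 bits
    kb = w * h * (bits // 8) / 1024
    print(f"{bits}-bit framebuffer: {kb:.0f} KiB")  # 750 KiB vs 1500 KiB
```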
gTen said:
That's relative. In terms of GPU performance our Hummingbird doesn't do so badly: the GPU TI chose to pair with the dual-core OMAP is effectively a PowerVR SGX540, and the Snapdragon rumored to be in the dual cores next summer is also on par with our GPU performance. So yes, we will lose out to newer hardware, which is to be expected, but I wouldn't consider it a slouch either.
The first phone was the TG01. That said, I guarantee you that within a year, if not less, of the first Tegra release there will be a better processor out; it's bound to happen.
Edit: Some benchmarks for Tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I am not sure if it's using both cores or not... Also, Tegra 2 I think buffers at 16-bit, while Hummingbird buffers at 24-bit...
AFAIK, dual-core support is only fully supported by Honeycomb. But if you feel like buying into NVIDIA's explanation of Tegra 2 performance, check this out: http://www.nvidia.com/content/PDF/t...-Multi-core-CPUs-in-Mobile-Devices_Ver1.2.pdf
Electrofreak said:
AFAIK, dual-core support is only fully supported by Honeycomb. But if you feel like buying into NVIDIA's explanation of Tegra 2 performance, check this out: http://www.nvidia.com/content/PDF/t...-Multi-core-CPUs-in-Mobile-Devices_Ver1.2.pdf
I see. I actually read before that Gingerbread would allow for dual-core support, but I guess that was delayed to Honeycomb...
Either way, this would mean that even if a Tegra-based phone comes out, it won't be able to utilize both cores until at least mid next year.
I can't open PDFs right now, but I read a whitepaper comparing Hummingbird and Tegra 2 performance on both single core and dual core. Is that the same one?
One thing, though: NVIDIA and ATI are quite known for tweaking their graphics cards to perform well on benchmarks. I hope it's not the same with their CPUs :/
gTen said:
Edit: Some benchmarks for Tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I am not sure if it's using both cores or not... Also, Tegra 2 I think buffers at 16-bit, while Hummingbird buffers at 24-bit...
Here are some additional benchmarks comparing the Galaxy Tab to the Viewsonic G Tablet:
http://www.anandtech.com/show/4062/samsung-galaxy-tab-the-anandtech-review/5
It's possible that the Tegra 2 isn't optimized yet. Not to mention, Honeycomb will be the release that makes the most of dual cores. Still, the graphics gains are lackluster; most of the improvement seems to be purely on the CPU side.
I'm not entirely sure that Neocore is representative of real-world performance either; it may have been optimized for some platforms. Furthermore, I would not be surprised if Neocore gave inflated scores for the Snapdragon and its Adreno graphics platform. Of course, Quadrant isn't representative either.
I think that real-world games like Quake III-based titles are the way to go, although until we see more graphically demanding games there is little to test (we're expecting more games for Android next year).
Finally, we've gotten to the point for web browsing where it's the data connection - HSPA+, LTE, or WiMAX - that dictates how fast pages load. It's like upgrading the CPU in a PC: I currently run an overclocked Q6600, and if I were to upgrade to, say, Sandy Bridge when it comes out next year, I wouldn't expect significant improvements in real-world browsing performance.
Eventually, the smartphone market will face the same problem the PC market does. Apart from us enthusiasts who enjoy benchmarking and overclocking, high-end gaming, and perhaps some specialized operations (like video encoding, which I do a bit of), you really don't need the latest and greatest CPU or 6+ GB of RAM (which many new desktops come with). Same with high-end GPUs. Storage follows the same dilemma: as storage grows, I imagine I'll store FLAC music files instead of AAC, MP3, or OGG, plus more video, and use my cell phone to replace my USB key drive. Otherwise, there's no need for bigger storage.
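The page-load point is just Amdahl's law with the network as the serial part. A toy calculation with assumed timings (the 2.0s/0.4s split is illustrative, not measured):

```python
# Assumed: a page spends 2.0 s on the radio and 0.4 s in CPU work.
t_net, t_cpu = 2.0, 0.4
total_before = t_net + t_cpu
total_after = t_net + t_cpu / 2     # hypothetical CPU twice as fast
print(f"doubling CPU speed: {total_before / total_after:.2f}x faster page loads")
```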
gTen said:
I see. I actually read before that Gingerbread would allow for dual-core support, but I guess that was delayed to Honeycomb...
Either way, this would mean that even if a Tegra-based phone comes out, it won't be able to utilize both cores until at least mid next year.
I can't open PDFs right now, but I read a whitepaper comparing Hummingbird and Tegra 2 performance on both single core and dual core. Is that the same one?
One thing, though: NVIDIA and ATI are quite known for tweaking their graphics cards to perform well on benchmarks. I hope it's not the same with their CPUs :/
Gingerbread doesn't have any dual-core optimizations. It has some JIT improvements in addition to some other minor enhancements, but according to rumor, Honeycomb is where it's at, and it's why the major tablet manufacturers are holding off releasing their Tegra 2 tablets until it's released.
And yeah, that paper shows the performance of several different Cortex A8s (including Hummingbird) compared to Tegra 2, and then goes on to compare Tegra 2 single-core performance vs dual.
Electrofreak said:
Gingerbread doesn't have any dual-core optimizations. It has some JIT improvements in addition to some other minor enhancements, but according to rumor, Honeycomb is where it's at, and it's why the major tablet manufacturers are holding off releasing their Tegra 2 tablets until it's released.
And yeah, that paper shows the performance of several different Cortex A8s (including Hummingbird) compared to Tegra 2, and then goes on to compare Tegra 2 single-core performance vs dual.
I looked at:
http://androidandme.com/2010/11/new...u-will-want-to-buy-a-dual-core-mobile-device/
since I can't access the PDF. Does the whitepaper state what Android version they used for their tests? For example, if they used 2.1 on the SGS and a newer build elsewhere, it wouldn't exactly be a fair comparison. Do they also give actual FPS, not just percentages? We are capped on FPS, for example.
Lastly, does the test say whether the Tegra 2 was dithering at 16-bit or 24-bit?
gTen said:
I looked at:
http://androidandme.com/2010/11/new...u-will-want-to-buy-a-dual-core-mobile-device/
I'm one of Taylor's (unofficial) tech consultants, and I spoke with him regarding that article. Though, credit where it's due to Taylor, he's been digging stuff up recently that I don't have a clue about. We've talked about Honeycomb and dual-core tablets, and since Honeycomb will be the first release of Android to support tablets officially, and since Motorola seems to be holding back the release of its Tegra 2 tablet until Honeycomb (quickly checks AndroidAndMe to make sure I haven't said anything Taylor hasn't already said), and rumors say that Honeycomb will have dual-core support, it all makes sense.
But yes, the whitepaper is the one he used to base that article on.
gTen said:
since I can't access the PDF. Does the whitepaper state what Android version they used for their tests? For example, if they used 2.1 on the SGS and a newer build elsewhere, it wouldn't exactly be a fair comparison. Do they also give actual FPS, not just percentages? We are capped on FPS, for example.
Lastly, does the test say whether the Tegra 2 was dithering at 16-bit or 24-bit?
Android 2.2 was used in all of their tests according to the footnotes in the document. While I believe that Android 2.2 is capable of using both cores simultaneously, I don't believe it is capable of threading them separately. But that's just my theory. I'm just going off of what the Gingerbread documentation from Google says; and unfortunately there is no mention of improved multi-core processor support in Gingerbread.
http://developer.android.com/sdk/android-2.3-highlights.html
As for FPS and the dithering... they don't really go there; the whitepaper is clearly focused on CPU performance, and so it features benchmark scores and timed results. I take it all with a pinch of salt anyhow; despite the graphs and such, it's still basically an NVIDIA advertisement.
That said, Taylor has been to one of their expos, or whatever you call them, and he's convinced that the Tegra 2 GPU will perform several times better than the SGX 540 in the Galaxy S phones. I'm not so sure I'm convinced... I've seen comparable performance benchmarks come from the LG Tegra 2 phone, but Taylor claims that was an early build and he's seen even better performance. Time will tell, I suppose...
EDIT - As for not being able to access the .pdfs, what are you talking about?! XDA app / browser and Adobe Reader!
So now that the latest beta of the Siyah kernel supports enabling/disabling of the 2nd core, and Tegrak has already released an app for it, I just want to know the possible effects on performance/battery when you use the different options in the 2nd Core app, especially the single-core option. What will happen to our phone when we run HD games? I'm sure it will extend battery life; I'm just not sure how the phone will behave with only one core running, and whether it will be bad for the phone to run on a single core.
And also, am I right to assume that our phone uses the "dynamic hotplug" option by default?
You shouldn't see much of a decrease in performance. The SGS has a single core, yet its CPU can still handle anything thrown at it. Point being, there is nothing out there that demands dual-core performance. On another note, HD games are not actually HD; it's just an advertising point for game developers.
$1 gets you a reply
Using one core instead won't break your CPU. It's going to make your phone run cooler (one core running produces less heat, and the heat sink is sized for the dual core) and obviously give better battery life. It will, obviously, also slow down your phone, but how much speed you lose is to be determined. You might want to test it out to see whether it gets laggy or simply sucks. As already said, the SGS I has a 1 GHz processor and can handle most of the top recent content available, so with a 1.2 GHz single core you should be able to handle everything available, especially with an optimized kernel like Siyah. And you are right, the default mode is dynamic hotplug, which uses both cores when needed and turns core 1 (the 2nd core) off when not needed.
I tried playing with it a little. The overall smoothness doesn't change, and I get about the same FPS in NenaMark2. The only game I saw stuttering a little more in single mode was Shadowgun; the others are just the same. I also have the feeling that CPU noise is reduced while playing music through headsets when you run on a single core.
I like the idea of switching off one core. But using only one core leads to a higher load on that core. Won't this result in higher frequencies and thus higher battery consumption?
So might using only one core actually be worse for battery life?
I mean, isn't that the reason to use multiple cores, so that no single core has to run at high frequencies? I think I once read that the power a CPU uses is proportional to the frequency squared, so it is not a linear relation. That would mean two cores at 500 MHz use less power than one core at 1000 MHz. Can someone confirm that? So if the OS is optimized for multiple cores, energy consumption will be lower.
What do you think or know about Android? Is it managing the two cores intelligently and thus reducing energy consumption, or are we better off switching one core off?
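The squared-frequency intuition above can be sketched with the classic CMOS dynamic-power model, P = C · V² · f. Assuming (simplistically) that voltage scales linearly with frequency, as DVFS governors roughly arrange, two cores at 500 MHz draw about a quarter of the power of one core at 1000 MHz for the same total throughput. The constants here are illustrative, not measured values for any real SoC:

```python
def dynamic_power(freq_mhz: float, v_at_1000: float = 1.2,
                  capacitance: float = 1.0) -> float:
    # Classic CMOS dynamic-power model: P = C * V^2 * f,
    # with voltage assumed to scale linearly with frequency.
    volts = v_at_1000 * (freq_mhz / 1000.0)
    return capacitance * volts ** 2 * freq_mhz

one_core_fast = dynamic_power(1000)       # 1 core @ 1000 MHz
two_cores_slow = 2 * dynamic_power(500)   # 2 cores @ 500 MHz
# Under this model, the dual-core split needs a quarter of the power.
```

Real chips add static leakage and per-core fixed costs, which is exactly why hotplugging an idle core off can still win: a powered-but-idle core leaks even at 0% load.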
Hi,
Is there anybody out there who can share any experiences with this 2nd Core app?
It would be very interesting to know whether it really saves battery (and if so, is it barely noticeable or a huge difference)? Are there any negative effects on speed or stability?
Rgds
I don't particularly care about potential battery saving, but I use it to manually disable one core while playing games which have problems with SoundPool ( see http://code.google.com/p/android/issues/detail?id=17623 ), such as Galcon, as this mitigates the problems.
Schindler33 said:
I like the idea of switching off one core. But using only one core leads to a higher load on that core. Won't this result in higher frequencies and thus higher battery consumption?
So might using only one core actually be worse for battery life?
I mean, isn't that the reason to use multiple cores, so that no single core has to run at high frequencies? I think I once read that the power a CPU uses is proportional to the frequency squared, so it is not a linear relation. That would mean two cores at 500 MHz use less power than one core at 1000 MHz. Can someone confirm that? So if the OS is optimized for multiple cores, energy consumption will be lower.
What do you think or know about Android? Is it managing the two cores intelligently and thus reducing energy consumption, or are we better off switching one core off?
Click to expand...
Click to collapse
Totally agree.
Hi,
I was wondering: are the 2 CPU cores working together simultaneously, or is it just 1? I'm using FLEXREAPER X10 ICS 4.0.3. Sometimes I get screen glitches when my tab is trying to sleep and I touch the screen. Also, when I run a benchmark it only reports CPU1's processing speed, etc. And when I'm browsing the Play Store the screen animation lags a bit... Can someone enlighten me, or is there an app for this that can force the 2 cores to work together all the time?
Yes, both cores are enabled at all times. But no, you cannot make an application use both cores unless the application was designed to do so.
FLEXREAPER X10 ICS 4.0.3 is based on a leaked ICS ROM, not a stable ROM, so it has some problems.
Your benchmark is correct.
There are NOT 2 CPU's. There is only one CPU, with 2 cores. It doesn't process two applications at once, but it CAN process two threads of the same application at the same time. Think of it like this: two CPUs would be two people writing on different pieces of paper. A single CPU with two cores is one person writing with both hands at the same time; he can only write on the same piece of paper, but it's faster than it would be if he were writing with only one hand.
Note: this is not related to multitasking. Multitasking works by processing a little bit of each app at a time, so although it may seem that both are running at the same time, they are not.
Most apps are not designed to work with threads, though, so there's your (actually, our) problem. But this is not an A500 problem; it applies to any multi-core device out there (including desktops).
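The point that an app must be written with threads before a second core can help it can be sketched as below. This is an illustrative Python sketch with hypothetical helper names; note that in CPython the GIL limits CPU-bound thread parallelism (real parallelism there needs processes, whereas Java threads on Android parallelize directly), but the structure an app needs is the same either way:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

def threaded_sum(n: int, workers: int = 2) -> int:
    # Split [0, n) into one chunk per worker thread. The OS scheduler
    # is free to place each runnable thread on a different core --
    # which is exactly what a single-threaded app never gives it.
    step = n // workers
    chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))
```

An app that instead calls `partial_sum((0, n))` once does the same arithmetic but can only ever occupy one core, no matter how many the chip has.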
danc135 said:
There are NOT 2 CPU's. There is only one CPU, with 2 cores
Click to expand...
Click to collapse
Essentially true, but...
It doesn't process two applications at once
Click to expand...
Click to collapse
False. Two cores are just two CPUs on the same die.
Thanks for the response, guys... I'm getting a bit confused by this "multi-core processor" thing... I was expecting it to be fast with no lag when browsing apps in my library, switching applications, even browsing the Play Store. So is it correct to say that a multi-core processor is a bit of a waste if an app can't use all of its cores' potential? And likewise if the UI of the OS can't use all cores at the same time?
Dual Core, Dual CPU....
Not entirely, because if the kernel is capable of multi-threading, then it can use one core to run services while another is running the main application. The UI is only another application running on top of the kernel...
The only difference between a dual-core Intel CPU and a dual-core Tegra 2 is the instruction set and basic capabilities; otherwise they can be thought of as essentially the same animal. The kernel, which is the core of the OS, handles the multitasking, but Android has limited multitasking capabilities for applications. Even so, services that run in the background are less of a hindrance on a dual-core CPU than on a single-core one, and more and more applications are being written to take advantage of multiple cores.
Just have a bunch of widgets running on your UI and you are looking at multitasking and multi-threading, which are both better on multi-core processors.
A multi-core CPU is nothing more than multiple processors stacked on one die. They thread and load-balance through software: applications MUST BE AWARE of multi-core CPUs to take advantage of the extra cores.
A multi-processor computer has a third chip on the main board that balances the load in hardware. This doesn't add overhead on the processors, whereas a multi-core chip carries a much higher load-balancing overhead.
So many people confuse the two, due to how the companies market multi-core devices.
So an application that cannot thread itself will run on one of the cores; a threaded app can... well, guess.
A dual-processor computer can run even a non-thread-aware app or program on two cores.
It's quite simply complicated...
You can throw all the hardware you want at a system. In the end, if the software sucks (not multi-threaded, poorly optimized, bad at resource management, etc.), it's still going to perform badly.
Dual core doesn't mean it can run one application at twice the speed; it means it can run two applications at full speed, even if they're not threaded. Android is largely meant to run one application in the foreground, and since they can't magically make every application multi-threaded, you won't see the benefits of multiple cores as much as you would on a more traditional platform.
Also, a dual-core Tegra 2 is good, but only in comparison to other ARM processors (and even then, it's starting to show its age). It's going to perform poorly compared to a full x86 computer, even an older one.
netham45 said:
You can throw all the hardware you want at a system. In the end, if the software sucks (not multi-threaded, poorly optimized, bad at resource management, etc.), it's still going to perform badly.
Dual core doesn't mean it can run one application at twice the speed; it means it can run two applications at full speed, even if they're not threaded. Android is largely meant to run one application in the foreground, and since they can't magically make every application multi-threaded, you won't see the benefits of multiple cores as much as you would on a more traditional platform.
Also, a dual-core Tegra 2 is good, but only in comparison to other ARM processors (and even then, it's starting to show its age). It's going to perform poorly compared to a full x86 computer, even an older one.
Click to expand...
Click to collapse
This is so true, with the exception of a TRUE server dual- or quad-processor computer, where a special on-board chip threads application calls to balance the load for non-threaded programs and games. My first dual-processor computer was an AMD MP 3000+, back when dual-CPU computers started to come within consumer price ranges. Most applications/programs did not multi-thread.
And yes, as far as computer speed and performance go, you will not gain any from this; you will only feel less lag when running several programs at once. A 2.8 GHz dual-processor computer still runs at 2.8 GHz, not double that.
erica_renee said:
With the exception of a TRUE server dual- or quad-processor computer, where a special on-board chip threads application calls to balance the load for non-threaded programs and games.
Click to expand...
Click to collapse
Actually, this is incorrect. All such decisions are left to the OS's own scheduler, for multiple reasons: the CPU cannot know what kinds of tasks it is going to run, what should be given priority under which conditions, and so on. On a desktop PC, for example, interactive, user-oriented, in-focus applications and tasks are usually given more priority than background tasks, whereas on a server one either gives all tasks similar priority or handles task priorities based on task grouping. Not to mention real-time operating systems, which have entirely different requirements altogether.
If it were left to the CPU, the performance gains would be terribly limited and could not be adjusted for different kinds of tasks, let alone different operating systems.
(Not that anyone cares, I just thought I'd pop in and rant a little...)
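To make the "it's the OS scheduler, not the CPU" point concrete: on Linux (which Android runs on) the scheduler decides thread placement, and all a process can do is narrow the scheduler's options via the affinity API. A sketch under those assumptions (Linux-only; the helper names are mine):

```python
import os

def pin_to_core(core: int) -> set:
    # The kernel scheduler normally places tasks itself; a task can
    # only restrict which cores it is *allowed* to run on.
    previous = os.sched_getaffinity(0)   # 0 = the calling process
    os.sched_setaffinity(0, {core})
    return previous

def unpin(previous: set) -> None:
    # Hand the full core set back to the scheduler.
    os.sched_setaffinity(0, previous)
```

Even with affinity pinned, *which* instructions run when, and how priorities interact, is still entirely the scheduler's call, which is exactly why a hardware load-balancer chip couldn't make these policy decisions.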
Self correction
I said a multi-core processor only runs threads from the same process. That is wrong (thanks to my Computer Architecture professor for misleading me): it can run multiple threads from different processes, which constitutes true parallel processing. It's just better to stick with threads from the same process because of shared memory within the processor: every core has its own level 1 cache, plus a shared, on-die level 2 cache.
It all depends on the OS scheduler, really.
With ICS (and future Android versions), I hope the scheduler will improve to get the best out of multiple cores.
In the end, though, it won't matter if applications aren't multi-threaded (which is much harder to code). What I mean is, performance will be better, but not as much better as it could be if developers used a lot of multi-threading.
To answer hatyrei's question: yes, it is a waste, in the sense that it has too much untapped potential.
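The "better, but not as much better as it could be" point is exactly Amdahl's law: the serial fraction of an app caps the speedup no matter how many cores you add. A quick sketch (the example fractions are illustrative, not measurements of any real app):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    # Amdahl's law: overall speedup when only `parallel_fraction`
    # of the work can be spread across `cores`.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A perfectly parallel app doubles on 2 cores; an app that is only
# 30% parallel gains barely 18% from the second core.
perfect = amdahl_speedup(1.0, 2)
mostly_serial = amdahl_speedup(0.3, 2)
```

This is why a mostly single-threaded UI feels about the same on one core or two, while a well-threaded workload scales visibly.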
Hi,
I'm pretty curious why all the current Android Wear devices seem to have such powerful hardware built in.
As far as I can tell, almost all the processing is done on the phone, so the SoC should not need to be so fast and power hungry.
Any ideas on why this is?
My Pebble has about an 80 MHz single-core processor (if I read that correctly) and can do many of the things the Android Wear devices can. Of course this is apples and oranges, but I think that even with a touchscreen and everything, the processing power is unnecessarily overpowered...
Thanks.
Well, the Moto 360 has a less powerful CPU. I think the reason is that these companies don't have the ability to design their own custom chips (other than maybe Samsung, who perhaps just hasn't had time yet), so they need to use off-the-shelf chips that already have the drivers and kernel support to run Android.
Older processors (like what's in the moto 360) are larger and more power hungry. Newer SoCs like the Snapdragon 400 used in the G Watch and Gear Live have higher-clocked, more powerful cores, but are manufactured with a smaller 28nm process. Smaller means more performance-per-watt. They disable all of the cores except one, which decreases power draw even more. Underclocking the one remaining core then saves even more power, all the while still performing even better than the old chip.
I seriously think Motorola just had a truck load of those TI processors sitting in a warehouse somewhere and was trying to figure out a way to make some money off them. Here's hoping they get rid of them all before the next hardware revision.
CommanderROR said:
Hi,
I'm pretty curious why all the current Android Wear devices seem to have such powerful hardware built in.
As far as I can tell, almost all the processing is done on the phone, so the SoC should not need to be so fast and power hungry.
Any ideas on why this is?
My Pebble has about an 80 MHz single-core processor (if I read that correctly) and can do many of the things the Android Wear devices can. Of course this is apples and oranges, but I think that even with a touchscreen and everything, the processing power is unnecessarily overpowered...
Thanks.
Click to expand...
Click to collapse
Running a lower frequency on a powerful CPU is more efficient than running a higher frequency on a less powerful CPU.
As another member posted, there is perhaps easier access to the current stockpile of CPUs, which is cheaper than designing a new CPU or ordering an out-of-production one (which is costlier).
Or we need to consider the possibility that it gives app developers more room to play with, without the CPU being a limiting factor.
gtg465x said:
Well, Moto 360 has a less powerful CPU. I think the reason is because these companies don't have the ability to design their own custom chips, other than maybe Samsung (who maybe just hasn't had time yet), so they need to use off the shelf chips that already have the drivers and kernels to run Android.
Click to expand...
Click to collapse
The processors in the other watches are not custom chips; Motorola just decided they would rather save $10 building the Moto 360 than let users have a watch that is more responsive and better on battery life.
johnus said:
Older processors (like what's in the moto 360) are larger and more power hungry. Newer SoCs like the Snapdragon 400 used in the G Watch and Gear Live have higher-clocked, more powerful cores, but are manufactured with a smaller 28nm process. Smaller means more performance-per-watt. They disable all of the cores except one, which decreases power draw even more. Underclocking the one remaining core then saves even more power, all the while still performing even better than the old chip.
I seriously think Motorola just had a truck load of those TI processors sitting in a warehouse somewhere and was trying to figure out a way to make some money off them. Here's hoping they get rid of them all before the next hardware revision.
Click to expand...
Click to collapse
This is the best reply thus far. The only other thing I would add is that the older processor has already been shown to lower battery life in a smartwatch. This article is a great example: http://arstechnica.com/gadgets/2014/09/moto-360-review-beautiful-outside-ugly-inside/2/
You'll see there that the Moto 360 has similar overall performance but lower battery life in the standardized tests he was able to create. This also takes into account his "screen-off" battery tests, leading the reviewer to believe the SoC was the culprit.
Thanks.