[Q] CPU-Z identifies CPU as ARM instead of Samsung... - Nexus 10 Q&A, Help & Troubleshooting

Hi,
I just downloaded CPU-Z from Google Play here:
https://play.google.com/store/apps/details?id=com.cpuid.cpu_z
But I'm surprised that it identifies the Nexus 10's CPU manufacturer as ARM instead of Samsung. In fact, the CPU implementer code detected is 0x41, which is ARM, not Samsung...
Please, can any owner of another Nexus 10 run CPU-Z, so we can tell whether this is a general issue or something specific to my processor?
Thanks and best regards.
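For anyone who wants to check without installing CPU-Z: on ARM Linux the kernel exposes this in /proc/cpuinfo, and 0x41 is the value of the "CPU implementer" field. Here is a minimal Python sketch of the lookup, using a hard-coded sample so it runs anywhere; the helper and sample text are illustrative, not CPU-Z's actual code.

```python
# Sketch: decoding the "CPU implementer" field that CPU-Z reads on ARM Linux.
# The implementer code identifies who designed the core, not who fabbed the chip,
# which is why a Samsung-built Exynos with a licensed Cortex core reports ARM (0x41).
IMPLEMENTERS = {
    0x41: "ARM",       # ARM-designed core (Cortex family)
    0x51: "Qualcomm",  # custom cores such as Scorpion/Krait
    0x53: "Samsung",   # Samsung-designed cores
}

# Hypothetical /proc/cpuinfo excerpt; part 0xc0f is a Cortex-A15.
SAMPLE_CPUINFO = """\
Processor\t: ARMv7 Processor rev 4 (v7l)
CPU implementer\t: 0x41
CPU part\t: 0xc0f
"""

def decode_implementer(cpuinfo_text):
    """Return the designer name for the 'CPU implementer' line, if present."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("CPU implementer"):
            return IMPLEMENTERS.get(int(line.split(":")[1], 16), "unknown")
    return "unknown"

print(decode_implementer(SAMPLE_CPUINFO))  # ARM
```

On a real device you would read the text from /proc/cpuinfo instead of the sample string.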

Well it is an ARM core design, and the GPU is a straight ARM design as well.

EniGmA1987 said:
Well it is an ARM core design, and the GPU is a straight ARM design as well.
I know, but I have a serious overheating problem with my Nexus 10. Maybe it could be avoided if different processors were used across Nexus 10 builds. That's why I'm asking if someone can check on their machine, to find out whether that's possible.
Thanks and best regards.

No one has a different processor, and the overheating affects everyone. Pretty much anything you do will hit thermal throttling, which is why everyone should run a custom kernel, since that tweaks the way throttling works.
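As a rough illustration of what a throttling tweak changes, here is a toy Python model of step-down thermal throttling; the frequency table and trip points are hypothetical, not the actual Nexus 10 values.

```python
# Toy model of step-down thermal throttling: each time the reported temperature
# crosses a trip point, the governor drops to the next lower frequency step.
# Custom kernels typically raise the trip points or narrow the frequency steps.
FREQ_STEPS_MHZ = [1700, 1400, 1100, 800]   # hypothetical step table
TRIP_POINTS_C  = [70, 80, 90]              # hypothetical trip temperatures

def throttled_freq(temp_c):
    """Count how many trip points the temperature has crossed, pick that step."""
    step = sum(1 for trip in TRIP_POINTS_C if temp_c >= trip)
    return FREQ_STEPS_MHZ[step]

print(throttled_freq(65))   # 1700 (cool: full speed)
print(throttled_freq(85))   # 1100 (two trips crossed)
```

A kernel that moved the 70 °C trip up to 80 °C would keep the top frequency available longer, at the cost of running hotter.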

EniGmA1987 said:
No one has a different processor
That's exactly what I want to confirm. Sometimes internal parts were changed/replaced due to lack of stock (and it was a common situation for the nexus 10).
Did you run CPU-Z to check the processor?
Thanks and best regards.

VivaErBetis said:
That's exactly what I want to confirm. Sometimes internal parts were changed/replaced due to lack of stock (and it was a common situation for the nexus 10).
Did you run CPU-Z to check the processor?
Thanks and best regards.
It is ARM, mostly. Mali, the GPU portion of the chip, is from ARM.
BTW, there were too few Nexus 10 devices sold to have a part shortage. Even the Surface RT outsold the Nexus 10.

VivaErBetis said:
That's exactly what I want to confirm. Sometimes internal parts were changed/replaced due to lack of stock (and it was a common situation for the nexus 10).
Why do you think some Nexus 10s have a different processor? I highly doubt they put in anything other than the Exynos 5 Dual, which is the only CPU listed on the official specs page. More likely, CPU-Z simply doesn't have a complete database of all the ARM processors out there, particularly since the Nexus 10 is the first Android device with a Cortex-A15 CPU and there still aren't that many A15 chips around.

joakim_one said:
Why do you think some Nexus 10s have a different processor? I highly doubt they put in anything other than the Exynos 5 Dual, which is the only CPU listed on the official specs page. More likely, CPU-Z simply doesn't have a complete database of all the ARM processors out there, particularly since the Nexus 10 is the first Android device with a Cortex-A15 CPU and there still aren't that many A15 chips around.
Note the "Field Name" is CPU Architecture, not CPU. This is an important distinction, because CPU-Z is identifying the CPU "type" not the exact make/model from that manufacturer.
Samsung's Exynos CPU used in the N10 was advertised as using the latest "licensed Cortex ARM" design of A15. For more detail see http://www.arm.com/products/processors/cortex-a/index.php for a list of the various ARM-A designs.
ARM doesn't necessarily produce processors itself; it develops the IP and licenses it to the various CPU manufacturers that want to create ARM-type CPUs. Even though Krait is a modified design, Qualcomm still pays for the IP so that Krait's instruction set stays compatible with the competition.
So with CPU-Z, you'll see "ARM Cortex-A15" for our N10's and "Krait" on my Samsung GS3.

SeaFractor said:
Note the "Field Name" is CPU Architecture, not CPU. This is an important distinction, because CPU-Z is identifying the CPU "type" not the exact make/model from that manufacturer.
Samsung's Exynos CPU used in the N10 was advertised as using the latest "licensed Cortex ARM" design of A15. For more detail see http://www.arm.com/products/processors/cortex-a/index.php for a list of the various ARM-A designs.
ARM doesn't necessarily produce processors itself; it develops the IP and licenses it to the various CPU manufacturers that want to create ARM-type CPUs. Even though Krait is a modified design, Qualcomm still pays for the IP so that Krait's instruction set stays compatible with the competition.
So with CPU-Z, you'll see "ARM Cortex-A15" for our N10's and "Krait" on my Samsung GS3.
View attachment 2071624
I'm well aware of how ARM licensing works. If you look at his screenshot, it doesn't identify his processor as a Samsung Exynos 5, but your screenshot shows a Qualcomm Snapdragon S4. That is probably because they don't have the Exynos 5 in their database yet.

Thanks for all the answers. I contacted the CPU-Z devs to ask about this issue, and they changed the SoC recognition in the new version published today.

Related

[OFFTOPIC] Ipad2 Dual Core CPU made by Samsung

Apple's A5 CPU in iPad 2 confirms manufacturing by Samsung
source: http://www.appleinsider.com/article...ipad_2_confirms_manufacturing_by_samsung.html
That was quite a funny thing to read over morning breakfast:
the iPad 2's dual-core CPUs are made by Samsung.
In a way, we can expect really good CPUs from Samsung for our next phone upgrade.
I wouldn't be surprised if the CPU used in the upcoming SGS2 is the same dual-core CPU as the one found in the iPad 2.
The same was the case with the iPhone 4, the original iPad, and the Samsung Galaxy S series of phones.
I'm actually kind of curious what kind of agreements the two have now. The A4/Hummingbird chip was originally created by Intrinsity and Samsung, then Apple acquired Intrinsity. I think they probably had shared IP the whole time and are continuing the relationship to bring the same basic chip design to both Apple and Samsung. The chips aren't identical, but they are pretty close. The CPU is the same, I believe, but since it's an SoC, the GPUs and other components aren't necessarily the same.
Is there any detailed information? I wonder if the iPad 2 uses Exynos...
d3sm0nd said:
Is there any detailed information? I wonder if the iPad 2 uses Exynos...
I doubt it. Exynos is the name of the SoC. They are likely using a similar Cortex A9 CPU, but the SoC is likely customized depending on the application. Apple would have had little reason to acquire Intrinsity if they were going to use Samsung's whole package. That's how the A4 and Hummingbird were.
To add a little further proof, Apple is said to be using the SGX543MP GPU in the A5, while we know that the Orion (Exynos 4210) SoC that the SGS 2 will be using is using the Mali 400 GPU.
I'm not sure what Apple's intentions are exactly. They may just be interested in customizing their packages to their specific needs, but get the major parts (CPU, GPU, etc) built by someone else, or they may be in a learning process to completely design their own chips in the future. They certainly have the money to do something like that, but I don't know that they have the interest.
At least that's how I see it all. If anyone else has further insight please let us know.
The SGX543MP4 (used in the sony NGP) is wayyyyyyy better than the mali 400, but you get what you get
Now, the interesting part about the PowerVR is that it is a true MIMD [Multiple Instruction, Multiple Data: http://en.wikipedia.org/wiki/MIMD ] architecture. In their press releases, ImgTech brags about the capabilities of the "GP-GPU", but even looking at the specifications with a cold head, a lot of surprises are in store. The multi-core design is available in dual-, quad-, octal- and sedec-core (16-core) variants [SGX543MP2, SGX543MP4, SGX543MP8, SGX543MP16], and they are by no means slouches.
For instance, a quad-core SGX543MP4 at only 200 MHz delivers 133 million polygons per second and offers a fill rate of four billion pixels per second [4 GPixel/s], in the range of GeForce 8600 cards. For that matter, 4 GPixel/s runs the 40nm GeForce GT210 [2.5 GPixel/s] into the ground, given that the GeForce GT210 runs at 589 MHz for the core and 1.4 GHz for the shaders. Since the PowerVR SGX543 targets handheld devices, there is no saying what the performance plateau is.
An eight-core SGX543MP8 at 200 MHz delivers 266 million polygons and eight billion pixels per second, while a faster-clocked version, for instance at 400 MHz, would deliver 532 million polygons and 16 billion pixels per second. 16 billion pixels per second equals a GeForce GTX 260-216, for instance.
After analyzing the performance at hand, it is no wonder that Sony chose to go with PowerVR for the next-generation PlayStation Portable. While the exact details of the SoC are still in question, our take is that Sony could go with a quad-core setup at 400 MHz [8 GPixel/s], paired with a dual-core CPU based on the ARM Cortex architecture. This would put Sony directly in line against the Tegra-powered Nintendo DS2, PowerVR-based Apple iPhone 4G and Palm Pre 2.
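The quoted fill-rate figures scale linearly with cores × clock. A quick Python check against the numbers above; the per-clock constant is inferred from the quoted MP4 figure, not from any official specification.

```python
# Checking the quoted fill-rate figures: they scale linearly with
# core count x clock. The baseline of 5 pixels/clock/core is implied
# by the quoted "MP4 @ 200 MHz = 4 GPixel/s" number.
PIXELS_PER_CLOCK_PER_CORE = 4e9 / (200e6 * 4)   # = 5.0

def fill_rate_gpixels(cores, mhz):
    """Fill rate in GPixel/s for a given core count and clock."""
    return cores * mhz * 1e6 * PIXELS_PER_CLOCK_PER_CORE / 1e9

print(fill_rate_gpixels(8, 200))   # 8.0  (matches the quoted MP8 figure)
print(fill_rate_gpixels(8, 400))   # 16.0 (matches the quoted 400 MHz figure)
```

The polygon figures scale the same way: 133M polygons/s for MP4 at 200 MHz doubles to 266M for MP8 and doubles again to 532M at 400 MHz, exactly as quoted.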
ryude said:
The SGX543MP4 (used in the sony NGP) is wayyyyyyy better than the mali 400, but you get what you get
The source of this information is what, exactly...?
martino2k6 said:
The source of this information is what, exactly...?
The mali 400 specs and performance figures have already been revealed, as well as the SGX543MP4. Benchmarks also favor the PowerVR.
Strange, so I guess that this disproves the other articles that have stated that Apple has had the Taiwanese company TSMC develop the chips for them.
Sent from my Nexus S
Carne_Asada_Fries said:
Strange, so I guess that this disproves the other articles that have stated that Apple has had the Taiwanese company TSMC develop the chips for them.
Sent from my Nexus S
The proof is solid and indeed disproves those other articles.
d3sm0nd said:
Is there any detailed information? I wonder if the iPad 2 uses Exynos...
The GPU is different in the iPad 2: it has the PowerVR SGX543MP2 (MP2 means 2 cores), according to AnandTech.
http://www.anandtech.com/Show/Index...rmance-explored-powervr-sgx543mp2-benchmarked
ryude said:
The mali 400 specs and performance figures have already been revealed, as well as the SGX543MP4. Benchmarks also favor the PowerVR.
iPad has the MP2 variant, which has two cores. The Mali-400 has 4 cores. I mean, this doesn't mean much but personally I think it's still in the air until someone does proper benchmarks with optimised drivers on a final release model.
martino2k6 said:
iPad has the MP2 variant, which has two cores. The Mali-400 has 4 cores. I mean, this doesn't mean much but personally I think it's still in the air until someone does proper benchmarks with optimised drivers on a final release model.
I'll definitely be interested, since I just got the iPad 2 and tentatively plan on getting the SGS2. The biggest thing about Android, though, is that it's so hard to get apps that actually utilize the GPU to its fullest extent. Apps don't get updated for one top-of-the-line phone when most phones can't handle it, so in that sense I think we'll see better performance out of the iPad 2. It'll be interesting to see if the Tegra games run on the SGS2 and if they are optimized enough to make good use of the GPU.
Wouldn't it be possible, with a jailbroken iPad, to dual-boot into Android, since the processor will match that of Samsung's mobiles? Doesn't the Chooser/firmware discrepancy usually disallow this? If that gap is now filled, it would seem doable.
Sent from my SAMSUNG-SGH-I897 using XDA App
crossfire2500 said:
Wouldn't it be possible, with a jailbroken iPad, to dual-boot into Android, since the processor will match that of Samsung's mobiles? Doesn't the Chooser/firmware discrepancy usually disallow this? If that gap is now filled, it would seem doable.
Sent from my SAMSUNG-SGH-I897 using XDA App
And why would you want to do that? People buy iDevices for the UX which iOS gives, mainly the multitude of apps and ease of use that it provides. Furthermore, Steve Jobs would chop your head off...
crossfire2500 said:
Wouldn't it be possible, with a jailbroken iPad, to dual-boot into Android, since the processor will match that of Samsung's mobiles? Doesn't the Chooser/firmware discrepancy usually disallow this? If that gap is now filled, it would seem doable.
Sent from my SAMSUNG-SGH-I897 using XDA App
The CPU is probably the easiest part. As long as it's an ARM CPU, you can compile support for it. It's the drivers for every other piece of hardware that would be the problem.

Qualcomm's Dual-core Processors for HTC

Is it true that Qualcomm's dual-core CPUs will be based on the older ARM Cortex-A8 architecture instead of the more modern Cortex-A9, which is used by Apple's A5 chip and Nvidia's Tegra 2?
Source:
http://smartphonebenchmarks.com/for...msm8660-12ghz-dual-core-snapdragon-processor/
The hardware benchmarks on the dual-core MSM8x60 1.2 GHz chip used by the HTC Pyramid (Sensation, Doubleshot) and the Evo 3D do not look good.
Source:
http://smartphonebenchmarks.com/forum/index.php?showtopic=258
I need a bit of clarification on why they didn't choose the Cortex-A9 path.
Ok so I just read this report from Qualcomm explaining this issue:
http://www.qualcomm.de/documents/files/linley-report-dual-core-snapdragon.pdf
Apparently their architecture is compatible with ARM's instruction set, and they claim it's better than the A9.
"The superscalar CPU uses a 13-stage pipeline to generate faster clock speeds than competing products can achieve using ARM’s Cortex-A8 or Cortex-A9"
Having said that, I'm still not sure why the hardware benchmarks are not near the Cortex-A9 dual-core processors.
The Adreno 220 is pretty good, though, compared to other GPUs.
mjehan said:
Apparently their architecture is compatible with ARM's instruction set, and they claim it's better than the A9.
Having said that, I'm still not sure why the hardware benchmarks are not near the Cortex-A9 dual-core processors.
Because benchmarks are meaningless and HTC has yet to put the work into fiddling them!
Qualcomm has been claiming that their design is better than ARM's Cortex-A8 for a while, but apart from a few special cases they are mostly equal at the same clock speed. Since the MSM8x60 is also based on the identical cores, I don't see how it could be better than the Cortex-A9. In fact, Qualcomm is working on their own "equivalent to A9" design right now.
FYI, the number of pipeline stages doesn't tell the whole story about CPU speed. If not implemented well, a deeper pipeline simply causes longer stall delays. We saw this in the old Pentium 4 architectures.
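The pipeline-depth trade-off can be put into rough numbers with a toy model: each mispredicted branch flushes roughly the whole pipeline, so the misprediction penalty grows with depth. All rates and depths below are illustrative assumptions, not measured data for any real chip.

```python
# Toy model: deeper pipeline allows a higher clock, but every branch
# misprediction costs roughly pipeline_depth cycles of flush.
def effective_mips(clock_mhz, pipeline_depth, branch_freq=0.2, miss_rate=0.1):
    """Instructions/us assuming ideal IPC of 1, minus branch-miss stalls."""
    stall_cycles_per_instr = branch_freq * miss_rate * pipeline_depth
    return clock_mhz / (1.0 + stall_cycles_per_instr)

shallow = effective_mips(1000, 8)    # shorter A8/A9-like pipeline at 1.0 GHz
deep    = effective_mips(1200, 13)   # 13-stage Scorpion-like pipeline at 1.2 GHz
print(round(shallow), round(deep))   # the 20% clock advantage shrinks to ~10%
```

With worse branch prediction or branchier code, the deeper pipeline's clock advantage can disappear entirely, which is the Pentium 4 story in miniature.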
I think the 128-bit FPU makes Scorpion equivalent to the A9 in floating-point calculations.
Sent via psychic transmission.

Galaxy S III Processor Information

Disclaimer:
I make no assertion of fact on any statement I make except where repeated from one of the official linked to documents. If it's in this thread and you can't find it in an official document, feel free to post your corrections complete with relevant link and the OP can be updated to reflect the most correct information. By no means am I the subject matter expert. I am simply a device nerd that loves to read and absorb information on such things and share them with you. The objective of this thread is to inform, not berate, dis-credit, or otherwise talk trash about someone else's choice. Take that to a PM or another thread please.
There is a LOT of misconception in the community over what hardware is the more capable kit. They are not the same. Therefore comparing them in such a way can be difficult at best. The Ti White Sheet speaks to the many aspects of attempting to do such a thing. It is no small undertaking. Therefore I ask you trust their data before my opinion. However, I felt it necessary to have something resembling a one-stop thread to go to when you are wondering about how the hardware differs between the two devices.
One thing you won't see me doing is using pointless synthetic benchmarks to justify a purchase or position. I use my device heavily but not for games. Web browsing, multiple email, etc......I use LTE heavily. A device can get a bajillion points in <insert your choice of synthetic benchmark> but that doesn't make me any more productive. Those into gaming will probably be more comfortable with the EU Exynos Quad, I'll just say that up front....it has a stronger GPU, but this isn't about that. It would be nice to keep this thread about the technology, not synthetic benchmark scores.
Dictionary of Terms (within thread scope):
SGSIII: Samsung Galaxy S 3 smartphone, variant notwithstanding
Samsung: manufacturer, proprietor of the Galaxy S III smartphone. Also responsible for designing the Exynos CPU used in the International variant of the SGSIII.
ARM: processor intellectual-property company; they essentially own the IP rights to the ARM architecture. The ARMv7 architecture is what many processors are based upon at the root; this includes the Exynos by Samsung and the Krait S4 by Qualcomm, as used in the SGSIII as well as many others. It's like the basic foundation, with the A9 and A15 feature sets being "options" that Samsung and Qualcomm add on.
Qualcomm: like Samsung, a manufacturer of processors; their contribution here is the S4 Krait CPU used in the US/Canadian-market SGSIII smartphone.
CPU: processor, central processing unit; the number-crunching heart of your phone. We are interested in two here: Samsung's Exynos and Qualcomm's Krait.
As most everyone knows by now, the EU and US variants of the SGSIII come with two different CPUs. The EU has the Samsung Exynos, the US the Qualcomm S4 Krait. One major reason, if not the only reason I am aware of, is the inability of the Exynos to be compatible with LTE radio hardware. Qualcomm's S4 Krait, however, has the radio built into the package. It's an all-in-one design, whereas the Exynos is a discrete CPU and has to depend on secondary hardware for network connectivity. Obviously there are power implications any time you add additional hardware, because of redundancy and typical losses.
However the scope of this thread is to point out some differences between the two very different cpu's so that you, the consumer, can make an educated decision based on more than a popularity contest or the "moar corez is bettar!" stance.
Anyone who is into computers knows fairly well that "core counting" as a determination of performance is risky at best. Just as with the megahertz wars of the 1990s....hopefully by now you all know not every 2GHz CPU is the same, and not every CPU core is the same. You cannot expect an Intel 2GHz CPU to perform the same as an AMD 2GHz CPU. It's all about architecture.
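The "not every 2GHz CPU is the same" point can be sketched with back-of-the-envelope throughput math: useful throughput is roughly cores-in-use × clock × IPC, and IPC differs per architecture. The IPC values below are made-up placeholders, not measured figures for any real chip.

```python
# Back-of-the-envelope: throughput ~ cores_used x clock x IPC.
# IPC values are hypothetical illustrations, not benchmark results.
def throughput(cores_used, ghz, ipc):
    return cores_used * ghz * ipc

quad_old = throughput(4, 1.4, 1.0)   # four narrower cores, all busy
dual_new = throughput(2, 1.5, 1.6)   # two wider cores, all busy
single_old = throughput(1, 1.4, 1.0) # same chips, single-threaded workload
single_new = throughput(1, 1.5, 1.6)
print(quad_old > dual_new, single_new > single_old)  # True True
```

The quad only wins when software keeps all four cores busy; on a single-threaded workload the wider core wins, which is exactly why core counting alone misleads.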
Architecture for the purpose of this thread is limited to the ARMv7 architecture and more specifically the A9 and A15 subsets of the architecture. Each architecture supports certain features and instruction sets. Additionally the internal physical parts of the core vary from one architecture to the next.
A9 is older technology in general while A15 is much newer. Exynos is A9 based, Krait S4 is A15 based. Lets look at the differences.
When looking at the two, one must understand that some of the available documentation compares what existed at the time it was written. In most cases the A9 info is based on the 40nm manufacturing process. Samsung's Exynos is built using the newer 32nm HKMG process. The Qualcomm S4 Krait is built on a different, smaller TSMC 28nm process. Generally speaking, the smaller the process, the less heat and power loss. There is also power leakage, etc......not going to get into it because frankly, I haven't read enough to speak to it much. But don't take my word for it.
There is a lot of information out there but here are a few links to good information.
Exynos 32nm Process Info
Qualcomm S4 Krait Architecture Explained
Ti A15 White Papers
ARM Cortex A9 Info
ARM Cortex A15 Info
Samsung Exynos 4412 Whitesheet
Exploring the Design of the A15 Processor
I could link you to all sorts of web benchmarks and such, but to be honest, none of them are really complete and I have not yet found one that can really give an unbiased, apples-to-apples comparison. As mentioned previously, most of them compare S4 Krait development hardware to the older 40nm Samsung Exynos hardware......which really doesn't represent what is in the SGSIII smartphones.
Now a few takeaways that stood out to me from my own research. If you are unable to read someone's opinion without getting upset, please don't read on from here.
The Exynos EU variant, which does not support LTE, is on paper going to use more power and create more heat, due to it needing to rely on additional hardware for its various functions, where the S4 Krait has the radio built in. This remains to be seen, but battery life would be the biggest implication here. Although Samsung's latest 32nm HKMG process certainly goes a long way towards leveling the playing field.
The Exynos variant is built on older A9 core technology and, comparing feature sets, does not support things such as virtualization. Do you need VT on your phone? Only if the devs create an application for it, but I believe the ability to dual-boot different OSes is much easier with VT available.
In contrast, the S4 Krait core does support this feature. I would like to see about dual-booting Windows Phone 8 and Android, and I hope having the hardware support and additional RAM (the EU version has 1GB of RAM, the US has 2GB) will help in this area. Actual VT implementation may be limited in usefulness, to be seen.
The S4 Krait/Adreno 225 package supports DirectX 9.3, a requirement for Windows RT/Windows Phone 8 (not sure if required for the Phone version). In contrast, the Exynos Quad/Mali-400 does not support DirectX 9.3 and may or may not be able to run Windows RT/Windows Phone 8 as a result. From what I understand, Windows Phone 8 may be an option.
Code compiled for the A9 derived Exynos has been around for quite some time as opposed to A15 feature code. I really don't know much about this myself, but I would expect that the A15 based solution is going to have much longer legs under it since it supports everything the Exynos Quad does plus some. My expectation is that with time the code will be optimized for the newer A15 architecture better where the Exynos A9 is likely much more mature already. It could be we see a shift in performance as the code catches up to the hardware capabilities. Our wonderful devs will be a huge influence on this and where it goes. I know I would want to develop to take advantage of the A15 feature sets because they are in-addition-to the A9 feature sets that they both support.
My hope is that anyone who is trying to make a good purchasing decision is doing so with some intent. Going with an EU SGSIII when you want to take advantage of LTE data is going to cause you heartache. It cannot and will not work on your LTE network. Likewise, if you live somewhere LTE doesn't exist, or you simply don't care to have that ability, buying the US SGSIII may not be the best choice all things considered. So in some cases the CPU might not be the gating item that causes you to choose one way or another.
Today's smartphones are powerful devices. In today's wireless world, our hardware choice is often a two-year commitment, no small thing to some. If you have specific requirements for your handset, you should know you have options. But you should also be able to make an educated decision. The choice is yours; do with it what you will.
One thing you won't see me doing is using pointless synthetic benchmarks to justify a purchase or position. I use my device heavily but not for games. Web browsing, multiple email, etc......I use LTE heavily. A device can get a bajillion points in <insert your choice of synthetic benchmark> but that doesn't make me any more productive. Those into gaming will probably be more comfortable with the EU Exynos Quad, I'll just say that up front....it has a stronger GPU, but this isn't about that. It would be nice to keep this thread about the technology, not synthetic benchmark scores.
This is not a benchmark comparison thread, as simply put in the OP. Please create a synthetic benchmark thread for synthetic benchmark comparisons. Please read the OP before commenting. I was really hoping you were going to offer more technical information to contribute as you seem to be up to date on things. I expected more than a cut and paste "me too" synthetic benchmark from you....congrats, you can now run Antutu faster....
Thanks for the info, but Qualcomm's architecture doesn't quite follow ARM's blueprint/guidelines. They made huge modifications to their first AP (Snapdragon 1) to push over 1 GHz, and it caused power-efficiency, application-compatibility and heat issues compared to Sammy's legit 1 GHz Hummingbird. For some reason, Qualcomm decided to keep improving their malfunctioning architecture (Scorpion) instead of throwing it away, and that inferiority continues through all Scorpion chips regardless of generation. Their only selling point and benefit was a one-less-chip solution, and the LTE baseband chip nowadays.
Personally, I don't think the S4 is based on the A15 architecture, and it is slower than the international Note's Exynos in many comparison benchmarks/reviews.
The Exynos in the GS3 is made on a 32nm node, which is better than the 45nm one in the Note. I don't think Sammy has figured out the ideal scheduler yet for the Android system and applications to use those four cores efficiently, but it will show a significant performance boost over coming updates, as happened with the GS2.
Sent from my SAMSUNG-SGH-I717 using xda premium
Radukk said:
Thanks for the info, but Qualcomm's architecture doesn't quite follow ARM's blueprint/guidelines. They made huge modifications to their first AP (Snapdragon 1) to push over 1 GHz, and it caused power-efficiency, application-compatibility and heat issues compared to Sammy's legit 1 GHz Hummingbird. For some reason, Qualcomm decided to keep improving their malfunctioning architecture (Scorpion) instead of throwing it away, and that inferiority continues through all Scorpion chips regardless of generation. Their only selling point and benefit was a one-less-chip solution, and the LTE baseband chip nowadays.
Personally, I don't think the S4 is based on the A15 architecture, and it is slower than the international Note's Exynos in many comparison benchmarks/reviews.
The Exynos in the GS3 is made on a 32nm node, which is better than the 45nm one in the Note. I don't think Sammy has figured out the ideal scheduler yet for the Android system and applications to use those four cores efficiently, but it will show a significant performance boost over coming updates, as happened with the GS2.
Sent from my SAMSUNG-SGH-I717 using xda premium
The fact that both CPUs are modified versions of their ARM-derived designs is captured in the OP, as is the fact that most if not all comparisons reference the 40nm Exynos as opposed to the newer 32nm process, also mentioned in the OP.
Thanks
Why would the Windows environment even matter at this point?
Isn't MS setting the hardware specs for the ARM version of the devices?
As for LTE compatibility, it's supposedly being released in the Korean market with LTE and 2GB of RAM, and this was the speculation from the beginning.
Specific discussion of the processors is different from general comparison discussion.
Thread cleaned. Please keep to the topic.
jamesnmandy said:
When looking at the two, one must understand that some of the documentation available is comparing what was available at the time they were written. In most cases A9 info is based on the 40nm manufacturing process. Samsung's Exynos is built using newer 32nm HKMG manufacturing processes. Qualcomm S4 Krait is built on a newer smaller 28nm manufacturing process. Generally speaking, the smaller the process, the less heat and power a cpu will generate because of the much denser transistor count. There is also power leakage, etc......not going to get into it because frankly, I haven't read enough to speak to it much.
Software written for the A9 derived Exynos has been around for quite some time as opposed to A15 feature code. I really don't know much about this myself, but I would expect that the A15 based solution is going to have much longer legs under it since it supports everything the Exynos Quad does plus some. My expectation is that with time the code will be optimized for the newer A15 architecture better where the Exynos A9 is likely much more mature already. It could be we see a shift in performance as the code catches up to the hardware capabilities. Our wonderful devs will be a huge influence on this and where it goes. I know I would want to develop to take advantage of the A15 feature sets because they are in-addition-to the A9 feature sets that they both support.
First of all, Samsung's 32nm HKMG process is superior and more power efficient than TSMC's 28nm being used for Krait right now, even if the feature size is slightly bigger. HKMG overall is a big step forward in reducing transistor leakage, and on top of that Samsung integrated a lot of low-level electrical-engineering tricks to lower power usage, such as load biasing. The quad-core with the last-generation architecture is toe-to-toe with the dual-core Kraits in maximum power dissipation if you normalize for battery size, as per Swedroid's tests: http://www.swedroid.se/samsung-galaxy-s-iii-recension/#batteri
And the Kraits and A15 are supposed to be much more power efficient in W/MHz, so that just shows how much of a manufacturing advantage Samsung has here.
Secondly, that paragraph about software written for the A15 is absolute hogwash. You don't write software for the A15; the compiler translates it into machine code for the instruction set, on which the A15 is equal to an A9, i.e. ARMv7-A. The only difference is the SIMD length, which is doubled/quadrupled, again something you don't really program for in most cases. The A15 is mostly IPC and efficiency improvements, not a new instruction set that would warrant different software, else nothing would be compatible.
DX9.3 compliance is senseless in that regard too: by the time we need any of that, the next generation of SoCs will be out, so I don't know why we're even bringing it up, as nobody's going to hack Windows 8 onto their Galaxy S3.
I really don't see the point of this thread, as half of it is misleading and the other half has no objective.
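The "normalize for battery size" step above is just converting a rundown time into an average power draw, so phones with different batteries can be compared. A small sketch with hypothetical capacities and runtimes, not Swedroid's actual data:

```python
# Sketch: normalizing battery rundown results for battery size.
# Average power draw = battery energy / time to drain it.
# All capacities and runtimes below are hypothetical examples.
def avg_power_w(capacity_wh, runtime_h):
    return capacity_wh / runtime_h

phone_a = avg_power_w(capacity_wh=7.98, runtime_h=5.0)   # e.g. 2100 mAh @ 3.8 V
phone_b = avg_power_w(capacity_wh=6.48, runtime_h=4.2)
print(round(phone_a, 2), round(phone_b, 2))
```

Comparing the watt figures rather than the raw runtimes removes the advantage a bigger battery gives, isolating the SoC-plus-screen power dissipation the post is talking about.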
AndreiLux said:
First of all, Samsung's 32nm HKMG process is superior and more power efficient than TSMC's 28nm being used for Krait right now, even if the feature size is slightly bigger. HKMG overall is a big step forward in reducing transistor leakage, and on top of that Samsung integrated a lot of low-level electrical-engineering tricks to lower power usage, such as load biasing. The quad-core with the last-generation architecture is toe-to-toe with the dual-core Kraits in maximum power dissipation if you normalize for battery size, as per Swedroid's tests: http://www.swedroid.se/samsung-galaxy-s-iii-recension/#batteri
And the Kraits and A15 are supposed to be much more power efficient in W/MHz, so that just shows how much of a manufacturing advantage Samsung has here.
Secondly, that paragraph about software written for the A15 is absolute hogwash. You don't write software for the A15; the compiler translates it into machine code for the instruction set, on which the A15 is equal to an A9, i.e. ARMv7-A. The only difference is the SIMD length, which is doubled/quadrupled, again something you don't really program for in most cases. The A15 is mostly IPC and efficiency improvements, not a new instruction set that would warrant different software, else nothing would be compatible.
DX9.3 compliance is senseless in that regard too: by the time we need any of that, the next generation of SoCs will be out, so I don't know why we're even bringing it up, as nobody's going to hack Windows 8 onto their Galaxy S3.
I really don't see the point of this thread, as half of it is misleading and the other half has no objective.
Click to expand...
Click to collapse
I am happy to make corrections when unbiased data is presented. I will look into some of your claims myself and update accordingly, but as mentioned in the OP, if you would like to cite specific sources for anything, please include links. Thank you for your input. The entire point of the thread is to document the differences, because a lot of people seem to look at the choice as simply 4 cores versus 2, and in similar fashion they gravitate to the bigger number without understanding what they are buying into. Some of your statements claim "hogwash"; as mentioned, I am learning myself and hope to rid the post of any hogwash ASAP. I for one will be trying to get Windows Phone 8 to boot on it if possible; I tried to clarify that in the OP, while Windows 8 RT certainly looks to be a stretch. Thanks
Sent from my DROIDX using xda premium
AndreiLux said:
Which test in that review shows the 32nm 4412 being more efficient than the 28nm 8260 in the One S?
VT has nothing to do with dual-booting. That only requires a bootloader and, of course, operating systems that support sharing the storage.
It might, with a lot of work, allow Android to run under WP8, or WP8 to run under Android, in a virtual machine.
The most interesting feature you could achieve with VT is to have two copies of the same operating system running, each with its own data, cache and storage partitions. This would allow corporate BYOD to remain more-or-less secure and enforce corporate policies on Exchange clients without requiring the user's private part of the phone to be affected by these restrictions.
However, after years of development, 3D performance under virtualization on x86 (desktop) platforms is mediocre at best, with Microsoft Hyper-V being, imho, the current winner.
Additionally, you claim that WP8 will work on the Qualcomm chip.
This is simply not true: it MIGHT work if a build for that exact SoC is produced (somewhat unlikely).
Since WP8 is closed source and the hardware support is proprietary binary too, it won't be portable.
This is because ARM is not, like x86, a platform with extensible capabilities and plug-and-play support, but rather an embedded ecosystem where the software has to be brought up for each device individually.
nativestranger said:
Which test on that review shows the 32nm 4412 being more efficient than the 28nm 8260 on the one s?
You have the full-load test and the temperatures in the link that I posted. Normalize them for battery size, for example to the Asus PadFone (or the One S for that matter; they are similar in their results) at 3.7V*1520mAh = 6.1Wh and the S3 at 3.8V*2100mAh = 7.98Wh, a 30.8% increase. Normalize the S3's 196 minutes by that and you get 149 minutes. Take into account that the S3's screen is bigger and higher resolution, and the result will be even more skewed towards the S3. So basically a four-core last-generation chip at full load on all four cores is arguably toe-to-toe in maximum power dissipation with a next-generation dual core. The latter should have been the winner here by a large margin, but it is not. We know it's not due to architectural reasons, so the only thing left is manufacturing. HKMG brings enormous benefits in terms of leakage, and here you can see them.
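For what it's worth, the normalization above is simple arithmetic to reproduce. The sketch below uses the Wh figures exactly as quoted in the post (strictly, 3.7 V x 1520 mAh is about 5.62 Wh, so the quoted 6.1 Wh presumably assumes a higher average cell voltage):

```python
# Reproduce the battery-size normalization from the post above, using the
# watt-hour figures as quoted there.

one_s_wh = 6.1    # One S / PadFone battery, as quoted in the post
s3_wh    = 7.98   # Galaxy S3 battery: 3.8 V * 2100 mAh

ratio = s3_wh / one_s_wh            # ~1.308 -> S3 battery ~30.8% larger
s3_runtime_min = 196                # S3 full-load runtime from the review
normalized = s3_runtime_min / ratio # what the S3 would manage on 6.1 Wh

print(round((ratio - 1) * 100, 1))  # 30.8
print(round(normalized))            # 150
```

The rounded result (150 min vs the post's 149) differs only because the post truncated the ratio.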
d4fseeker said:
Changed text to read
From what I understand Windows Phone 8 may be an option.
and
Actual VT implementation may be limited in usefulness, to be seen.
TSMC is struggling with their 28nm node and failed to bring up yields with a High-K Metal Gate process, so they announced they will keep 28nm SiON for now. The problem is that as the node becomes denser, the leakage rate increases geometrically. HKMG itself reduces that leakage by a factor of roughly 100, as stated by GlobalFoundries and Samsung. That is why 32nm HKMG is far superior to 28nm SiON. You can easily find related articles, PR data and detailed charts about this.
Sent from my SAMSUNG-SGH-I717 using xda premium
Radukk said:
Will update with links when I can; kinda busy with work. If you have links to share, please do.
Sent from my DROIDX using xda premium
jamesnmandy said:
http://en.wikipedia.org/wiki/High-k_dielectric
http://www.chipworks.com/en/techni...resents-dualquad-core-32-nm-exynos-processor/
Sent from my SAMSUNG-SGH-I717 using xda premium
Radukk said:
Interesting reading. Thanks! :thumbup:
Sent from my GT-I9300 using xda premium
Radukk said:
For future reference, I would never use Wikipedia as a source of fact. Thanks for the other link; will update as soon as I can.
Sent from my DROIDX using xda premium
jamesnmandy said:
You know, there's always the sources at the bottom of every Wikipedia article...
AndreiLux said:
You know, there's always the sources at the bottom of every Wikipedia article...
You are of course correct, which is why I always drill down and link to the sources, not the article. Just personal preference, I suppose, but this isn't only my idea; linking to Wikipedia as a source of fact is generally frowned upon.
no worries

[SAMSUNG] To Unveil [8-CORE] ARM Chip

Eight cores, in a mobile processor? Balderdash! But according to EETimes, that's just what Samsung's planning on unveiling in February at the International Solid-State Circuits Conference (that sounds so exciting).
Now before you get too excited, this isn't - technically speaking - an eight-core processor. It's a dual quad-core, which is to say, a two-processor chip. The design is based on a reference architecture thought up by ARM themselves, dubbed "big.LITTLE," and is designed to combine the light-load battery life of a high-efficiency quad-core 28nm ARM A7 chip with a super-hi-po A15 processor for heavy lifting. The exact specifications, for our nerdier readers, are: 1 quad-core ARM A7 chip clocked at 1.2GHz for everyday tasks, and 1 quad-core ARM A15 chip clocked at 1.8GHz w/ 2MB L2 cache for processor-intensive tasks like video games.
ARM itself has said the big.LITTLE project is delivering benefits beyond those expected when the architecture was initially announced, and Samsung's chip should be the first on the market based on the concept. So yes, this will be a new Exynos of some sort.
Should you expect this chip in the Galaxy S IV (or whatever Samsung's going to call it - because that's far from a given)? It's possible, but not necessarily likely. The gap between chip announcement and tape-out (mass-production readiness) can be lengthy. With the first batch of Exynos 5 Dual devices just now hitting the market in the form of the new Samsung Chromebook and Nexus 10, this eight-core beast may not be ready in time for the next "next big thing." Samsung could very well specifically be targeting this chip for Chromebooks and Windows RT / Android tablets before taking a dive into smaller form factors, too.
Either way, it's exciting business - I can't say I ever tire of technology getting faster.
To be honest, lately I have started to lose interest in Samsung due to the whole Exynos issue and the lack of support for developers, but if this is true then I'd feel comfortable making my next device a Samsung (only with this chip, of course). Let's hope we see this chip come to more devices if it is in fact released. We will have to wait and see what Samsung brings us in 2013 to decide whether our loyalty to Samsung is actually worth it.
Courtesy of Android Police.
What relevance does this have to the S3? None. Keep it in the General section, not the S3 General section...
Sent from my GT-I9300 using xda premium
Not excited. Most apps today are optimized for single and dual cores, rarely quad. And now octa??
Well, if you read it, it's not a true 8-core processor. It is two CPUs, both quad-core: one an A15 @1.8GHz and the other an A7 @1.2GHz.
Sent from my SGH-I897 using xda premium
Yes, and Intel is working on a 48-core chip.
Also, it doesn't matter if the application you are running isn't designed for multiple cores; most of the time this isn't even possible. People keep forgetting that one application isn't EVERYTHING that runs on a CPU: there are a lot of processes running at the same time, and they benefit from a multi-core architecture.
Someone please move this useless thread.
Sent from my GT-I9300 using xda app-developers app
b-eock said:
It's still an octa-core if you want to be anal about the definition; one of the kernel hacks on devices with this SoC will be to run all cores in asymmetric multiprocessing mode.
But anyway, the timing coincides with the 5450 rumors we've been hearing. Either they have two discrete quad A15 SoCs, or they're both the same thing.
AndreiLux said:
Wow, that sounds super exciting. Great piece of info. Now it's time for Google to really optimize Android for multi-core processors.
《Samsung rom》
---------- Post added at 08:21 PM ---------- Previous post was at 08:19 PM ----------
sfjuocekr said:
The 48-core chip Intel is working on now started out as a multi-core GPU: they began designing it as the Larrabee GPU, which later turned into the product that is released right now. Sadly, it's not destined for the desktop but for high-performance computing.
《Samsung rom》

CPU/Processor Showdown - HTC One vs Galaxy S4

Which processor will be better, the Exynos 5 Octa or a simple Snapdragon 600 quad?
From my POV, the Octa will be useless since it will be a battery hog and no apps really use that many cores or that much power. The S600 will be more efficient for day-to-day use since it consumes less power and will actually be used.
-.-.-.-.-.-.-.-.-
Sent from a dark and unknown place
Galaxy Tab 2 7.0 P3100
I thought the s4 had the same processor as the One, but it was clocked to 1.9? I could be wrong. I wasn't really paying attention.
Sent from my HTC One using Tapatalk 2
I'd imagine this thread will get closed.
In the meantime, read this thread and then make a judgement because the "it uses more power so it sucks" mentality is just simply incorrect.
[Info] Exynos Octa and why you need to stop the drama about the 8 cores
AndreiLux said:
Misconception #1: Samsung didn't design this, ARM did. This is not some stupid marketing gimmick.
Misconception #2: You DON'T need to have all 8 cores online, actually, only maximum 4 cores will ever be online at the same time.
Misconception #3: If the workload is thread-light, just as we did hot-plugging on previous CPUs, big.LITTLE pairs will simply remain offline under such light loads. There is no wasted power with power-gating.
Misconception #4: As mentioned, each pair can switch independently of the other pairs. It's not the whole cluster that switches between A15 and A7 cores. You can have a single A15 online together with two A7's, while the fourth pair is completely offline.
Misconception #5: The two clusters have their own frequency planes. This means A15 cores all run on one frequency while the A7 cores can be running on another. However, inside of the frequency planes, all cores run at the same frequency, meaning there is only one frequency for all cores of a type at a time.
Addition: I am not a Samsung fanboy by any means, however, the amount of incorrect information floating around about both of these flagships is starting to get annoying.
2nd addition: Read this as well, the big.LITTLE technology being used in the Octa is pretty amazing: big.LITTLE Processing
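The pairing rules in the quoted post are easy to model. The sketch below is purely illustrative; it mimics the stated constraints (each of the 4 pairs is an A15, its A7 twin, or power-gated off), not Samsung's actual switcher code:

```python
# Toy model of big.LITTLE pair switching as described in the quoted post:
# 4 core pairs, each pair is either its A15, its A7 twin, or fully off,
# so at most 4 cores are ever online at the same time.

from collections import Counter

VALID = {"A15", "A7", "off"}

def online_cores(pairs):
    """pairs: list of 4 states, one per big.LITTLE pair."""
    assert len(pairs) == 4 and all(p in VALID for p in pairs)
    return Counter(p for p in pairs if p != "off")

# One A15 online together with two A7s, fourth pair power-gated
# (misconception #4 in the quote):
state = online_cores(["A15", "A7", "A7", "off"])
print(dict(state))          # {'A15': 1, 'A7': 2}
print(sum(state.values()))  # 3  (never more than 4)
```

The model makes the "8 cores but max 4 online" point concrete without implying anything about scheduling policy.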
I hope that the overclocking or higher clock rate doesn't produce Moment-esque results.
Alsybub said:
In the US that is true: they are both S600s, with the S4 having a 0.2GHz higher clock speed. Many of the other S4s will have the Exynos Octa chip.
crawlgsx said:
Ah. I see. Different hardware for different regions. Like the One X.
Even though it's eight cores, it is probably complete overkill. Yet another bigger number to put on the marketing material. How many apps will actually use that? How many apps use four cores at the moment?
There have been some articles about multiple cores being more for the point of sale than for the end user. Even if you're signing up for a contract right now, I doubt much would be making use of it in two years' time. So the future-proofing argument is moot.
It'll be interesting to see. Of course the Galaxy builds of Android will use the cores; with things like the stay-awake feature and PiP it is useful. Outside of the OS I can't see it being necessary.
Sent from my Transformer Prime TF201 using Tapatalk HD
The "octa" core processor is complete bullsh*t. Imo, 2/4 cores are perfectly fine as long as they optimize it and perfect the hardware, why stack 8 cores when only 4 work at one time and no app will use all that power.
They should've focused on design to make it look less like a toy phone and use better finish, instead.
Oh the marketing..
Not HTC or whatever fanboy, just stating my opinion.
rotchcrocket04 said:
Very good read, thanks for taking the time to post it. Surprised no-one has mentioned that we need this in our Ones. Would certainly help with the battery.
Saying it's an 8-core CPU is marketing, simply put.
As has been said, only 4 out of the 8 cores will ever be enabled at once, max.
The GPU in the Octa might be better than the Adreno 320, but we'll have to wait for benchmarks.
Nekromantik said:
Benchmarks show the Adreno 320 keeps up nicely. You won't see any real-world differences besides a slightly lower benchmark score.
http://forum.xda-developers.com/showthread.php?t=2191834
Sent from my ADR6425LVW using xda app-developers app
Squirrel1620 said:
Those are from the S600 version.
The higher clock speed and Android 4.2 will mean it's slightly ahead.
No benchmarks from the Octa version yet.
Nekromantik said:
I'll just stick with the One and wait for the 4.2 update. By then we should have custom kernels to overclock it ourselves.
Sent from my ADR6425LVW using xda app-developers app
Here you go
Nekromantik said:
"Octa" is not gimmicky or for marketing.
Octa is the name of the SoC, and how it was named is nothing wrong
There are 3 implementations can be used, and one with maximum 8 cores running at the same time.
GS4 doesn't use that impletations, but it does not mean the SoC cannot be "Octa". You have a house with 8 rooms but you know to open or you wanna open 4 rooms only, the house is still an 8-room house.
hung2900 said:
"Octa" is not gimmicky or for marketing.
Octa is the name of the SoC, and how it was named is nothing wrong
There are 3 implementations can be used, and one with maximum 8 cores running at the same time.
GS4 doesn't use that impletations, but it does not mean the SoC cannot be "Octa". You have a house with 8 rooms but you know to open or you wanna open 4 rooms only, the house is still an 8-room house.
Click to expand...
Click to collapse
How do you know all 8 can run at the same time? Has Samsung demonstrated that already? Any links?
Also, what would the speed be if all 8 were running at the same time?
Also, did you see that an Intel dual core @2GHz beat the Exynos Octa in benchmarks? So all 8 cores running at a slower speed might not actually be very good. It might even slow things down further...
We recently demonstrated a dual core running at 3GHz at MWC in Barcelona. That chip was able to load games at crazy speeds: a game that took 15s to load on the existing Exynos quad-core loaded in just 6s with our chip!
joslicx said:
And it used 3 times the energy to do it... Was that tested at all?
backfromthestorm said:
It's all about bragging rights, really. Same as Samsung is doing with regard to the Octa.
The chip that could run at 3GHz could also very well run at 1GHz at just 0.6V (consuming far less power than anything else on the market). A dual core at 1GHz is still good enough for all mundane tasks like playing videos or browsing the internet. So in practice it would have been a very efficient solution. It was a real innovation, really. Sadly, the company did not have the money to pour more funds into the program and has shut it down.
It was demonstrated at Mobile World Congress in Barcelona in February this year.
Anyway, the point is, we did not need an extra set of power-efficient cores like Samsung is using. We ran the same cores that could hit crazy high speeds and an even crazier power-efficient mode! That's a very neat solution.
Here's a press link: http://www.itproportal.com/2013/02/25/mwc-2013-exclusive-dual-core-st-ericsson-novathor-l8580-soc-crushes-competition-benchmarks/
To quote the article:
A continuous running test monitored by an infra-red reader showed that the 3GHz prototype smartphone remained cooler as it uses less energy and in some scenarios, it could add up to five hours battery life in a normal usage scenario
hung2900 said:
"Octa" is not gimmicky or for marketing.
Octa is the name of the SoC, and how it was named is nothing wrong
There are 3 implementations can be used, and one with maximum 8 cores running at the same time.
GS4 doesn't use that impletations, but it does not mean the SoC cannot be "Octa". You have a house with 8 rooms but you know to open or you wanna open 4 rooms only, the house is still an 8-room house.
Click to expand...
Click to collapse
Actually, no. At least not in my opinion. Octa-core means 8 CPU cores on one chip.
I would see it like this:
You have 2 houses on your lawn, right beside each other. Each house has 4 rooms. You have to switch houses to open up the rooms, just like the Exynos "Octa" has to, since it cannot run both CPUs at the same time.
If you are in a house with 8 rooms, you cannot be in all 8 rooms at once, but you can open the doors between all the rooms and, since you're in that house, walk freely into every room. Not so with this implementation.
I wouldn't call the Exynos "Octa" an octa-core; it's a dual-CPU system with 2x4 cores, with the difference that regular desktop dual-CPU systems can use both CPUs at once, unlike the Exynos "Octa". Still, "dual quad-core system" comes closer than "pure octa-core system".
It is kind of a hybrid. Nice technology for a mobile device, but at the same time somewhat unneeded/inefficient compared to regular quad-core systems. Even the Tegra 3 approach, with 4 active cores and 1 companion core for standby tasks, seems more efficient (in terms of used die space and resources).
Ah well, let's see how the supposed, so-called "octa-core" will score in the future...
processor differences
Okay, I know both processors are Snapdragon 600s, but why is the Galaxy S4's clocked at 1.9GHz while the HTC One's is clocked at 1.7GHz? Is it just an instance of Samsung overclocking the S600, or are they different variations of the same processor? I have done some research and am unable to find a clear answer to this question, even on the Snapdragon website.
dawg00201 said:
They should be identical. I think it's just a manufacturer choice, but it could also be related to thermals or battery.
Since Samsung took the higher-frequency chips, there is the possibility that they also got the "better" bins: lower voltage for the same frequency. But that's just an assumption.
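That binning assumption can be made concrete with the usual first-order dynamic-power relation, P ≈ C·V²·f. The voltages below are purely illustrative guesses, not actual Snapdragon 600 bin figures:

```python
# Dynamic CPU power scales roughly as P ~ C * V^2 * f. If the 1.9 GHz parts
# are better-binned (same frequency at lower voltage), they can land close
# to a 1.7 GHz part in power. Voltages here are illustrative, not real data.

def rel_dynamic_power(volts: float, ghz: float) -> float:
    # Capacitance C cancels out when comparing the same die against itself.
    return volts**2 * ghz

one_17 = rel_dynamic_power(1.05, 1.7)  # hypothetical 1.7 GHz bin
s4_19  = rel_dynamic_power(1.00, 1.9)  # hypothetical better 1.9 GHz bin

print(round(s4_19 / one_17, 3))  # ~1.014: ~1% more power for ~12% more clock
```

Under these made-up numbers, a 50 mV lower supply almost entirely pays for the extra 200 MHz, which is why binning speculation like the post above is at least plausible.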
