Samsung Exynos 5250 - Arndale development board. - Android Software/Hacking General [Developers Only]

For all those interested in developing for the Exynos 5250, to be used in the Nexus 10, Samsung have kindly launched, for a modest sum, the Arndale development board.
http://www.arndaleboard.org/wiki/index.php/Main_Page
It has already been benchmarked on the GLBenchmark site; the Mali-T604 is powerful, but it doesn't look like it will give the A6X any headaches.
http://www.glbenchmark.com/phonedet...o25&D=Samsung+Arndale+Board&testgroup=overall

We need a proper benchmark on a final device. I definitely can't imagine the dev board drivers are optimized properly, and it's running 4.0.4. We should get more details once we benchmark a final version of the N10.

hot_spare said:
We need a proper benchmark on a final device. I definitely can't imagine the dev board drivers are optimized properly, and it's running 4.0.4. We should get more details once we benchmark a final version of the N10.
Jelly Bean didn't do much for graphics benchmarks, IIRC. The low-level tests won't change much. If you do a comparison with the iPad 3, you can see that the PowerVR SGX MP4 is a beast in terms of pixel / texture fill rate, which the A6X will improve further. The consensus is that shader power is the most important, as long as there is sufficient fill-rate performance, and the Mali T-604 combined with its good bandwidth should be as capable as the A6X in real-world games; the only question is whether developers will optimise a game for just one tablet.

Turbotab said:
Jelly Bean didn't do much for graphics benchmarks, IIRC. The low-level tests won't change much. If you do a comparison with the iPad 3, you can see that the PowerVR SGX MP4 is a beast in terms of pixel / texture fill rate, which the A6X will improve further. The consensus is that shader power is the most important, as long as there is sufficient fill-rate performance, and the Mali T-604 combined with its good bandwidth should be as capable as the A6X in real-world games; the only question is whether developers will optimise a game for just one tablet.
I am not saying that JB will suddenly improve GPU benchmarks, but a lot of improvement can happen due to driver/firmware optimization.
Let me give you a real example: do you recall the GLBenchmark Egypt offscreen scores of the GS2 when it first came out? It was getting around 40-42 fps.
[Source: http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17 ]
The same GS2 was getting 60-65 fps in the same test a few months later.
Source 1: http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
Source 2: http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
That's a clear ~50% improvement in performance, achieved primarily through driver optimization.
Also, look carefully at the fill rate in the Arndale board test. It's much lower than expected: ARM says that a Mali-T604 clocked at 500 MHz should get a fill rate of 2 GPixels/s, yet it's showing only about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Also check this slide : http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Samsung quotes 2.1 GPixels/s with the GPU clocked at 533 MHz. Obviously the measured results don't match the quoted numbers, and the difference is substantial.
I believe the final Nexus 10 numbers will be quite different from what we see now. Let's wait for final production models.
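For anyone wanting to sanity-check those fill-rate numbers, here's a quick back-of-the-envelope calc. It assumes the usual rule of thumb behind ARM's figure, i.e. the T604's four cores each writing one pixel per clock; the "measured" value is just a placeholder to plug a GLBenchmark fill-rate result into, not a number taken from the Arndale run.
Code:
public class FillRateCheck {
    public static void main(String[] args) {
        int cores = 4;                    // Mali-T604 shader cores
        double pixelsPerClockPerCore = 1; // assumed: 1 pixel written per core per clock
        double clockHz = 533e6;           // Samsung's quoted GPU clock

        // Theoretical peak fill rate in pixels per second
        double theoretical = cores * pixelsPerClockPerCore * clockHz;
        System.out.printf("Theoretical: %.2f GPixels/s%n", theoretical / 1e9);

        // Plug in the measured GLBenchmark fill-rate result (placeholder value)
        double measured = 1.3e9; // hypothetical figure, roughly 60% of peak
        System.out.printf("Measured is %.0f%% of theoretical%n",
                100.0 * measured / theoretical);
    }
}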

Related

[Q] Benchmark Scores

So we all know the Nexus S has a 1Ghz Cortex A8 Hummingbird CPU, which sounds unimpressive considering the Nexus One has a 1Ghz Snapdragon QSD 8250, but it's a known fact that clock speed often has little to do with actual computational power. Qualitative previews have said that the Nexus S "flies," but I'd like to see something more in the numbers. If anyone has a demo device, could you run a few benchmarks? Or perhaps comment on performance after quick opening/closing several computationally intensive applications?
QuacoreZX said:
So we all know the Nexus S has a 1Ghz Cortex A8 Hummingbird CPU, which sounds unimpressive considering the Nexus One has a 1Ghz Snapdragon QSD 8250, but it's a known fact that clock speed often has little to do with actual computational power. Qualitative previews have said that the Nexus S "flies," but I'd like to see something more in the numbers. If anyone has a demo device, could you run a few benchmarks? Or perhaps comment on performance after quick opening/closing several computationally intensive applications?
The 1 GHz Hummingbird is as fast as the Galaxy S and iPhone 4, both of which are like 30% or more faster than Snapdragon.
I think the key improvement is in graphics performance. Here is a comparison.
QuacoreZX said:
So we all know the Nexus S has a 1Ghz Cortex A8 Hummingbird CPU, which sounds unimpressive considering the Nexus One has a 1Ghz Snapdragon QSD 8250, but it's a known fact that clock speed often has little to do with actual computational power. Qualitative previews have said that the Nexus S "flies," but I'd like to see something more in the numbers. If anyone has a demo device, could you run a few benchmarks? Or perhaps comment on performance after quick opening/closing several computationally intensive applications?
The S5PC11x (Hummingbird) has 2x the memory bandwidth of the MSM8250.
The MSM8250 gets about 2x the floating point performance of the S5PC11x.
I believe the SGX540 GPU in S5PC11x is on the whole a bit faster than the GPU in the 8250, but I don't have hard numbers on that in front of me. They're architecturally different GPUs and will have different strengths and weaknesses.
It's really hard to do a good apples to apples comparison of different SoCs -- memory interconnect, cache sizes, ARM architecture version, GPU, etc, etc all play into overall system performance.
Gingerbread, overall, tends to be faster than Froyo on the same hardware.
Not really too familiar with this stuff, but will the JIT compiler being optimized for the Snapdragon instruction set still make a huge difference? My Vibrant plays games way better than the MT4G (imo) but scores terribly on Linpack and is terribly slow at opening applications and such vs. the MT4G.
Read the post above you. Linpack is mainly a benchmark for numerical performance (floating point etc.), where the Snapdragon chips are MUCH better.
But the Hummingbird's (PowerVR) GPU is better than the Adreno GPU found in the Snapdragon line. That's why the gaming performance of your Vibrant is better than the MT4G's.
Ronaldo_9 said:
The 1 GHz Hummingbird is as fast as the Galaxy S and iPhone 4, both of which are like 30% or more faster than Snapdragon.
PhoenixFx said:
I think the key improvement is in graphics performance. Here is a comparison.
Yup, just anecdotally, hummingbird is MUCH faster than snapdragon IMHO
galaxyS/NS SGX540= 90 million triangles/sec
HTC G2 Adreno 205 =44 million triangles/sec
Nexus one = Adreno 200 = 22 million triangles/sec
The Nexus S is running the fastest GPU out now. Another good thing about running a PowerVR GPU is that the iPhone uses one too, so when a lazy iPhone port happens you will get better performance on that GPU than you would on Adreno.
I've noticed this especially with Gameloft games.
Trust me, I'm on a Vibrant and came from a Nexus One; without a doubt the Nexus S GPU smokes the Nexus One GPU and even outperforms the 2nd-gen Snapdragon.
Hummingbird > all atm.
Orion will be the same.
Don't make assumptions about the dual-core chips; Orion has good competition from the TI OMAP line. Qualcomm looks like they'll stay behind GPU-wise though.
Plus the Sound Quality of the Hummingbird chip is awesome. MUCH better than the Snapdragon chips.
Also, you have to be cautious of manufacturer specs for GPU pixels/sec and triangles/sec -- the "box numbers" are always under optimal conditions and often not representative of real workloads.
For modern non-fixed-pipeline GPUs (GL ES 2.x, etc.), compute capabilities (how many shader ops per pixel you can get away with) factor in as well.
Depending on what your workload is like (geometry heavy? fill heavy? texture heavy? shader heavy?) you will see different strengths and weaknesses when comparing GPUs.
All that said, the SGX540 is indeed quite snappy.
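To make that concrete, here's a rough sketch of why a single spec number can't predict frame rate: a frame is limited by whichever stage is slowest, so a chip with a huge triangles/sec figure can still be held back by fill rate on a fill-heavy scene, and vice versa. All the spec and workload numbers below are made up for illustration; they aren't measurements of any real GPU or game.
Code:
public class BottleneckSketch {
    public static void main(String[] args) {
        // Hypothetical "box number" specs for two imaginary GPUs (illustrative only)
        double[] triRate  = {90e6, 44e6};   // triangles per second
        double[] fillRate = {1.0e9, 2.0e9}; // pixels per second

        // Two made-up per-frame workloads: {triangles, pixels written}
        double[][] scenes = {
            {500_000, 1_500_000},  // geometry-heavy scene
            { 50_000, 7_500_000},  // fill-heavy scene
        };
        String[] names = {"geometry-heavy scene", "fill-heavy scene"};

        for (int s = 0; s < scenes.length; s++) {
            System.out.println(names[s] + ":");
            for (int g = 0; g < 2; g++) {
                double geomTime = scenes[s][0] / triRate[g];
                double fillTime = scenes[s][1] / fillRate[g];
                // The frame can only finish when the slowest stage is done
                double fps = 1.0 / Math.max(geomTime, fillTime);
                System.out.printf("  GPU %d: ~%.0f fps%n", g + 1, fps);
            }
        }
    }
}
The two imaginary GPUs swap places depending on the scene, which is exactly why "box numbers" alone don't settle these arguments.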
chip
I agree the sound chip is good in the NS, as is the GPU

[INFO] Mali-400MP GPU vs Adreno 220 GPU

Mali-400 MP is a GPU (Graphics Processing Unit) developed by ARM in 2008. Mali-400 MP supports a wide range of use from mobile user interfaces to smartbooks, HDTV and mobile gaming. Adreno 220 is a GPU developed by Qualcomm in 2011 and it is a component of the MSM8260 / MSM8660 SoC (System-on-Chip) powering the upcoming HTC EVO 3D, HTC Pyramid and Palm’s TouchPad tablets.
Mali™-400 MP
Mali™-400 MP is the world's first OpenGL ES 2.0 conformant multi-core GPU. It provides support for vector graphics through OpenVG 1.1 and 3D graphics through OpenGL ES 1.1 and 2.0, thus providing a complete graphics acceleration platform based on open standards. Mali-400 MP is scalable from 1 to 4 cores. It also provides the industry-standard AMBA® AXI interface, which makes the integration of Mali-400 MP into SoC designs straightforward. This also provides a well-defined interface for connecting Mali-400 MP to other bus architectures. Further, Mali-400 MP has a fully programmable architecture that provides high-performance support for both shader-based and fixed-function graphics APIs. Mali-400 MP has a single driver stack for all multi-core configurations, which simplifies application porting, system integration and maintenance. Features provided by Mali-400 MP include advanced tile-based deferred rendering and local buffering of intermediate pixel states that reduce memory bandwidth overhead and power consumption, efficient alpha blending of multiple layers in hardware, and Full Scene Anti-Aliasing (FSAA) using rotated-grid multisampling that improves graphics quality and performance.
Adreno 220
In 2011 Qualcomm introduced the Adreno 220 GPU as a component of their MSM8260 / MSM8660 SoC. Adreno 220 supports console-quality 3D graphics and high-end effects such as vertex skinning, full-screen post-processing shader effects, dynamic lighting with full-screen alpha blending, real-time cloth simulation, advanced shader effects like dynamic shadows, god rays, bump mapping, reflections, etc., and 3D animated textures. Qualcomm also claims that the Adreno 220 GPU can process 88 million triangles per second and offers twice the processing power of its predecessor, the Adreno 205. Further, the Adreno 220 GPU is claimed to boost performance to a level that is competitive with console gaming systems. It should also allow running games, UI, navigation apps and the web browser on the largest display sizes at the lowest power levels.
Difference Between The Two
Difference between Mali-400MP GPU and Adreno 220 GPU
Based on research done by Qualcomm using an average of industry benchmarks composed of Neocore, GLBenchmark, 3DMM and Nenamark, they claim that the Adreno 220 GPU in Qualcomm's dual-core Snapdragon MSM8660 offers twice the performance of the GPU in other leading dual-core ARM-based chips. The site AnandTech has also run several tests on the Adreno 220 GPU. One of them was GLBenchmark 2.0, which records the performance of OpenGL ES 2.0 compatible devices such as those using the Mali™-400 MP through two long suites that include a combination of different effects such as direct lighting, bump, environment and radiance mapping, soft shadows, vertex-shader-based texturing, deferred multi-pass rendering, texture noise, etc., and the test showed the Adreno 220 GPU being up to 2.2 times faster than other existing devices such as those with the Mali-400 MP GPU.
What do you guys think of this?... I've been with HTC since they started doing Android... and I have to say Android has come a long way and so has the hardware...
thanks
Thanks, I've been looking for feedback regarding the Adreno 220 vs Mali 400 GPUs; do you have any link or source to back this up?
I am looking forward to buying the Sensation, and my only concern is the Adreno 220 GPU and whether it is better, equal, or worse than the Mali 400.
If it is better then I'm definitely buying the Sensation; if not then I might consider the Galaxy S II.
Thanks
Interesting read... the SGS2 fans will debate this and say that the benchmarks were done with an overclocked processor... so as of now it's all just a good read.
I don't know if this video helps, but the GPU upscaling and running on a larger screen with a higher resolution without falling behind is kind of amazing.
http://www.youtube.com/watch?v=RBBMVc9-fuk
boostedb16b said:
I don't know if this video helps, but the GPU upscaling and running on a larger screen with a higher resolution without falling behind is kind of amazing.
http://www.youtube.com/watch?v=RBBMVc9-fuk
Nice post! I'm completely impressed. Amazing that that was being upconverted from a friggin' cell phone to a large HDTV!
I'm sold!
Another thing is that Sony went with the Adreno 220 in the Xperia Play, so I guess it is as good as they say it is.
boostedb16b said:
Another thing is that Sony went with the Adreno 220 in the Xperia Play, so I guess it is as good as they say it is.
Actually they went with the Adreno 205 in the Xperia Play, which is the same GPU as in the Desire HD.
Sent from my HTC Desire HD using XDA Premium App
boostedb16b said:
Another thing is that Sony went with the Adreno 220 in the Xperia Play, so I guess it is as good as they say it is.
Isn't it Adreno 205 ?
http://www.gsmarena.com/sony_ericsson_xperia_play-3608.php
Adreno 220 is faster than the Mali-400MP any day, and compared to Tegra 2 it's better in some cases and worse in others. I didn't get a Galaxy S2 due to the fact that the Mali-400MP is so antiquated...
According to reports on the SGS 2 forum, the Mali 400 doesn't support textures? This seems a big blow to the SGS 2's gaming capabilities to me. What would be an interesting comparison would be the Adreno 220 vs the (LG Optimus 3D's) OMAP4 PowerVR SGX540 GPU, which is clocked at 300 MHz instead of the 200 MHz found in the Samsung Hummingbird processor...
Ian
Beaker491 said:
According to reports on the SGS 2 forum, the Mali 400 doesn't support textures? This seems a big blow to the SGS 2's gaming capabilities to me. What would be an interesting comparison would be the Adreno 220 vs the (LG Optimus 3D's) OMAP4 PowerVR SGX540 GPU, which is clocked at 300 MHz instead of the 200 MHz found in the Samsung Hummingbird processor...
Ian
The Mali 400 does support textures; it's texture compression which it does not support.
Elchemist said:
The Mali 400 does support textures; it's texture compression which it does not support.
It does support texture compression, just not the proprietary formats of other GPU vendors. Developers that only code for proprietary formats get locked in and lose out when something better comes along.
I surely was waiting for an idiot response like yours, i9100. Just face it that Samsung made a mistake with the Mali... and to be quite honest with you, the SGS2 was rushed out to take the lead on the market before the Sensation, and the CPU was overclocked to be more competitive with the Sensation... and the rush has shown what I expected: lots of unknown problems in the phone, and then returns for phones with faulty screens and so on. So don't be a troll, and state facts rather than just talking because you have the device.
Mali 400 is just an outdated chip; why someone wouldn't include texture compression in a MOBILE GPU is beyond me. What the hell were they thinking?! The PS3 only does this because it has Blu-ray-sized storage media! Ridiculous.
As stated, it DOES support texture compression. There are just a few formats and it only supports one of those. It won't prove a problem as it seems the SGS2 is going to be extremely popular and all game devs will support it eventually (most do already)
boostedb16b said:
I surely was waiting for an idiot response like yours, i9100. Just face it that Samsung made a mistake with the Mali... and to be quite honest with you, the SGS2 was rushed out to take the lead on the market before the Sensation, and the CPU was overclocked to be more competitive with the Sensation... and the rush has shown what I expected: lots of unknown problems in the phone, and then returns for phones with faulty screens and so on. So don't be a troll, and state facts rather than just talking because you have the device.
Wow, that was a slight overreaction! Do you work for HTC or something? Because that was a very defensive reply.
i9100 actually speaks some truth. There is a very informative topic over on the SGS2 forum about this, which I suggest you guys read before making any more incorrect statements.
Still, looking forward to seeing this phone released to see if it's as impressive as the S2 is. I'm hoping it will be.
boostedb16b said:
I surely was waiting for an idiot response like yours, i9100. Just face it that Samsung made a mistake with the Mali... and to be quite honest with you, the SGS2 was rushed out to take the lead on the market before the Sensation, and the CPU was overclocked to be more competitive with the Sensation... and the rush has shown what I expected: lots of unknown problems in the phone, and then returns for phones with faulty screens and so on. So don't be a troll, and state facts rather than just talking because you have the device.
I wouldn't be surprised if that was the case. To Samsung, that's business as usual.
I'm biased and this is not the Samsung home turf so I'll just keep to correcting facts. A little flaming is fine and not unexpected, but I won't go there. I'm enjoying my phone, and you should enjoy yours ;-)
In OpenGL ES 2.0 there is only one standard texture compression format - ETC. It's the only one you can rely on in all conformant GPUs. Others like ATITC, PVRTC and S3TC/DXTC are proprietary formats not suitable if you want your app to run on new devices.
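If anyone wants to check what their own device actually exposes, something like this should do it from inside a GLSurfaceView.Renderer once a GL context is current. It's just a sketch using the standard Android GLES20/ETC1Util APIs, nothing Mali-specific, and the extension strings it looks for are the usual vendor ones.
Code:
import android.opengl.ETC1Util;
import android.opengl.GLES20;
import android.util.Log;

public class CompressionCheck {
    // Call from GLSurfaceView.Renderer.onSurfaceCreated(), i.e. with a GL context current
    public static void logTextureCompressionSupport() {
        // ETC1 is the one format every OpenGL ES 2.0 device is expected to handle
        Log.i("GPU", "ETC1 supported: " + ETC1Util.isETC1Supported());

        // Proprietary formats show up as extensions (PVRTC, ATITC, S3TC, ...)
        String ext = GLES20.glGetString(GLES20.GL_EXTENSIONS);
        Log.i("GPU", "PVRTC: " + ext.contains("GL_IMG_texture_compression_pvrtc"));
        Log.i("GPU", "ATITC: " + (ext.contains("GL_ATI_texture_compression_atitc")
                || ext.contains("GL_AMD_compressed_ATC_texture")));
        Log.i("GPU", "S3TC:  " + ext.contains("GL_EXT_texture_compression_s3tc"));
    }
}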
At the moment, since my HD2 has been retired, I opted to buy a Samsung Galaxy S 4G; worst mistake I have made in a long time... bought a Sidekick 4G for my gf and that phone was another example of how poor quality control is for Samsung, or how they rush to get a head start in the market... the phone is riddled with problems.
Currykiev said:
Wow, that was a slight overreaction! Do you work for HTC or something? Because that was a very defensive reply.
i9100 actually speaks some truth. There is a very informative topic over on the SGS2 forum about this, which I suggest you guys read before making any more incorrect statements.
Still, looking forward to seeing this phone released to see if it's as impressive as the S2 is. I'm hoping it will be.
Lol, it's not as bad as what I have read in the SGS2 forum, where people are reporting their problems and trying to get legitimate help and are being bashed and called trolls because their phone honestly has a problem.

Electopia Benchmark

For giggles, can one of you that's stock run the Electopia benchmark? There's been some interesting results and it would be cool to see how another dual-core phone with a different CPU/GPU performs. The Sensation folks are obviously not amused.
Sensation
800x480
Average FPS: 23.65
Time: 60
Number of Frames: 1419
Trianglecount: 48976
Peak Trianglecount: 68154
960x540
Average FPS: 19.90
Time: 60.01
Number of Frames: 1194
Trianglecount: 49415
Peak Trianglecount: 67076
SGS2
Average FPS: 37.58
Time: 60.01
Number of frames: 2255
Trianglecount: 48633
Peak trianglecount: 68860
DHD
Average FPS: 23.36
Time: 60.03
Number of frames: 1402
Trianglecount: 48835
Peak trianglecount: 67628
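Quick sanity check on those numbers: the average FPS Electopia reports is just frame count divided by run time, and the figures above are internally consistent.
Code:
public class ElectopiaCheck {
    public static void main(String[] args) {
        // {frames, seconds} taken from the results quoted above
        double[][] runs = {
            {1419, 60.00},  // Sensation 800x480
            {1194, 60.01},  // Sensation 960x540
            {2255, 60.01},  // SGS2
            {1402, 60.03},  // DHD
        };
        for (double[] r : runs) {
            System.out.printf("%.0f frames / %.2f s = %.2f fps%n",
                    r[0], r[1], r[0] / r[1]);
        }
    }
}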
Even the Desire HD blew away my G2x on this benchmark but it could be the custom ROM... I'll switch back to AOSP and try it again.
16FPS
Can't be right, my Thunderbolt smoked my g2x
26 FPS Thunderbolt vs 16FPS G2x
Something is very wrong with those numbers if this is supposed to be measuring OpenGL ES 2.0.
I have stock, and with a really hard time getting it to respond to touch input and with the sound off, here are the scores:
Average FPS - 15.56
Time - 60.04
Number of Frames - 934
Trianglecount - 48928
Peak Trianglecount - 68838
This was a super buggy program on the G2x. I think it is definitely not optimized for dual core or at least the Tegra 2 architecture.
Sent from my T-Mobile G2x using XDA App
There's no way the G2X would be lower than the Sensation. The test probably isn't dual-core optimized and the new CPU/GPUs are throwing it off. Thanks for trying though.
BarryH_GEG said:
There's no way the G2X would be lower than the Sensation. The test probably isn't dual-core optimized and the new CPU/GPUs are throwing it off. Thanks for trying though.
And there is no way the G2x could be lower than a single-core Adreno 205 Thunderbolt.
15.57 FPS for me running stock/not rooted. Like previously mentioned, it was very unresponsive to touch.
Badly designed benchmark programs are bad.
diablos991 said:
Badly designed benchmark programs are bad.
The sad part is that this isn't just a benchmark; it's a game first and foremost.
And yeah, I can't get past 16 FPS at stock speed OR at 1.5 GHz, so I think there are definitely coding issues, as Nenamark using Trinity on the Bionic scores 72 FPS. I think my Inspire (Adreno 205) got about 35?
+1
Let's all buy phones with top benchmarks!!!!!!
Better yet, let's all get iPhones.....
Fu*k a benchmark
Sent from my LG-P999 using XDA Premium App
But if you really stop and think about it, if each of the different CPU/GPUs behaves differently running the same software because of proprietary hardware performance tweaks, we'll all be screwed in the long run. No matter how Electopia was written, one would think it would behave the same way on different CPU/GPU combinations, even if it wasn't dual-core optimized. So developers are either going to have to start testing on every CPU/GPU combo to release a single version of an app, or release different apps for different CPU/GPUs. It's way too early to tell, as dual-core and 2.3ish isn't that common now, but it should be interesting watching software performance and development play out in the future.
BarryH_GEG said:
But if you really stop and think about it, if each of the different CPU/GPUs behaves differently running the same software because of proprietary hardware performance tweaks, we'll all be screwed in the long run. No matter how Electopia was written, one would think it would behave the same way on different CPU/GPU combinations, even if it wasn't dual-core optimized. So developers are either going to have to start testing on every CPU/GPU combo to release a single version of an app, or release different apps for different CPU/GPUs. It's way too early to tell, as dual-core and 2.3ish isn't that common now, but it should be interesting watching software performance and development play out in the future.
It's piss-poor coding on the app developer's part, plain and simple. While there are Tegra 2-specific instructions that an app developer can use in their application, there are not any mobile OpenGL ES 2.0 instructions the Tegra 2 doesn't support, as far as I am aware.
If you want a good challenge for the chip, download an3dbench XL from Market. I just scored 32640 and that's with a bunch of background apps.
Isn't this a Windows Mobile port (I had it on my HD2 running WM6.5)? So how does it provide an accurate representation of gaming on an Android device? Since it is the only bench my G2x has scored poorly on and (more importantly) real-world gaming is spectacular on this thing, I'm going to say it doesn't. I wouldn't put a whole lot of stock in this one...
Yeah agreed. I just ran it on the Nexus/CM7 AOSP hybrid and it still was only 16.06 while I got almost 40,000 on an3dbenchXL which put me like 30-something out of 7000ish results.
This application was influenced by Qualcomm specifically to run poorly on Tegra 2 devices. They messed with the shaders so everything is rendered at a weird angle. If you change the code to run with a normal approach, you see the same results on Qualcomm chips but also 3-5x perf on NVIDIA chips
Why would you say this benchmark was influenced? If you have the sources, please share so we can all look. And how can you say BenchXL is a good benchmark? I have run the BenchXL benchmark and seen inconsistent results on many forums... it is very unreliable, not a good benchmark. At least Electopia gives consistent, reliable results. I would go with Electopia as a GPU benchmark.
I have an Xperia Play myself, which performs superbly for gaming: awesome graphics, I love the games on it, awesome device. My wife has a G2x, which is equally good for gaming (though she just uses it for texting, LOL)...
I think for gaming both the Xperia Play and the G2x are good...
I'd hardly say it's biased to any one specific manufacturer based on these benchmarks:
More so I ran it myself with the latest firmware at stock frequencies (SGS2 btw ) and got:
Average FPS: 51.44
Time: 60.02
Number of frames: 3087
Trianglecount: 48827
Peak trianglecount: 68868
Quite a funny difference compared to any other device, I might say.
It's not biased towards any manufacturer, it is biased against NVIDIA's ULP GeForce GPUs in Tegra 2 SOCs.
Changes to the code cause increases in performance on Tegra 2 devices, while results on other platforms do not change.
In general, there is never a single, all-encompassing GPU benchmark to accurately compare devices. It all depends on the code, and how it interacts with the specific hardware of the device.
images |DOT| anandtech |DOT| com /graphs/graph4177/35412.png
Source: Anandtech Samsung Galaxy S2 review (I can't post links )
http://images.anandtech.com/graphs/graph4177/35412.png
That AnandTech review is badly outdated, like I said; the SGS2 gets for example 16fps there in February. I myself get 58fps today.
And I don't think it's biased against Tegra. Tegra performs pretty much where it should be considering its age, and corresponds to its specs.
And just to dismiss your point that Tegra gets a different codepath, I ran the Electopia bench again via Chainfire3D using the NVIDIA GL wrapper plugin emulating said device, and I'm still getting the same FPS.
If what you're saying is that it's not utilizing Tegra's full potential through proprietary Nvidia OpenGL extensions, you might as well pack the bag and leave, because then that logic would apply to pretty much every graphics core, since the benchmark isn't optimized for any of them. What we see here in these benchmarks is a plain, simple ES 2.0 codepath which all devices should support, so we can do an oranges-to-oranges comparison. It's also one of the most fragment-shader-dependent benchmarks out there at the moment, and less geometry and texture bound, and that's why it runs so badly on pretty much every chip, since they don't get this type of workload in other benchmarks. This is also why the Mali gets such high FPS, as that's where the quad-GPU setup in the Exynos can shine.
AndreiLux said:
I'd hardly say it's biased to any one specific manufacturer based on these benchmarks:
More so I ran it myself with the latest firmware at stock frequencies (SGS2 btw ) and got:
Average FPS: 51.44
Time: 60.02
Number of frames: 3087
Trianglecount: 48827
Peak trianglecount: 68868
Quite a funny difference compared to any other device, I might say.
It's clear the Mali-400 in the SGS2 is the most powerful GPU right now. There is a 60 fps limit on the Galaxy S2, so you'll need a demanding benchmark. You can also see that in Nenamark2: SGS2 = 47 fps, G2X = 28 fps, SGS = 24 fps.

TEGRA 4 - 1st possible GLBenchmark!!!!!!!! - READ ON

Who has been excited by the Tegra 4 rumours? Last night's Nvidia CES announcement was good, but what we really want are cold, hard BENCHMARKS.
I found an interesting mention of a Tegra T114 SoC, which I'd never heard of, on a Linux kernel site. I got really interested when it stated that the SoC is based on the ARM Cortex-A15 MP, so it must be Tegra 4. I checked the background of the person who posted the kernel patch; he is a senior Nvidia kernel engineer based in Finland.
https://lkml.org/lkml/2012/12/20/99
"This patchset adds initial support for the NVIDIA's new Tegra 114
SoC (T114) based on the ARM Cortex-A15 MP. It has the minimal support
to allow the kernel to boot up into shell console. This can be used as
a basis for adding other device drivers for this SoC. Currently there
are 2 evaluation boards available, "Dalmore" and "Pluto"."
On the off chance, I decided to search www.glbenchmark.com for the two board names, Dalmore (a tasty whisky!) and Pluto (planet, Greek god and cartoon dog!). Pluto returned nothing, but Dalmore returned a device called 'Dalmore Dalmore' that was posted on 3rd January 2013. However the OP had already deleted the results, but thanks to Google Cache I found them.
RESULTS
GL_VENDOR NVIDIA Corporation
GL_VERSION OpenGL ES 2.0 17.01235
GL_RENDERER NVIDIA Tegra
From the system spec, it runs Android 4.2.1, with a minimum frequency of 51 MHz and a maximum of 1836 MHz.
Nvidia DALMORE
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p) : 32.6 fps
iPad 4
GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p): 49.6 fps
CONCLUSION
Anandtech has posted that Tegra 4 doesn't use unified shaders, so it's not based on Kepler. I reckon that if Nvidia had a brand new GPU they would have shouted about it at CES; the results I've found indicate that Tegra 4 is between 1 and 3 times faster than Tegra 3.
BUT, this is not 100% guaranteed to be a Tegra 4 system, although the evidence is strong that it is a T4 development board. If this is correct, we have to figure that it is running beta drivers; the Nexus 10 is ~10% faster than the Arndale dev board with the same Exynos 5250 SoC. Even if Tegra 4 gets better drivers, it seems like the SGX 554 MP4 in the A6X is still the faster GPU, with Tegra 4 and the Mali T604 in an almost equal 2nd place. Nvidia has said that T4 is faster than the A6X, but the devil is in the detail; in CPU benchmarks I can see that being true, but not for graphics.
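Taking the two offscreen scores above at face value, the gap is easy to put a number on (nothing beyond the figures quoted in this post):
Code:
public class EgyptHdGap {
    public static void main(String[] args) {
        double dalmoreFps = 32.6; // presumed Tegra 4 dev board, Egypt HD 1080p offscreen
        double ipad4Fps   = 49.6; // iPad 4 (A6X), same test

        // Relative performance in this one benchmark
        System.out.printf("A6X lead: %.0f%%%n", 100.0 * (ipad4Fps / dalmoreFps - 1));
    }
}
That works out to roughly a 50% lead for the A6X in this single test, assuming the cached result really is a T4 board.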
UPDATE - Just to add to the feeling that this is legit, the GLBenchmark System section lists the "android.os.Build.USER" as buildbrain. According to an Nvidia job posting, "Buildbrain is a mission-critical, multi-tier distributed computing system that performs mobile builds and automated tests each day, enabling NVIDIA's high performance development teams across the globe to develop and deliver NVIDIA's mobile product line".
http://jobsearch.naukri.com/job-lis...INEER-Nvidia-Corporation--2-to-4-130812500024
I posted the webcache links to the GLBenchmark pages below; if they disappear from cache, I've saved a copy of the webpages, which I can upload. Enjoy.
GL BENCHMARK - High Level
http://webcache.googleusercontent.c...p?D=Dalmore+Dalmore+&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - Low Level
http://webcache.googleusercontent.c...e&testgroup=lowlevel&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - GL CONFIG
http://webcache.googleusercontent.c...Dalmore&testgroup=gl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - EGL CONFIG
http://webcache.googleusercontent.c...almore&testgroup=egl&cd=1&hl=en&ct=clnk&gl=uk
GL BENCHMARK - SYSTEM
http://webcache.googleusercontent.c...ore&testgroup=system&cd=1&hl=en&ct=clnk&gl=uk
OFFSCREEN RESULTS
http://webcache.googleusercontent.c...enchmark.com+dalmore&cd=4&hl=en&ct=clnk&gl=uk
http://www.anandtech.com/show/6550/...00-5th-core-is-a15-28nm-hpm-ue-category-3-lte
Is there any GPU that could outperform the iPad 4 before the iPad 5 comes out? The Adreno 320, Mali-T604 and now Tegra 4 aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been announced too; the only thing left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when it would release, and not sure it will be better).
Looks like Apple will hold the crown in the future too.
i9100g user said:
Is there any GPU that could outperform the iPad 4 before the iPad 5 comes out? The Adreno 320, Mali-T604 and now Tegra 4 aren't near it. Qualcomm won't release anything till Q4 I guess, and Tegra 4 has been announced too; the only thing left, I guess, is the Mali-T658 coming with the Exynos 5450 (doubtful when it would release, and not sure it will be better).
Looks like Apple will hold the crown in the future too.
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and a CPU benchmark was run in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below; this explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher; even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, Apple fans will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5's A6 is 96mm2 and the A6X is 123mm2, so Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as the ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until an Android or WP8 maker or somebody can achieve Apple's margins, they will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad :crying:
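To illustrate the throttling AnandTech observed, here's a toy model of a power-budget governor. The 4W cap matches the article, but the frequency/power curve and the GPU load figure are rough guesses of mine, not Samsung's actual governor tables; it just shows why a combined CPU+GPU load forces the CPU way down.
Code:
public class ThrottleSketch {
    // Rough, made-up CPU power draw at a given clock (crude quadratic model)
    static double cpuPowerWatts(double ghz) {
        return 1.4 * ghz * ghz;            // ~4.0 W at 1.7 GHz, ~0.9 W at 0.8 GHz
    }

    public static void main(String[] args) {
        double socBudget = 4.0;            // SoC budget in watts, per the article
        double gpuLoadWatts = 3.1;         // assumed draw of a GPU-heavy game

        // Governor: drop the CPU clock until CPU + GPU fit inside the budget
        double cpuGhz = 1.7;
        while (cpuGhz > 0.2 && cpuPowerWatts(cpuGhz) + gpuLoadWatts > socBudget) {
            cpuGhz -= 0.1;
        }
        System.out.printf("GPU busy: CPU settles at %.1f GHz (%.1f W CPU + %.1f W GPU)%n",
                cpuGhz, cpuPowerWatts(cpuGhz), gpuLoadWatts);
    }
}
With these made-up numbers the CPU ends up around 0.8 GHz, which is in the same ballpark as the 1.7 GHz to 800 MHz drop described above.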
32 fps is a no-go... let's hope it's not final.
hamdir said:
32 fps is a no-go... let's hope it's not final.
Click to expand...
Click to collapse
It needs to improve, but it would be OK for a new Nexus 7.
Still fast enough for me; I don't game a lot on my Nexus 7.
I know I'm talking about phones here... but the iPhone 5 GPU and Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
italia0101 said:
I know I'm talking about phones here... but the iPhone 5 GPU and Adreno 320 are very closely matched.
Sent from my Nexus 4 using Tapatalk 2
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
adityak28 said:
From what I remember the iPhone 5 and the new iPad wiped the floor with Nexus 4 and 10. The ST-Ericsson Nova A9600 is likely to have a PowerVR Rogue GPU. Just can't wait!!
That isn't true; check GLBenchmark. In the offscreen test the iPhone scored 91 and the Nexus 4 scored 88... that isn't wiping the floor with it.
Sent from my Nexus 10 using Tapatalk HD
It's interesting how even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
ian1 said:
It's interesting how even though Nvidia chips aren't the best, we still get the best game graphics because of superior optimization through Tegra Zone. Not even the A6X is as fully optimized.
Sent from my SAMSUNG-SGH-I727 using xda premium
What sort of 'optimisation' do you mean? Unoptimised games lag, and that's a big letdown. Tegra effects can also be used on other phones with Chainfire3D; I use it, and Tegra games work with the effects and without lag, and I don't have a Tegra device.
With a Tegra device I would mostly be restricted to optimised games.
The graphics performance of NVIDIA SoCs has always been disappointing, sadly for the world's dominant GPU provider.
In the first Tegra 2, the GPU was a little bit better than the SGX540 of the Galaxy S in benchmarks, but the SoC lacked NEON support.
In the second one, Tegra 3, the GPU is nearly the same as the old Mali-400 MP4 in the Galaxy S2 / original Note.
And now it's better, but still nothing special, and it will be outperformed soon (Adreno 330 and next-gen Mali).
The strongest PowerVR GPUs are always the best, but sadly they are exclusive to Apple (the SGX543, and maybe the SGX554 also; only Sony, who has a cross-licensing deal with Apple, has it, in the PS Vita and the PS Vita only).
Tegra optimization porting no longer works using Chainfire; that is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to work on other devices; equally, Chainfire is now outdated and no longer updated.
Now about PowerVR: they are only better in a real multi-core configuration, which is only used by Apple and Sony's Vita, eating a large die area, i.e. actual multiple cores each with their own sub-cores/shaders. If Tegra were used as a real multi-core design it would destroy all.
Finally, this is really funny, all this doom and gloom because of an early, discarded development board benchmark. I don't mean to take away from Turbo's thunder and his find, but truly it's ridiculous the amount of negativity it is collecting before any type of final device benchmarks.
The Adreno 220 doubled in performance after the ICS update on the Sensation.
T3 doubled the speed of the T2 GPU with only 50% more shaders, so how on earth do you believe only 2x the T3 scores with 600% more shaders!!
Do you have any idea how miserably the PS3 performed in its early days? Even new desktop GeForces perform much less than expected until the drivers are updated.
Enough with the FUD! It seems this board is full of it nowadays, with so little reasoning...
For goodness sake, this isn't final hardware; anything could change. Hung2900 knows nothing, what he stated isn't true. Samsung has licensed PowerVR, it isn't just stuck to Apple; it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone against a Tegra 4 which will arrive in a phone. If you check the OP link, the benchmark was posted on the 3rd of January with different results (18 fps then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make better games for Android compared to the 'geeks' *cough* who prefer benchmark results; what's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effect games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 only being 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory rather than the single channel on Tegra 2/3.
Turbotab said:
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and a CPU benchmark was run in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below; this explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher; even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5's A6 is 96mm2 and the A6X is 123mm2, so Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as the ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until an Android or WP8 maker or somebody can achieve Apple's margins, they will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad :crying:
Well said mate!
I can understand what you feel; nowadays Android players like Samsung and Nvidia are focusing more on the CPU than the GPU.
If they don't stop soon and keep using this strategy, they will fail.
The GPU will become the bottleneck and you will not be able to use the CPU at its full potential (at least when gaming).
I have a Galaxy S2 with the Exynos 4 at 1.2 GHz and the Mali GPU overclocked to 400 MHz.
In my analysis, most modern games like MC4 and NFS:MW aren't running at 60 FPS at all; that's because the GPU always has a 100% workload while the CPU is relaxing, outputting only 50-70% of its total workload.
I know some games aren't optimised for all Android devices as opposed to Apple devices, but still, even high-end Android devices have a slower GPU (than the iPad 4 at least).
AFAIK, the Galaxy S IV is likely to pack a T-604 with some tweaks instead of the mighty T-658, which is still slower than the iPad 4.
Turbotab said:
There was a great article on AnandTech that tested the power consumption of the Nexus 10's Exynos 5250 SoC. It showed that the CPU and GPU each had a TDP of 4W, making a theoretical SoC TDP of 8W. However, when the GPU was being stressed by running a game and a CPU benchmark was run in the background, the SoC quickly went up to 8W, but the CPU was quickly throttled from 1.7 GHz to just 800 MHz as the system tried to keep everything at 4W or below; this explains why the Nexus 10 didn't benchmark as well as we wished.
Back to the 5450, which should beat the A6X. The trouble is it has double the CPU & GPU cores of the 5250 and is clocked higher; even on a more advanced 28nm process, which will lower power consumption, I feel that system will often be throttled because of power and maybe heat concerns, so it looks amazing on paper but may disappoint in reality, and a 5450 in a smartphone is going to suffer even more.
So why does Apple have an advantage? Well, basically money. For a start, iSheep will pay more for their devices, so Apple can afford to design a big SoC and big batteries that may not be profitable for other companies. Tegra 4 is listed as an 80mm2 chip, the iPhone 5's A6 is 96mm2 and the A6X is 123mm2, so Apple can pack in more transistors and reap the GPU performance lead. Also, their chosen graphics supplier, Imagination Technologies, has excellent products; PowerVR Rogue will only increase Apple's GPU lead. They now have their own chip design team, and the benefit for them has been that their Swift core is almost as powerful as the ARM A15 but seems less power hungry; anyway, Apple seems happy running slower CPUs compared to Android. Until an Android or WP8 maker or somebody can achieve Apple's margins, they will be able to 'buy' their way to GPU domination, and as an Android fan it makes me sad :crying:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
MrPhilo said:
For goodness sake, this isn't final hardware; anything could change. Hung2900 knows nothing, what he stated isn't true. Samsung has licensed PowerVR, it isn't just stuck to Apple; it's just that Samsung prefers using ARM's GPU solution. Another thing I dislike is how everyone is comparing a GPU in the iPad 4 (SGX554MP4) that will NEVER arrive in a phone against a Tegra 4 which will arrive in a phone. If you check the OP link, the benchmark was posted on the 3rd of January with different results (18 fps then 33 fps), so there is a chance it'll rival the iPad 4. I love Tegra, as Nvidia is pushing developers to make better games for Android compared to the 'geeks' *cough* who prefer benchmark results; what's the point of having a powerful GPU if the OEM isn't pushing developers to create enhanced-effect games for their chip?
Hamdir is correct about the GPUs: if Tegra 3 was around 50-80% faster than Tegra 2 with just 4 more cores, I can't really imagine Tegra 4 only being 2x faster than Tegra 3. Plus it's 28nm (at around 80mm2, just a bit bigger than Tegra 3 and smaller than the A6's 90mm2), along with dual-channel memory rather than the single channel on Tegra 2/3.
Firstly, please keep it civil; don't go around saying that people know nothing, people's posts always speak volumes. Also, calling people geeks, on XDA is that even an insult? Next you'll be asking what I deadlift :laugh:
My OP was done in the spirit of technical curiosity, and to counter the typical unrealistic expectations of a new product on mainstream sites, e.g. Nvidia will use Kepler tech (which was false), OMG Kepler is like a GTX 680, Tegra 4 will own the world. People forget that we are still talking about a device that can only use a few watts and must be passively cooled, not a 200+ watt, dual-fan GPU, even though they both now have to drive similar resolutions, which is mental.
I both agree and disagree with your view on Nvidia's developer relationships. THD games do look nice; I compared Infinity Blade 2 on iOS vs Dead Trigger 2 on YouTube, and Dead Trigger 2 just looked richer, with more particle & physics effects, although Infinity Blade looked sharper at the iPad 4's native resolution, one of the few titles to use the A6X's GPU fully. The downside to this relationship is the further fragmentation of the Android ecosystem, as Chainfire's app showed most of the extra effects can run on non-Tegra devices.
Now, a 6 times increase in shaders does not automatically mean that games / benchmarks will scale in linear fashion, as other factors such as TMU / ROP throughput can bottleneck performance. Nvidia's Technical Marketing Manager, when interviewed at CES, said that the overall improvement in games / benchmarks will be around 3 to 4 times T3. Ultimately I hope to see Tegra 4 in a new Nexus 7, and if these benchmarks are proved accurate, it wouldn't stop me buying. Overall, including the CPU, it would be a massive upgrade over the current N7, all in the space of a year.
At 50 seconds onwards.
https://www.youtube.com/watch?v=iC7A5AmTPi0
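You can put rough numbers on that non-linear scaling with an Amdahl's-law style estimate: only the shader-limited part of the frame gets 6x faster, the rest doesn't move. The 80% shader-bound split below is an assumption for illustration, not a measured figure, but it lands right around Nvidia's own 3-4x claim.
Code:
public class ShaderScalingSketch {
    public static void main(String[] args) {
        // Assumed fraction of frame time spent in shader-limited work (illustrative)
        double shaderFraction = 0.8;
        double shaderSpeedup  = 6.0;   // 72 shader cores vs 12 on Tegra 3

        // Amdahl-style estimate: only the shader portion of the frame gets faster
        double overall = 1.0 / ((shaderFraction / shaderSpeedup) + (1.0 - shaderFraction));
        System.out.printf("Overall speedup with %.0f%% shader-bound frames: %.1fx%n",
                shaderFraction * 100, overall);
    }
}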
iOSecure said:
Typical "isheep" reference, unnecessary.
Why does apple have the advantage? Maybe because there semiconductor team is talented and can tie the A6X+PowerVR GPU efficiently. NIVIDA should have focused more on GPU in my opinion as the CPU was already good enough. With these tablets pushing excess of 250+ppi the graphics processor will play a huge role. They put 72 cores in there processor. Excellent. Will the chip ever be optimized to full potential? No. So again they demonstrated a product that sounds good on paper but real world performance might be a different story.
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department, many of its users less so.
hamdir said:
Tegra optimization porting no longer works using Chainfire; that is now a myth.
Did you manage to try Shadowgun THD, Zombie Driver or Horn? The answer is no. Games that use the T3 SDK for PhysX and other CPU graphics work cannot be forced to work on other devices; equally, Chainfire is now outdated and no longer updated.
Looks like they haven't updated Chainfire3D for a while; as a result only T3 games don't work, but others do: Riptide GP, Dead Trigger etc. It's not a myth, but it is outdated and only works with ICS and Tegra 2 compatible games. I think I might have been unfortunate too, but some Gameloft games lagged on the Tegra device that I had, though root solved it to an extent.
I am not saying one thing is superior to another, just relating my personal experience; I might be wrong, I may not be.
Tbh I think benchmarks don't matter much unless you see some difference in real-world usage, and I had that problem with Tegra in my experience.
But we will have to see if the final version is able to push it above the Mali T604 and, more importantly, the SGX544.
Turbotab said:
Sorry Steve, this is an Android forum, or were you too busy buffing the scratches out of your iPhone 5 to notice? I have full respect for the talents of Apple's engineers & marketing department, many of its users less so.
No, I actually own a Nexus 4 and an iPad Mini, so I'm pretty neutral in Google's/Apple's ecosystems, and I'm not wiping any scratches off my devices.

PCMark: Note 3 outperforms Note 4

See benchmark details here
Top scores....
Note 3: 5130
Note 4: 4942
Duh...
Quad-core 1.3 GHz Cortex-A53 & Quad-core 1.9 GHz Cortex-A57 (SM-N910C)
Quad-core 1.3 GHz Cortex-A7 & Quad-core 1.9 GHz Cortex-A15 (N9000)
The Exynos CPUs in the N3 and N4 have exactly the same speed... And yet the N9005 only has a 1920x1080 screen, whereas the Note 4 has to render 2560x1440.
Thank you for proving why I absolutely hate Exynos.
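For context on how much extra work that resolution difference is, here's a quick calculation using just the two panel resolutions:
Code:
public class PixelBudget {
    public static void main(String[] args) {
        long note3Pixels = 1920L * 1080;  // Note 3 panel
        long note4Pixels = 2560L * 1440;  // Note 4 panel

        // The Note 4 pushes roughly 78% more pixels with a similarly clocked Exynos
        System.out.printf("Note 3: %,d px, Note 4: %,d px (%.2fx)%n",
                note3Pixels, note4Pixels, (double) note4Pixels / note3Pixels);
    }
}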
I'd like to know the Snapdragon variants' scores. Since the Note 4 does have a significantly more powerful Snapdragon CPU, and the Snapdragon model makes up 80% of the market, the Exynos is only for lower markets.
Quad-core 2.7 GHz Krait 450 (SM-N910S)
Quad-core 2.3 GHz Krait 400 (N9005)
But the Exynos in the Note 4 is pretty awesome already:
http://anandtech.com/show/8718/the-samsung-galaxy-note-4-exynos-review
Good things are to come with the one in the Galaxy S6.
If you would run the PCMark test yourselves and post the results, that would be great!!
Thanks
ShadowLea said:
the Exynos is only for lower markets.
This is kind of offensive! And I am a non-emotional guy who hates Exynos too :|
devilsdouble said:
This is kind of offensive! And I am a non-emotional guy who hates Exynos too :|
If a market sells less or requires less high-level hardware due to an older, less sophisticated network system, it's considered a lower market. The demand and proceeds are lower compared to the high-selling markets, thus the word lower.
That's not a personal attempt at insult, it's a corporate definition.
Until 4G was rolled out, the Netherlands was one of those lower markets. (Though, frankly, I still consider it as such..) In the days of the S3, every non-US country was considered a lower market.
(Besides, I'm a sociopath, I don't do emotional )
Marketing aside: Temasek's CM12 + arter97 kernel + data&cache partitions in f2fs.
The phone is superfast as hell, but the benchmark result was this:
Times are changing, for the worse and for the better; I know it makes no sense, but neither does Sammy.
They seem to be dropping Snapdragon, and with the 810 in sight (ignored too), Exynos is heading into a PR fight over overheating accusations. Being the sucky ones in performance and the best in sales (Samsung generally), they just made their phones even less open to the people. HOWEVER... they are dropping bloat too.
As I said, they are making no sense.
sirobelec said:
Marketing aside: Temasek's CM12 + arter97 kernel + data&cache partitions in f2fs.
The phone is superfast as hell, but the benchmark result was this:
Stock Note 3 (N900) seems to perform better.
PCMark for Android claims to......
Measure the performance and battery life of your Android smart phone and tablet using tests based on everyday tasks, not abstract algorithms.
ShadowLea said:
If a market sells less or requires less high-level hardware due to an older, less sophisticated network system, it's considered a lower market. The demand and proceeds are lower compared to the high-selling markets, thus the word lower.
That's not a personal attempt at insult, it's a corporate definition.
Until 4G was rolled out, the Netherlands was one of those lower markets. (Though, frankly, I still consider it as such..) In the days of the S3, every non-US country was considered a lower market.
(Besides, I'm a sociopath, I don't do emotional )
ShadowLea said:
Duh...
Quad-core 1.3 GHz Cortex-A53 & Quad-core 1.9 GHz Cortex-A57 (SM-N910C)
Quad-core 1.3 GHz Cortex-A7 & Quad-core 1.9 GHz Cortex-A15 (N9000)
The Exynos CPUs in the N3 and N4 have exactly the same speed... And yet the N9005 only has a 1920x1080 screen, whereas the Note 4 has to render 2560x1440.
Thank you for proving why I absolutely hate Exynos.
I'd like to know the Snapdragon variants. Since the Note 4 does have a significantly more powerful Snapdragon CPU, and the Snapdragon is the 80% of the market model, the Exynos is only for lower markets.
Quad-core 2.7 GHz Krait 450 (SM-N910S)
Quad-core 2.3 GHz Krait 400 (N9005)
Why don't you simply run the test yourself with the superior phone/network you have and let the results speak for themselves?
PCMark for android
4354 here UK note 3
If Samsung do end up dropping Qualcomm in their next generation of phones, my N9005 Note 3 will be my last Samsung for the foreseeable future. Exynos holds no interest for me, as its closed-source nature inevitably means little to no support for non-stock AOSP/CM ROMs. And the non-stock ROMs that are available are generally unstable and bug-ridden.
^ +100
We know the S6 is not going to have the S810; why wouldn't they follow the same path with the Note too?
The SM-N9005 is my last Samsung device; I am not going to drag myself into pain with Exynos.
New top score... 5130
Benchmark scores between flagship phones mean precisely jack s**t these days, they're little more than **** waving. Discernible features is what should be compared.
"Wow, my Android phone scored 200 more points than your Android phone! And please, let's ignore the fact it will make precisely zero difference in real world use!"
Beefheart said:
Benchmark scores between flagship phones mean precisely jack s**t these days, they're little more than **** waving. Discernible features is what should be compared.
"Wow, my Android phone scored 200 more points than your Android phone! And please, let's ignore the fact it will make precisely zero difference in real world use!"
Ignorance is bliss!!
The whole point of these tests is to show that most of the other benchmarks don't show a true picture of real-life use.
Why else would the Note 3 appear to perform better than the Note 4?
The PCMark webpage states the following...
PCMark for Android introduces a fresh approach to benchmarking smart phones and tablets. It measures the performance and battery life of the device as a complete unit rather than a set of isolated components. And its tests are based on common, everyday tasks instead of abstract algorithms.
Yeah, that completely changed my opinion.*
* may contain sarcasm.
Beefheart said:
Yeah, that completely changed my opinion.*
* may contain sarcasm.
Have it your way... at least I am actually investigating.
It's in the interest of the benchmark app developers for users to believe their offerings aren't pointless.
Beefheart said:
It's in the interest of the benchmark app developers for users to believe their offerings aren't pointless.
I agree with you on this... generally.
However, I found this particular benchmark interesting for the following reasons:
1. It proves software is the biggest bottleneck in Android phones, not hardware (Lollipop on the Note 3 beats KitKat on the Note 4).
2. It proved that my Note 3 performs better in everyday use than my Note 4 (this I have always known, but no benchmark showed it).
