“Several Android tablets running on Intel Clover Trail+ Atom processors broke cover at Computex Taiwan. Intel’s dual-core, 1.6GHz Atom Z2560 is fueling a Samsung Galaxy Tab 3 10.1 tablet, as well as Asus’s 6-inch Fonepad Note and 10-inch MeMO Pad FHD10 tablets, while Asus also unveiled a hybrid 11.6-inch Transformer Book Trio, combining an Android slate based on a 2GHz dual-core Atom Z2580 with a keyboard dock running Windows 8 on an Intel Haswell processor.”
I have a Core 2 Duo in my laptop that is pretty similar to these chips. The performance is crap unless there is an SSD.
I cannot agree. I’ve got a 17″ laptop with a Core 2 Duo on it and it runs both Windows and Linux just fine, no problems at all with the things people generally do with their devices. Booting Windows is a bit slow, Ubuntu boots faster, but once it’s up and running it’s just peachy. Granted, I haven’t tried Android x86 on it, but I doubt the difference would be anything but negligible.
Besides, an SSD only matters for load-times, not once the application is already running.
It depends on whether it is a ULV processor or not. Mine is clocked at about 1.4GHz and it was slow as molasses until I upgraded it to an SSD.
Edited 2013-06-05 07:14 UTC
Unless you are paging. In this case the SSD will help immensely.
BTW, I actually put an SSD in my old 98SE (Pentium MMX 233MHz) game system and the difference was incredible. SSDs make old systems more usable. Even really, really old ones.
Nah, WereCatf is right. The SSD only helps with program loading and battery life.
Your laptop likely had a 4200 RPM drive; those have always been slow as hell, so the CPU would sit around idling while the painfully slow HDD located and loaded the data.
You’d also have seen a sizable performance increase by swapping in a 7200 RPM drive. Not as much as with the SSD, but still quite noticeable in human time.
What happened to the conspiracy theorists who came out of the woodwork and said that Clover Trail being Windows 8 exclusive initially was the end of the world?
Shhhh, they were hoping you wouldn’t mention that!!!
Well, since the technology isn’t Windows 8-only, it won’t be a problem. Besides, while I do remember a fuss, I would hardly call it the end of the world. Intel is too heavily invested in Linux. Now, if Microsoft tried that kind of move, I would worry.
Let’s look back:
Thom:
“As such, I would say this is nothing but a deal with Microsoft to make it harder for Android to run on this platform. While the lack of support won’t deter a single enthusiast (Linux will eventually run on this just fine), it will make it impossible for OEMs to build Clover Trail machines for Android.”
TechGeek (hey, that’s you!):
“Make no mistake about it, Microsoft is targeting Linux/Android with these decisions.”
Bram:
“Don’t worry: it will all pass. Both Microsoft and Intel are digging their own graves here. The last few spasms of two slowly dying former heavyweights.”
And I’m sure you remember we got the associated wall of text from Thom’s soapbox.
LOL.
When battery life figures are out, we’ll see.
Booting an OS is just a first step.
I suspect they’re holed up with the brain donors who predicted that Win2k would be a failure because “OMG 63,000 bugs!” And the people who predicted that XP would be a failure because of “OMG Activation!” And the people who predicted that Vista would be a failure because of “OMG Palladium!” (ad infinitum)
I seriously hope these Intel Androids fail. They will dramatically increase the fragmentation problem. 5% of apps don’t work? That is way too many.
Also, I’m bored with Intel and am hoping for the entire Intel/Microsoft ship to sink when Android hybrids eviscerate laptop sales.
This play by Intel is *high stakes*. They see where things are headed: Win8 is a failure, tablets/phones are taking over and could spread like wildfire to hybrids and all laptops within a couple years if MS doesn’t do something ASAP. If that happens, and Android remains 100% ARM, Intel could implode. I think Intel realizes this isn’t just a simple play to make more money, this is life or death for them.
(evil cackling…)
Edited 2013-06-05 01:20 UTC
This will not make Intel implode. Intel has paws into nearly every facet of hardware. Besides, I don’t want Intel to fail. I would like for them to lose a little market share, but not to fail. Having Intel around is good for competition. Competition in turn will push Intel, AMD, and all ARM manufacturers to step it up and put out better products.
Clearly you’re not part of the current generation of tech enthusiasts – if the iFanboys and GNU/freetards have taught us anything, it’s that the bigger picture is completely irrelevant now. Instead, all that matters is that your “team” wins and everyone else loses.
Healthy competition? Pffft, what an adorably quaint concept – lol, back to the retirement home, grandpa lol!
History is not on your side. Intel will likely be very successful, and for a time we’ll have more powerful, more power-efficient, more-inexpensive chips in our phones and tablets. That is only a good thing, right?
Give it a few years, and it might be 5% of apps don’t work on ARM-based laptops and tablets. Fragmentation has always been a side-effect of technological advancement. Besides, why would you want a monoculture in the tablet market? Monocultures hardly breed innovation.
Also, it is hardly life-or-death for Intel, as there are still a whole lot of buyers of laptops, desktops, workstations, servers, and super computers. This is not going to change as long as real work needs to be done.
Oh yeah? An Intel monopoly is definitely not a good thing.
You can see that desktop Haswell is barely faster than two-year-old Sandy Bridge, and sometimes slower than Ivy Bridge. Once again the motherboard socket is not compatible. A whopping 3% speedup is definitely worth that price.
The ARM universe has a long-forgotten thing called competition, while x86 has only one player (AMD is not even considered a viable player by Intel itself).
Wrong, AMD has always been Intel’s research and development firm. Though with the purchase of ATI, AMD has a lot of new tech it isn’t required to share with Intel.
Take a look at what AMD has cooked up for the Kaveri line of APUs and the next-gen consoles. AMD doesn’t have to have the best integer performance if they are putting an HD 7750-class GPU on the same die as their Steamroller cores and adding GDDR5 to the system memory pool. Apparently this allows you to have both DDR3 and GDDR5 sticks installed on the same motherboard, with the CPU and GPU able to use both simultaneously and share data with each other with no added latency, instead of having to pass everything through the PCIe bus and make multiple reads and writes to system memory and VRAM. The results are far faster GPGPU and physics calculations, and anything needing the FPU can just be redirected to the GPU.
Once the new consoles are out, the game devs aren’t going to be looking to optimize much at all for Nvidia hardware, since AMD is powering them all, even the lowly Wii U.
Even if Intel were to buy Nvidia tomorrow, they would be a minimum of 3 years out from having something comparable to Kaveri.
The updated Richland APUs are just there to keep a solid lead over Intel in the iGPU department and are slated to have a life of only 6 months.
Haswell’s Iris Pro 5200 integrated GPU outperforms the A10-5800K’s integrated HD 7660D at half the power.
i7-4950HQ has 128MB of EDRAM L4 cache on package.
GDDR5 memory doesn’t come in “sticks” – it is soldered onto the PCB. Also, you can’t use both at the same time.
Do you understand that consoles and PCs are different optimization targets?
Wonderful, but how does it stack up against the A10-6800K’s HD 8670D? Will it be on a chip that only costs $150, or only on chips priced so far up the ladder that anyone looking at them will go with a discrete card instead?
Look more at what is going on with Kaveri and the hUMA memory architecture. Remember that for the vast majority of people out there, the CPU has been more than good enough for years now; that’s why they buy based on price. They don’t know what a GPU is, but if word gets around that these new computers can game about as well as any console, AMD is golden.
Consoles and PCs may be different targets, but what you have to do to get your game to render fastest on the GCN architecture won’t change much between the two. Since a large portion of PC games are ported from the consoles, this will only give AMD that much more of an advantage, as the game devs will already be very familiar with the hardware’s quirks – without having to be paid off for that to happen, as you see with Nvidia’s “The Way It’s Meant To Be Played” titles.
Do you have a link to some review/benchmarks which actually show that? My Intel-loving buddy couldn’t find one; he says Haswell is generally faster across the board.
That is what happens when you go the C/C++ route and don’t produce binaries for all platforms in your APK.
Actually, if I remember correctly, Intel bundles their libhoudini library, which emulates the ARM(v7?) architecture. That library is why Android emulators, like AndroVM, can run ARM applications.
Isn’t C/C++ a “portable assembler”? Why is it so difficult to produce an “x86 build” as well as an “ARM build” of the same APK? Package size? Then produce two APKs, one per CPU. Look at MX Player: they ship plugins for ARM extensions as separate APKs, and nobody complains. If somebody “forgets” to build the x86 version, I can only see that person as inadequate, and they should consider a change of professional orientation. McDonald’s is always recruiting.
BTW, Intel provides a virtualization accelerator for the Android emulator on Windows, but not for AMD. Why? AMD does feature virtualization, so why the lack of a Windows driver, while the Linux version is virtualized?
Kochise
Edited 2013-06-05 07:17 UTC
APK size is really important, as it is one of the reasons people might uninstall APKs.
At this year’s Google I/O, reducing APK size was one of the tips given for improving sales, especially if you depend on what the user does within the app.
Many developers don’t build x86 APKs because there are almost no devices with an x86 CPU worth targeting. So why increase the APK size?
Blame the phone manufacturers for being skimpy with the onboard flash size. Phones should also have multiple microSD card slots, with one dedicated to loading programs from it.
There’s no reason your phone shouldn’t be capable of holding 128GB+ of stuff; it’d ease up on data usage for music and video if more could be stored locally, without having to decide between holding an album or a game.
Google Nexus, HTC One, …
Kochise
For an application that requires little CPU power, coding in Java targeting Dalvik should be enough. The JIT is good; most 2D games are OK with it.
When you need to address the GPU more specifically and/or fine-tune your build, the NDK becomes necessary, and thus native compilation. These “demanding” applications (3D games) already come with lots of data.
So I can understand that waste should be avoided, and it’s a good thing not to end up with gigabyte-sized applications with few features. But when installing a “demanding” application that requires native compilation, one should expect the size to be set accordingly.
Hence, having an APK weigh 5 MB more because of the embedded x86 build is a false problem. Especially since one can build two APKs, each targeting one CPU.
Kochise
You are forgetting two major use cases:
– Many developers will avoid touching Java as much as they can;
– C and C++ are one way to write portable native applications across devices.
Edit: typo
Edited 2013-06-05 09:32 UTC
It is a prerequisite on Android anyway, so what’s the point?
“Native” applications (if not forced to write applications in Objective-C).
Well, Java is more “portable” (write once, execute everywhere) as opposed to .NET (write in many languages – C#/VB/F#/… – execute on one runtime).
You can easily scramble C/C++ code with gazillions of frameworks/wrappers, rendering the final code not that “portable” anymore. Dependencies and compiler “optimizations” make things relatively cumbersome.
Kochise
Not if you are using native activities and frameworks that wrap JNI calls for you. There are plenty to choose from.
As there are JNI frameworks for Android, the same frameworks wrap Objective-C calls.
Unfortunately, Java’s portability story is dead on mobile when targeting multiple devices, because you cannot find proper VMs or native-code compilers for all of them.
There are a few open source ones available, like RoboVM and Aicas, but they are still too buggy for production code.
The only one would be Codename One, but you need to be comfortable on relying on them for your applications and have the budget for it.
You are better off with C, C++, .NET(Mono) or PhoneGap than with Java when targeting native mobile applications on a budget.
Instead of suing Google, Oracle should pay their engineers to integrate the native-code compiler from the Java Embedded SDK into the standard JDK, if they want Java outside the Android world to still have any meaning on mobiles.
“Integrating” C into Objective-C is as easy as it gets. Even easier than “integrating” C into C++.
It is very easy to wrap common C/C++ code in a very small Objective-C wrapper. Objective-C is only needed to integrate with the target system – and you always have to integrate with a target system. JNI is highly complex in comparison (not that it is hard).
The obvious solution would be to have multiple APKs for different architectures and have the Play Market silently choose the correct one for your platform. Even better if all the non-platform-specific stuff were in one APK and the binaries in platform-specific ones: when you tap Install in the Play Market, it would install the former first and the platform-specific one second – to the end user the change wouldn’t even be visible.
I would be seriously grateful for some source on that info.
I hope they succeed. Competition benefits consumers more than fragmentation hurts them.
If Intel succeeds you will get the same situation you have now between Intel and AMD. So no in this case it would be better if there were no Intel. It is much better to have a few smaller players without a monopoly.
I don’t think I’d describe ARM as a “smaller player”.
You have ARM, Qualcomm, Apple, AMD, and Nvidia, and then you have the cheap Chinese players. None of them has a monopoly, and as Intel showed, only 5% of Android apps need the ARM instruction set.
Calling ARM a big player compared to Intel is laughable.
What, you mean Intel responding to the AMD threat from years ago and pumping out iteration after iteration of frankly superb CPUs? Before AMD was a credible threat, we had the likes of the Pentium III struggling to break the 1GHz barrier, followed by the P4 clusterfuck.
Once AMD pulled Intel’s pants down, they rallied and produced great new CPUs starting with Core 2 (Core was basically an enhanced Pentium III).
With x86 and ARM battling over the tablet space, they’ll converge on high performance and low power designs. Without that competition, designs from either side will stagnate, as could be seen in the netbook space (Atom domination).
It’ll be interesting if or when ARM scales up in the performance stakes to start nipping at the heels of x86 on the laptop/desktop/server space. Interesting times ahead.
I hope they succeed. Intel has a much better history of building well-documented and often open standards, creating platforms where things like Linux and other independent systems thrive. The ARM vendors, for the most part, are a mess of proprietary ad-hoc solutions that are seldom documented and never interoperable.
The PC is a fantastic achievement. It started with IBM blessing us all by being careless with their IP, but it has continued under Intel’s stewardship with more and more standardized, documented, and freely available specifications for everything from buses to timers to boot procedures. I think it will be a great thing for us all if Intel brings that same approach to the mobile world, in the hope that the ARM vendors will realize this is a necessity if they want to compete.
edit: I think my use of the word “open” above is a bit controversial, but I do consider Intel’s approach with, for example, USB a fantastic boon for everyone. It is hardly a matter of proper “Free,” but it is still fantastic, and not something the ARM vendors appear to have any interest in.
Edited 2013-06-05 12:23 UTC
Though I haven’t programmed any Android applications, the platform should not be a problem, because almost all Android apps are written in Java and run inside the Dalvik VM.
Now, the apps written using the NDK, yes, must be ported to run on the new platform, but I really suspect the vast majority of apps written for Android do not use it; and I also suspect that people who write with the NDK use it to make their apps easily portable between Android and iOS, in which case their native code is already portable.
Edited 2013-06-05 12:31 UTC
That 5% is probably native code built using the NDK. Guess what? It is just a matter of building it again for x86 – not a single line of code change needed (unless you wrote the thing in asm)…
Not only asm – byte-manipulation tricks that depend on endianness, for example.
Hmm, but is endianness different between ARM and x86? I only knew that PowerPC and x86 differ.
You can configure endianness on ARM.
In almost all cases, bit endianness should not affect you when doing bit processing.
Edited 2013-06-05 17:46 UTC
I just hope it is not too late.
My only hope is that this will spur more Linux development on tablet devices. While Android is OK, I would be much happier with a full-blown Linux environment on my tablet.
Definitely. Open, reference platforms so others can build on top of them. Not just Linux.
The world’s first tablet with a power cord attached to it! 🙂
What about battery life?
Think about nVidia’s Tegra 3 or 4, and graphene batteries are coming.
Kochise
My Clover Trail Windows 8 tablet gets around 7-8 hours of battery life.
The mobile computing trend has created some familiar-looking dynamics here. On one side, a seemingly unstoppable giant with a dominant market share. On the other, a group of companies led by a third party that helps push them forward but allows variation and competition. Only this time Intel probably has a stronger product and enough technology to roll over ARM. Hopefully that doesn’t happen and we get good competition, with x86 ruling the high end and ARM the low.
As the owner of two Atom devices myself, I’m just hoping for a new Chromebook with battery life that can match my CR-48 without being so damn slow.
But the real question is: can people install a regular OS of their choice, or is it the usual locked-down stuff?
If they are locked down, give me one good reason to prefer them over ARM, please. Just one; I’m not asking too much…
That’s the point.
Usual locked down stuff.
I think the sole reason to go with Intel in that case lies in performance. These Atom processors outperform any ARM currently deployed in the mobile space, and by a significant margin.
Watt/MIPS ratio?
Kochise
Good ol’ raw performance in absolute numbers.
Watts per MIPS it will lose, I think. But perhaps (I didn’t look at comparisons for this specific processor) the phone/tablet manufacturers are starting to believe that the processing-power gain offsets the higher power drain.
Don’t lose sight of the fact that “mobile” devices should remain “mobile,” and thus shouldn’t drain the battery unexpectedly. For a web browser, an office application, or some light games (Dead Space), an ARM is enough.
For serious business and serious power, though, your Core i7-based laptop should handle it. Don’t think you’ll do everything on the tablet.
Kochise
I’m not sure why people hate on Intel. Maybe it’s just because they’re big. Maybe it’s because they beat the crap out of AMD in the past. Cyrix, anyone?
I see laughably uninformed comments about how an Intel CPU isn’t fast enough unless you “add an SSD.” I’m not sure how that isn’t just flagged down to oblivion for inaccuracy.
Intel hardware always has good driver support. It always has fully open source drivers for Linux, ahead of time.
Intel hardware always seems to just work.
Now, tell me the same about the various ARM vendors, or even AMD. You can’t. They don’t provide open source drivers. Even the Mali GPU’s open source driver is not a proper one.
I don’t like big companies any more than anybody else, but I, for one, wouldn’t be all that unhappy if Intel took a part of the mobile CPU/GPU business. Maybe that would wake up the other vendors so that they provide me with the same level of confidence regarding the performance, quality, and reliability of both hardware and software. And, why not, make their actual main Linux drivers open source, all the time.
Edited 2013-06-06 05:14 UTC
The matter is more complicated. In the early ’90s you had AMD making faster processors than Intel (40 vs. 33 MHz), but with the 486DX Intel took the lead again thanks to its good FPU for 3D-intensive games such as Quake.
The Wintel couple took some really crappy steps to prevent other chip manufacturers from entering the market and progressing. You mention AMD and Cyrix (later bought by VIA, as was Centaur), but there were others.
Intel tries to shove its own HD Graphics line of “GPUs” down desktop consumers’ throats instead of opening this – NOT core – business to more legitimate companies. The HD Graphics line is not really 3D capable, while less powerful ATI chips are.
Intel has fairly nice R&D (with their amount of cash in the bank, they can afford it), but AMD made really impressive steps forward with the K6-III, the Athlon, x86-64, HyperTransport, and the APU (integrating CPU and GPU), and deserves some fame as well.
The x86 architecture is probably one of the worst ever made – the whole segmented-memory model, the non-orthogonal ISA, the register banking between the FPU and MMX, etc. – while more robust and elegant designs (SH, MIPS) should have won on pure technical merit.
Sure, Intel has opened its specs; if they hadn’t, nobody might have used their chips.
Kochise
Please people, stop confusing ISA with microarchitecture.
I’m speaking about the ISA – the set of assembly instructions available to the developer/compiler. As for the underlying microarchitecture – OoO, pipelines, whatever – Intel did a great job in that field.
Yet the ISA (instructions and register allocation) remains one of the worst available.
Kochise
There is a danger in claiming a technology like an ISA is “worse” based mainly on subjective, qualitative assessments like “elegant” or “robust.” Especially when RISCs are offered as the counterexample – then the irony meter goes up to 11. 😉
I know plenty of people who work on low level stuff who prefer x86 over ARM as a target ISA, for example.
Perhaps it comes down to what one was exposed to during the formative years. There is a whole generation of people raised on the Patterson/Hennessy register-to-register bible.
You should show me who prefers x86 and its limited, non-orthogonal register allocation (AX, BX, …) over the fully orthogonal ARM Rx registers and ARM’s exceptionally good conditional execution. Its 16-bit Thumb ISA. Etc… OK, it lacked MUL/DIV in early versions and an FPU in many incarnations, but it works pretty well.
Coding/debugging x86 in assembly, on the other hand…
Kochise
At least two of them work on compilers, actually.
ARM’s ISA can be criticized just as easily. E.g., even though it’s RISC, some implementations break many loads and jumps into multiple instructions (due to truly half-assed encodings). Some criticism can also be targeted at predication issues. Bit-twiddling approaches in ARM can also get monumentally “ugly.” Etc.
As I said, it depends on the culture of the developer. In my experience, people who have no issue grasping stack/accumulator machines can deal with x86 just fine, whereas others raised to equate microprocessors with register-to-register machines have a hard time wrapping their heads around it.
Orthogonality is another of those things. Some people have a quasi-religious expectation of a flat architectural register file just because, whereas other programmers find banked register groups easier for explicit scheduling optimizations.
Furthermore, the OoO and AMD64 extensions have made a lot of these issues moot for a long while now. Alas…
Edited 2013-06-06 21:05 UTC
ARM’s “bit twiddling” is effective nonetheless, especially when it can be done in the same instruction instead of exploding into multiple instructions with a branch to select the path. More compact, no prediction needed.
OoO and AMD64 are nice, but they are 21st-century technologies. And you want to put this into a tablet? I know the time will come, but right now I think it’s a little too early.
Kochise
That proves my point: what you see as “effective” other people see as a “kludge.” It works both ways.
Also, OoO is definitely not a 21st-century technology. And the new Atoms that are going to go up against ARM do support it – and they support 64-bit operation as well, which ARM has yet to do. ARM has also broken a lot of software compatibility across implementations, whereas x86 has been remarkable in that regard.
We can pick and choose the good while ignoring the bad of either architecture all we want. But that simply solidifies my point: subjective, qualitative assessments of technologies are silly.
“It can be done in the same instruction instead of exploding into multiple instructions with a branch to select the path. More compact, no prediction.”
This is not subjective. The ARM ISA is made that way for obvious pragmatic reasons.
Kochise