May 05 at 2:40 PM | Taylor Wimberly | 17 Comments

Moorestown ready to slay the Snapdragon?

Intel might be the world’s largest semiconductor chip maker, but when it comes to smartphones they have yet to flex their muscles. Some might say they were caught off guard by the popularity of the iPhone and the wave of smartphones which followed it.

Way back in 2005, Intel directed their team to target mobile internet devices first and then move on to smartphones. What was the result? Almost every smartphone on the market today uses an ARM-based processor, not Intel’s x86 architecture.

Fast forward to today and Intel is finally unveiling their new Moorestown platform (Atom Z600), which they claim will be the world’s fastest smartphone platform when it launches later this year. The smartphone version of the chip will clock up to 1.5 GHz and includes Intel’s new GPU, the GMA 600, which tops out at 400 MHz.

The Atom Z600 takes a radically different approach from current ARM designs, so it is hard to make an apples-to-apples comparison, but Intel’s latest chipset is all about speed while maintaining acceptable battery life.

I’m no microprocessor guru, so I’ll refer to my friend Anand, who has produced a 16-page report on the Atom Z600. If you are into all the details behind what makes Moorestown tick, this is a must-read.

“Thanks to an incredible amount of integration, power management and efficiency Moorestown has the potential to be the most exciting thing to hit the smartphone market since the iPhone.”

– Anand Lal Shimpi, AnandTech

It all sounds too good to be true, so we will have to wait for final hardware to appear in order to get an accurate performance comparison. Intel is expected to ship the Atom Z600 processor in the second half of this year, so it’s entirely possible you could purchase an Android-powered Moorestown smartphone this Christmas.

Based on the current rumors and speculation, a Google TV product might be the first device to sport the new Moorestown platform. Google is reportedly working with Intel and Sony, and their prototype device is said to include an Atom processor.

No actual smartphones have been announced, but Intel was showing off the Aava Mobile device we saw at CES. Engadget was at the Intel event and managed to snap a photo of the Aava Virta specs, which give us a good idea of what to expect.

How do these specs look?

Finally, I will note the interesting chipset battle shaping up between Google and Apple. Intel has said they have no plans to support Windows Phone 7 (or 8) with the first Atom Z600 processor, so Android is going to be the focus here. Apple has been rumored to be considering a purchase of ARM Holdings; many have debunked that rumor, but anything is possible in the tech sector.

Google and Intel vs Apple and ARM? More details coming later this year.

Via: AnandTech

Source: Intel



  • Lane

    I don’t think Apple purchasing ARM would ever be approved by the US or EU.

    I am more concerned about what having both Intel and ARM devices will mean for the future of the Android NDK.

    • http://androidandme.com Taylor Wimberly

      Yeah, it should be a huge shakeup. It sounds like Google and Intel have been in bed for a while now with the Google TV service, so I expect a future version of Android will be highly optimized for Moorestown.

  • Darkseider

    LOL. Yeah great numbers there Intel. The chances of these things ever coming CLOSE to power/performance of a dual core Cortex A-9 is slim to none. The dual core Cortex A-9 will give a 3x – 4x boost in performance to a similarly clocked Cortex A-8 SoC at a lower power cost. Take this for what you will but by the time we see any of these Intel vaporchips in a smartphone the dual core Cortex A-9 will be well established and spanking it in performance.

    • Darkseider

      OK. I just looked at that spec sheet. LOL. 10 days standby! LOL! I have heard of pie in the sky numbers but those are just plain ridiculous. Now I personally can’t wait for them to come out so I can chuckle.

      • kidphat

        I wouldn’t be surprised if that number is true (doubtful, but plausible). Intel has a very aggressive standby power consumption assuming you get into the lowest sleep state. At the lowest sleep state, the core is basically utilizing little to no power (no clock ~= no power). Where Intel struggles is active power consumption.

        • bjtheone

          The real issue is that smartphones almost never get to the “lowest sleep state”. There are a number of real-life tests out there… Chippy over at umpcportal and carrypad has written about this a number of times. With all the background processes and data syncing that goes on, you don’t get anywhere close to the theoretical battery life you could if the CPU could truly idle, plus the battery cost of driving the screen is a significant percentage of the overall consumption.

          Bottom line, if you actually use the phone you will never see 10 days standby with a smartphone. At least not with a normally sized battery pack. True one-day battery life is about all you are ever going to get.

          What will be more interesting is the power consumption when running flat out. Flash and video decoding are big battery killers.
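
          To put that 10-day standby claim in perspective, here is a rough back-of-envelope check. The battery capacity below is a hypothetical example, not a published Moorestown figure; the sketch just shows what a claimed standby time implies for the average current budget.

          ```java
          // Rough standby-budget arithmetic (hypothetical numbers, not Intel's specs).
          public class StandbyBudget {
              public static void main(String[] args) {
                  double batteryMah = 1500.0;        // assumed battery capacity in mAh
                  double claimedStandbyDays = 10.0;  // the spec-sheet claim under discussion

                  double hours = claimedStandbyDays * 24.0;
                  double avgDrainMa = batteryMah / hours;

                  // Roughly 6 mA on average: only plausible if the SoC sits in its
                  // deepest sleep state almost the entire time, with radios mostly idle.
                  System.out.printf("Average drain allowed: %.1f mA over %.0f hours%n",
                          avgDrainMa, hours);
              }
          }
          ```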

  • http://plankhead.com Zacqary Adam Green

    That’s a really smart move on Intel’s part not to devote resources to Smiley-Face-With-Sunglasses OS. It may look pretty, but realistically nobody’s going to actually use it.

  • Pingback: Standroid

  • Pingback: Moorestown ready to slay the Snapdragon? - Droid Forum - Verizon Droid & the Motorola Droid Forum

  • mmitchum

    Man! It’s developments such as these that make me think twice about getting a new phone. As soon as I talk myself into buying the EVO, I learn about the dual core chip and then this surfaces. Makes me giddy!

  • hector

    How do those specs look? AMAZING O_O

  • http://www.twitter.com/alastair_hm alastair_hm

    Competition in the mobile chip market can only be a good thing?

    • jeanpaul145

      Competition can only be a very good thing. However, IMO this is as much fragmentation as it is competition, and fragmentation is bad, mmm’kay?

  • Daniel

    I don’t know much about much, but correct me if I am wrong: Android is based on a Linux kernel running custom Google Java.

    Doesn’t this mean that any application runs as a non-hardware-compiled binary, and the OS has an embedded Java translator that does the work of talking to the hardware? I know I might have my terminology wrong here, but the concept is solid. In uni, where I used Java (the only time I have ever used it), we could port applications from one hardware platform to the next with very little to no application changes, because the hardware-specific translators (an extra layer in a Java system vs. other code) did all the hardware-specific stuff.

    By this logic I would presume that ARM, x86 and even x64 hardware platforms would not need a separate version of a Java application, just a separately written (one time, as part of the OS) translator.

    Can someone explain to me where I went wrong with this?

    • jeanpaul145

      First things first: as you’re obviously aware, Java and Android are two separate entities; Android code simply uses Java syntax, and seemingly Java’s bytecode as an interim step, but that’s all it uses as far as I’m aware. It is useful to keep the two separate in your mind.

      Sun’s Java uses a JIT compiler to go from Java bytecode to something the CPU can understand, and it’s this JIT compiler (which lives in the Java JRE) that’s platform specific, next to the (non-JIT) compiler itself. So indeed, if you code properly, no source adjustments are needed to make a Java app written on Windows run on Linux or Mac OS X.

      Android works a bit differently: it does have a bytecode representation, but at the moment it isn’t JIT-compiled; this will hopefully change with Android 2.2.

      So basically, with Moorestown it isn’t the standard Android code that’ll be in “trouble”. There is also something called the NDK (native development kit), where you can write code in C/C++ and then make Android bindings for it. C/C++ is, at least in compiled-and-linked form, as platform-specific as can be, since it’ll depend not only on the CPU architecture but also on the binary format the OS uses and any linked libraries. This means that, at the very least, NDK-using apps will get a lot bigger when Moorestown is released, kind of like Mac OS X’s universal binaries; at least if they want to be able to run on all Android devices.
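
      For readers wondering what that means in practice, here is a minimal sketch of an NDK-backed class. The class name, library name and method are hypothetical, purely for illustration: the Java/Dalvik side stays architecture-neutral, while the native library it loads would need separate ARM and x86 builds to cover both existing devices and Moorestown.

      ```java
      // Hypothetical sketch of an Android class backed by NDK code.
      // The Dalvik bytecode runs unchanged on any CPU architecture;
      // only the native library needs per-architecture builds.
      public class NativeMath {
          static {
              // Loads libnative-math.so; an APK would have to ship one build
              // of this library per supported ABI (e.g. ARM and, later, x86).
              System.loadLibrary("native-math");
          }

          // Implemented in C/C++ via the NDK and exposed through JNI.
          public static native long fibonacci(int n);
      }
      ```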

  • Pingback: What kind of CPU and GPU will your first Google TV have? – Android and Me

  • Pingback: Intel on mobile, “2 years ahead of the rest of the industry” – Android and Me
