Jan 25 at 4:02 PM · Taylor Wimberly

3 reasons why NVIDIA won round 1 of the multi-core wars, and why they might win the next couple rounds as well

I am a fan of mobile technology. That is the reason I was drawn to purchase the T-Mobile G1 and it’s also the reason I started this blog. Over the years I’ve authored some pretty opinionated articles and have been accused of being a fanboy for HTC, T-Mobile, and now NVIDIA, but the only club I am loyal to is Team Android.

It took me a while to figure out my blogging niche, but I finally realized I’m a passionate early adopter who obsesses over the latest technology, and that obsession has taken me around the globe in search of the next big thing. I am honestly still trying to figure out the role I play in this demanding Android community, but you can be sure that I will continue to blog about the things I find exciting.

This month I returned to Las Vegas for CES with the hope of experiencing the mobile computing revolution that everyone in the industry loves to talk about. Last year brought us 1 GHz processors and 4G networks, but I truly believe that 2011 will be the year we all look back on in five years and agree changed everything.

My current obsession is with multi-core processors coming to mobile devices, which is why I’ve written about 50 posts related to dual-core CPUs. During CES I met with NVIDIA, Qualcomm, and Texas Instruments to see what they were working on, and all three companies showed me some pretty impressive demos. However, only one company got me excited about buying the products they powered.

If CES 2011 was round 1 of the multi-core wars for mobile devices, then NVIDIA won by a spectacular knockout. The following list details three simple reasons why I think NVIDIA will rise to become the top producer of processors for high-end smartphones in 2011.

1. Being first to dual-core matters to early adopters

Looking back at CES, the best smartphones were powered by NVIDIA’s Tegra 2. This includes the LG Optimus 2X, Motorola Atrix 4G, and Motorola Droid Bionic. These new phones represent the next generation of mobile devices, and I believe the benefits are compelling enough for anyone looking to upgrade their current Android phone to pull the trigger.

It is too early to tell which company’s dual-core offering might be the fastest, but I don’t really think it matters at this point. From what I was shown, all of the first-generation dual-core processors will offer similar experiences, features, and performance. NVIDIA might lead in some benchmarks while Qualcomm and Texas Instruments lead in others, but all the dual-core CPUs will offer significant improvements over their single-core predecessors.

So if the first wave of dual-core processors delivers a similar experience, there is no reason to wait for the other companies to ship their chips. If you want a dual-core mobile device in Q1, buy a Tegra 2 smartphone.

Not only was NVIDIA first to market with a mobile dual-core CPU, but they are looking to keep up that momentum with the next generation of Tegra chips. In our interview, Michael Rayfield of NVIDIA emphasized the velocity at which his company moves and said, “I believe I will have my next generation Tegra in production probably before my competitors have their dual-core in production.”

It looks like he wasn’t bluffing because we just learned of NVIDIA’s new Tegra 2 3D and quad-core Tegra 3 parts that should be coming later this year.

2. Premium content sells phones and makes money for the operators

Tegra Zone

Since we know that most dual-core CPUs will be quite similar, each semiconductor company will need to make the extra effort to differentiate their offering. NVIDIA’s strategy is to push their Tegra Zone, which is an application that showcases all the premium games specifically optimized for the Tegra 2 processor.

We spoke with the game developers at CES and it was easy to see the excitement in their eyes when talking about working on the Tegra 2 platform (go watch the interviews – 1, 2, 3). The Tegra 2 includes an ultra low power (ULP) GeForce GPU that features an architecture similar to NVIDIA’s desktop GeForce GPUs. This means that game developers can reuse the same assets from their console and PC games, which greatly reduces the amount of work it takes to get a game up and running on Android.

Featured developers that have committed to support Tegra 2 include several big names like Electronic Arts, Epic Games, Gameloft, Glu, Factor 5, Trendy Entertainment, and many more.

As we saw with the Samsung Galaxy S and the iPhone 4, the fastest platform does not always get the best games. Samsung’s Galaxy S featured a faster GPU than the iPhone 4, but Apple’s phone was the platform that all the developers targeted. When it comes to console-quality Android games, Tegra 2 will be the platform that developers optimize their games for.

In 2011, the best Android games will be found on Tegra devices.

I asked Qualcomm and Texas Instruments to explain their strategy to differentiate, but neither company provided me with anything that I could share. Qualcomm even went as far as to say, “We allow the operators to differentiate their devices. The most important customers for us are the phone makers and the operators, not the end-user who buys the phone.”

The contrast between NVIDIA’s approach and that of its competitors is clear.

3. Google chose NVIDIA to power the first Honeycomb tablets

Finally, if you don’t believe my analysis then just ask Andy Rubin and Google. His team put a lot of time and effort into Honeycomb to make sure that it provided the best tablet experience for Android and they chose Tegra 2 as their development platform.

The Motorola Xoom will be the first Honeycomb tablet, and many other companies have now adopted the Tegra 2 processor for their products so they can get to market faster. I am certain that Honeycomb will eventually be up and running on a Qualcomm Snapdragon or Texas Instruments OMAP4, but the first wave of Android 3.0 tablets will all be Tegra 2 powered.

Now that Google has a strong relationship with NVIDIA, it could result in the next generation of Tegra processors becoming the lead platform for future versions of Android. If NVIDIA actually delivers a quad-core 1.5 GHz Tegra 3 by Christmas, you can bet that Google and their manufacturing partners will want to get a piece of those holiday sales.

Conclusion

Please remember that this is only a snapshot in time, but right now NVIDIA appears to be in the best position among all the companies fighting to power the mobile computing revolution. I’ve received a ton of hater comments and emails for writing about Tegra 2, but it’s the only mobile processor that gets me excited about going out and purchasing my next Android phone. When someone else produces a better chip, you can bet I will write about it.

If anything, we should all be happy that NVIDIA is changing the mobile industry by forcing its competitors to speed up their release cycles. We should see a new Tegra every year (and maybe a refresh part every six months), so our mobile devices are going to become increasingly more powerful (and kick ass).

As we approach Mobile World Congress, I am anticipating a response from Qualcomm, Samsung, and Texas Instruments. NVIDIA wowed me and got me excited about buying the products they powered. Can you do the same?



  • NeoteriX

    I’d say the stock market agrees with you. During CES, nVidia’s stock price shot up 50-60%.

    • 日独就航

      gotta ♥ nvidia

      • Matthew

        Thank god I bought some call options on the company, more than doubled my money in a few days. Too bad I sold early or I could have made 5x. Thanks Taylor for keeping us up-to-date on this company, it made me some money.

  • ivan

    I too am a big fan of mobile tech. But in case you don’t know, Qualcomm’s Adreno 3xx GPU graphics compare to those of an Xbox 360 and PS3, according to Qualcomm. That is pretty damn impressive, I might say. I also do like Tegra 2, but I don’t know much about their GPU, which from what I have read is referred to as ULP (Ultra Low Power). PowerVR SGX 540 can make 90 million triangles/sec. How many triangles/sec can Tegra 2 make?

    • Lucian Armasu

      Adreno 205 has 80 mil pol/s and yet its real performance is half that of PowerVR SGX540. Adreno 200 was supposed to have 22 mil pol/s, yet I’ve never seen a game that looked better on it than it did on an iPhone with PowerVR SGX 535 with 7 mil pol/s. Bottom line is those are just specs on paper. What matters most are the code optimizations for that chip, both from Nvidia which is known to make the best graphics drivers, and also from game developers. And if things happen the way Taylor says, then most developers will optimize for Tegra anyway.

      Tegra 2 is already said to offer console-like graphics. Tegra 3 will have 3x the graphics performance of Tegra 2, and it’s coming at the same time as the Adreno 300. Someone was saying that Tegra 3’s 13,800 MIPS is the same as a PS3 (CPU-wise).

      Qualcomm bragged about Adreno 205, too, and it failed to deliver. I wouldn’t put much faith into what they have to say until we see it for ourselves (and not in Quadrant or Neocore, obviously)

      • Arion

        You are mistaken... the Adreno 205 was rated at around 40 million polygons, not 80.

        Now as for the Adreno 300, its competitor is not the Tegra 2; the Tegra 2 and Tegra 2.5 will compete against the Adreno 220.

        The Adreno 300 is going to go up against the Tegra 3, which is quad-core.

        Right now Nvidia has the upper hand, though at the moment no device supports quad-core until Honeycomb comes. This may give an advantage on tablets, but when will phones get Honeycomb, as they have yet to get Gingerbread?

        While I think Nvidia got themselves a nice position overall and neither Qualcomm nor TI are ready to take the Tegra 2 on, I would like to see how the Orion stacks up, as it will have a Mali-400, the official GPU of ARM, so it might get some support.

        Other front runners include Intel with their Moorestown, and I wonder what AMD is going to do in that market. They sold their ATI Imageon tech to Qualcomm, but how long can they go without tapping that market?

        • Arion

          “no device supports quad-core until Honeycomb comes”

          meant to say “dual core”

          “no device supports dual-core until Honeycomb comes”

          Can’t blame me for drooling over the quad-cores can you ;)

  • Evan (emuneee.com)

    The time to market for NVIDIA’s Tegra 2 was beneficial for them. I think this will allow them to establish a pretty good foothold on the market.

    Whenever Qualcomm makes it to market with their solution, I think they’ll have the advantage (on paper, anyway) because of their ability to integrate baseband logic into their SoC, which of course drops the price of a Qualcomm dual-core handset vs. the competitors’. Also, their experience in building their SoCs from scratch will probably give them the power efficiency advantage as well.

    Either way, 2011 sounds like it’ll be an exciting year.

  • maxisma

    I agree with you. Very good article!

    Also, NVIDIA is a well-known brand, unlike Qualcomm or TI, at least from an end-user point of view.

    Manufacturers are able to advertise their phones as gaming phones because of the integrated NVIDIA Tegra.

  • Mark

    This is all good and dandy but WE NEED GAMES TO TAKE ADVANTAGE OF ALL THIS POWER!

  • dmass

    Um the only reason nvidia won is because they were there at the right time (huge marketing/sales push probably didn’t hurt either).

    I don’t see how a multicore proc will help in a smartphone though (aside from games), so I don’t know what you’re so excited about, unless horrible battery life makes you randy.

    Anyway all these SoCs are based on the same ARM Cortex-A9 reference and will perform the same so it’s not a big deal.

    • TGeezy86

      I see they downranked your comment but I gotta agree. There ARE obvious benefits to multicore chips in a phone… games, and few things besides. I fail to see how a dual-core phone will BLOW my myTouch 4G away in multitasking because honestly, you can still only actively use ONE APP at a time. And in almost every scenario, two cores both running at 1 GHz will use more power than a single core at the same speed.

      • J.

        Well, see…here’s where I have to disagree.

        Dual core will make a difference in speed and multi-tasking, and can actually improve battery life (as long as code is optimized to take advantage of the multiple cores).
        It’s all about uptime and instruction buffers.
        For the sake of example, let’s expand all of our scales by a few factors of 10.
        Imagine that running a processor at 1,000 Hz for 1 minute takes 1 watt-minute of energy (bear in mind that conversion factors and units of measure are being left out of this example). Two requests for processing hit the chip at the same time, and each requires 30 seconds to complete. Keeping the second request in the instruction buffer for 30 seconds (while the first request processes) draws 0.5 watts, which costs another 0.25 watt-minutes. So your total energy for this scenario is 1.25 watt-minutes.

        Now, when a dual core at 1,000 Hz draws 1.5 watts and those same two requests come in, each core gets one request and processes them in parallel for 30 seconds, so your total energy is only 0.75 watt-minutes.

        Also, while you can only ACTIVELY use a single app at a time, there is a bunch of stuff going on in the background that could very handily use the dual core option (GPS/WiFi snooping/Sync/pretty much anything that operates on an “event” intent)
        Trust me, it might not completely blow away your MT4G, or my G2, but it will be more than an incremental step forward, for certain.
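
        If it helps, here’s the same arithmetic as a quick Java sketch. The figures (1 W single-core draw, 1.5 W dual-core draw, 0.5 W while buffering) are just my made-up example numbers, not measurements of any real chip.

        // Toy energy model: two 30-second requests run serially on one core
        // (the second waits in an instruction buffer) versus in parallel on
        // two cores. All power figures are illustrative only.
        public class EnergyToyModel {
            public static void main(String[] args) {
                double taskMinutes = 0.5;  // each request needs 30 s of compute

                // Single core: 1 minute of compute at 1 W, plus 30 s of
                // buffering the second request at 0.5 W.
                double singleCore = 1.0 * (2 * taskMinutes)  // 1.00 W-min compute
                                  + 0.5 * taskMinutes;       // 0.25 W-min buffering

                // Dual core: both requests run in parallel for 30 s while the
                // chip draws 1.5 W in total.
                double dualCore = 1.5 * taskMinutes;         // 0.75 W-min

                System.out.printf("single core: %.2f W-min, dual core: %.2f W-min%n",
                        singleCore, dualCore);               // prints 1.25 vs 0.75
            }
        }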

        • TGeezy86

          I do agree with you to an extent. Two cores running at 1 GHz can be more power efficient when performing the same tasks as a single core, but unless they’re downclocked below the single core AND the software is optimized, they WILL use more power overall than the single. And that’s the catch: Android itself isn’t optimized for multicore processing (yet), and even if certain apps and games are, the apps don’t manage themselves, so the improvement in multitasking and background processes will be minimal at best. Step in the right direction, yes. Worth throwing my MT4G away to upgrade to the Tegra 2? Not likely.

  • erfs

    I’m sure there are a couple of semi-valid reasons some people are pelting you for your love affair with Nvidia. First, the original Tegra was an unmitigated disaster (though I’m not sure they were actually planning on getting any design wins with it anyway), which Nvidia hyped ridiculously. That was of course before you came onto the scene or became familiar with Nvidia, it seems. People are afraid that Tegra 2 is going to be the same thing all over again: seriously hyped-up tech with lackluster execution. It looks like T2 is going to do much better than T1, though, so that might not be an issue.

    Second, Nvidia has a history of being a serious hype machine in the PC area. Many would say JH oversells more than any other tech CEO, and they have a way with the press. I’m sure some of the people writing you about Nvidia are noting you are kind of doting on them the way Walt Mossberg dotes on Apple. There’s nothing wrong with that, but you have to be upfront about objectivity. In this article you seem to cover that OK at the top. Certain people in the PC space feel that Nvidia has a way of wooing and then ultimately pressuring the press once they have them where they want them. I think people don’t want you falling into that same trap. I personally don’t have an issue with it, but some of your Nvidia posts do sound a bit overzealous towards Nvidia.

    I am actually inclined to agree slightly with Qualcomm; their customers are phone makers. Traditional phone users have bought based on the brand of the phone manufacturer – a Samsung or an Apple or a Motorola – whereas those companies use all kinds of different chips inside their phones. It seems that we are heading into an era where phones are more like PCs, in that users are now more interested in the individual subcomponents inside them. Even if that is the case, overexcitement towards Nvidia because they are a PC brand seems rather foolish – they may be a PC brand, but they have no track record in the phone space: they are not a phone brand yet – and their last mobile chip can be found in all of about two devices, one of them discontinued about two weeks after being released. I’m taking a wait-and-see attitude with Nvidia and mobile: I will believe it when I see it. At the same time, let’s hope that the entrance of this PC component maker into the phone arena doesn’t bring with it all the problems that Nvidia has been facing in the PC space over the past few years (bumpgate).

    All that said, I think T2 will be a winner with the highly tech inclined such as those of us that read this site, but TI and Qualcomm by no means will be marginalized by Nvidia.

    • labrat

      Very interesting post, thanks for sharing!

      I have a feeling that nVidia will succeed with the T2.

      The T1 wasn’t anything special, and it was late to the party with an old ARM11 core. That being said, the battery life was pretty impressive on the Zune. By choosing to leapfrog the Cortex-A8, nVidia made a very smart move and should see the benefits in 2011.

      About nVidia not being a mobile company, I am not sure this is that much of an issue. By now they have a few years of experience with mobile. Additionally, they have many years of experience designing chips, including some of the most complex ones ever produced. I am confident their mobile GeForce can be competitive, if only for their GPU expertise and the recent trend of optimizing for power on the desktop (even though the nVidia cores are still pulling tons of power). Since they are using the reference Cortex-A9 design, the CPU shouldn’t be an issue, especially since they will not have to deal with the manufacturing.

      Will nVidia dominate the dual-core market? I highly doubt it. Will they be competitive with the established players? I think they can pull it off. At the very least, it is nice to see a third graphics option thrown into the Adreno/PowerVR mix.

    • Drago

      erfs,

      Nvidia may be overestimating their product, which may not be as good as they claim, I agree. But when a phone manufacturer advertises a certain talk/standby time for a phone sold to end-users, they cannot brag too much or else they’ll be hit with a class action or something. So when LG and Moto announce Tegra 2 phones with specific talk/standby times (no worse than single-core phones), they must have tested this to be sure it is correct. That’s why I share Taylor’s excitement: we see Tegra 2 in actual devices about to be put on the market, and that gives them some credibility. Of course, the final say will come when we can actually use those new phones/tablets.

      The combination of reference device for Google and apparently reference device for games/apps developers is a very powerful one, ensuring the maximum optimization of code for such devices using Tegra 2.

    • erfs

      I think both of those comments make sense. I will be interested to see the battery life as well. That is one of the reasons I am playing wait and see. You’re right that battery life is a fairly tightly controlled metric in the phone world. However, even then, with the advent of smartphones, we’re already getting into the PC mindset, where none of us actually expect to get the stated battery life from our phones.

      One of the ways manufacturers get away with bad battery life, but quote high ones in the specs is by quoting only two metrics: talk time and standby time. Neither of those are the actual 90% usage scenario for a smartphone. Smartphone users usually spend much more time browsing online, using apps like maps or mail, texting, and playing games than actually talking. Some of these activities will take less power than talking, some actually take a whole lot more. The other scenario, letting your phone sit until it discharges is of course not very telling either.

      Phone makers still have a luxury left over from the dumbphone era of quoting only talk time, and end users have to wait for reviews to read things like “we were able to get 1.5 to 2 days use out of the phone with light browsing, a couple of phone calls, and the occasional round of Angry Birds.” Moral: wait for reviews before believing battery life claims.

  • irishrally

    Orion, please.

  • bob

    Who is providing the baseband for T2-based phones/tablets? It used to be Infineon Wireless, but weren’t they bought out by Intel? Does Nvidia need to integrate the baseband at some point to get price competitive?

  • Jeffroid

    guys, do you all think I should get a Nexus S or wait for dual-core phones to reach my country? I’m in a dilemma

    • Scoch

      You should wait for quad core phones :)

    • Craigboy

      Get the Nexus

    • yo

      It depends on how soon you need a new phone and what you want it for. If you need to get a new phone today, go with the Nexus S. It’s easily the best phone on the market today; you will not be disappointed. It’s a great phone and will continue to get great support from Google, even when dual cores are available. On the other hand, if you want a top-end gaming phone and you can wait a few months, I would wait and see what dual-core phones are available in your country and how people like those phones before you make your decision. That’s what I’m doing.

      I have an N1; it’s been a great phone, but I am ready for an upgrade (for a better touch screen, more storage, 4G, etc.). I almost bought the NS out of pure impulse. I went to Best Buy 3 times to play with it and loved it, but did not pull the trigger, and I am glad I didn’t. I am hanging onto the N1 for now. I may eventually get the NS, but I want to see what phones are announced at MWC and CTIA.

      • Jeffroid

        Thanks for all the great replies! I do need a new phone soon. Mobile gaming is secondary to me. I use the phone for Facebook/Twitter and surf the net a lot, though. So I guess I MIGHT get a Nexus S eventually.

  • Jon

    I’ve always thought Androidandme’s niche is being the most careful and thorough with Android stories rather than being the first to press. Every other blog is an echo chamber of its peers; AAM is an independent voice.

    • Taylor Wimberly (androidandme.com)

      Thank you for the kind words. It’s nice to see some positive feedback.

    • Clark Wimberly (clarklab.net)

      Thanks Jon!

  • Derek

    quote “As we saw with the Samsung Galaxy S and the iPhone 4, the fastest platform does not always get the best games. Samsung’s Galaxy S featured a faster GPU than the iPhone 4, but Apple’s phone was the platform that all the developers targeted”

    I’d have to beg to differ slightly with you. The reason why devs create more games for the iPhone 4 is that its SDK is miles ahead of the Google SDK. Also, the OS itself is better than Android. iOS is actually a real OS and the apps are real programs written in Objective-C that run on the CPU. Android is just a big fancy Java runtime environment built on top of a Linux kernel. Everybody knows Java is crap as a programming language where performance matters. That’s why games run soooo much better and faster on an iPhone 4 vs. my Captivate, which is almost identical hardware. The Samsung Hummingbird and Apple A4 are darn near identical CPUs, while the Hummingbird has a faster GPU (SGX 540 vs SGX 535) and the Captivate has a lower resolution screen (800×480 vs 960×640, so the iPhone 4 is pushing about 1.6 times as many pixels), yet games on the iPhone 4 (with a slower GPU and a higher screen res) run much, much better than on the Captivate. I’ve played Gameloft games on the iPhone 4 and compared them to Gameloft games on the Captivate… not even close. The iPhone 4 flies, while some of the games stutter on the Captivate.

    Android is going to need all the processing power it can get to make java apps run smoother.

    • Taylor Wimberly (androidandme.com)

      I think the updated Android NDK and new tools like the Unreal Engine will greatly help game devs.
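
      For anyone curious what using the NDK actually looks like, here is a minimal sketch of the Java side of a native call; the class and library names (NativePhysics, gamephysics) are hypothetical, made up for illustration.

      // Hypothetical sketch: offloading a performance-critical routine to
      // native C/C++ through the Android NDK. The native side is compiled
      // into libgamephysics.so by the NDK toolchain and reached via JNI;
      // only the Java-side declaration is shown.
      public class NativePhysics {
          static {
              // Loads libgamephysics.so from the app's native library directory.
              System.loadLibrary("gamephysics");
          }

          // Declared native: the body lives in C/C++, where tight loops avoid
          // the Dalvik overhead Derek describes above.
          public static native void stepSimulation(float dtSeconds);
      }

      This is the mechanism that lets engines like Unreal keep their core loops in native code while using only a thin Java layer for the Android app lifecycle.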

  • UniqueNate

    What do you think about the rumored bug for your lovely device? Hope it’s not true or as serious as it sounds in your case. http://bit.ly/fYFxrB
