
  • mrtanner70 - Wednesday, September 12, 2012 - link

    Almost looks like Exynos.

    Curious if Samsung really wanted to integrate Imagination just for Apple?
  • iwod - Thursday, September 13, 2012 - link

    This is the common Internet myth about Apple using an Exynos-based SoC. No, Samsung hasn't made the design ever since they moved to Mali for the Exynos GPU, and Samsung's A15 won't even have a product shipping this year.
  • vision33r - Thursday, September 13, 2012 - link

    It just shows how ignorant most Android fans or Apple haters are to think that Apple steals from Samsung.

    Maybe all Android fans should look at history before posting such nonsense.

    Samsung is nothing but a big conglomerate that takes trendy ideas and mass-produces them.
  • adonishong - Thursday, September 13, 2012 - link

    Just got some news from an insider source...

    Most Samsung SoCs are designed by Broadcom; they mostly rebrand Broadcom chipsets...
  • Scannall - Wednesday, September 12, 2012 - link

    Apple designs and lays out their SoCs. Samsung is just the foundry.
  • syxbit - Wednesday, September 12, 2012 - link

    If the A6 really uses an A15, then that's amazing. By comparison, Nvidia and TI both suck big time. Before, Nvidia had a worse GPU than Apple but a higher-clocked quad-core CPU. Now, though, the Tegra line is worse than Apple in everything. And don't get me started on TI. What a complete joke. The best they have is still the OMAP 4470. Pathetic.

    Also, while I agree with the A15 conclusion, it's still a guess until confirmed. The speed increase alone isn't much to go on. They could have done that with a pair of 1.6GHz A9s (like Tegra 3), which should be possible at 32nm.

    It's a good time to be an Apple user. As an Android user myself, I only have A9s (or Krait) as an option :(
  • CCRATA - Wednesday, September 12, 2012 - link

    You do realize that Krait is (almost) an A15? In fact, it is better than the A15 in certain areas (like the fact that it has a 128-bit FPU instead of a 64-bit one). Performance-wise they will be nearly identical at the same clock speeds.
  • zorxd - Wednesday, September 12, 2012 - link

    Even if Krait were a little slower, it came out six months before the A15, and Apple usually uses very low clock speeds. So dual Krait and quad A9 have nothing to be ashamed of.
  • deltatux - Wednesday, September 12, 2012 - link

    Ummm... the A15 has a 128-bit wide NEON FPU engine like the Qualcomm S4 Krait... I think you're thinking of the Cortex A8 and A9, which have a 64-bit FPU but can process 128-bit NEON instructions.
  • mschira - Wednesday, September 12, 2012 - link

    Yeah, but where are the phones using Krait?
    M.
  • lowlymarine - Wednesday, September 12, 2012 - link

    Are you kidding? MSM8960 (Dual-core Krait @ 1.5GHz) is probably the single most common SoC on the market today. A partial list of phones using it:
    HTC One S (all) and Droid Incredible 4G LTE
    HTC One XL/LTE One X/Evo 4G LTE
    LTE Galaxy S 3
    Nokia Lumia 820 and 920
    Motorola Droid RAZR HD and Atrix HD
    Sony Xperia V
    ZTE V96
  • dishayu - Thursday, September 13, 2012 - link

    Yeah, well, sadly, not all One S models are dual-core Kraits.

    The one that sells in India (and a handful of other countries) comes with a dual-core Scorpion at 1.7GHz.
  • Roland00Address - Thursday, September 13, 2012 - link

    Type "ctrl+f" and search for Krait it lists multiple models of krait depending on dual core vs quad core, mhz speed, and what form of baseband modem

    http://en.wikipedia.org/wiki/Snapdragon_(system_on...
  • mrtanner70 - Wednesday, September 12, 2012 - link

    I agree. It could be a maximally clocked A9 at 32nm. I hope it's A15 though.
  • 3DoubleD - Wednesday, September 12, 2012 - link

    You might be right, they could just be running higher clocked dual A9s.

    As for the A15s in Android devices, those should be releasing imminently. That said, dual 1.5GHz Krait cores are plenty of power. So much so that I still haven't bothered flashing my SGSIII with CM or some other stock Android build. Yeah, TouchWiz (or whatever they are calling it now) is ugly, but I'm mostly holding out for stable CM10/JB.

    In the end, I wouldn't be totally surprised if Apple were the first to release A15s. They are willing to pay for it. They've been paying for their MASSIVE SoCs for the past few generations now. If you are Nvidia, you need to produce chips that will sell in high volume. Most manufacturers aren't willing to buy a chip 4x the size. Apple, on the other hand, can design this chip, contract Samsung to build it, and get the volume they need to make it work out on the quarterly spreadsheets.

    Only other company that can do this is Samsung themselves.
  • name99 - Wednesday, September 12, 2012 - link

    "Only other company that can do this is Samsung themselves."

    You forget that Apple is IMMENSELY disciplined in how they reuse chips across all their devices, from iPads to iPod Touches to Apple TVs. This reuse gives them a whole lot of scope for getting use out of chips that are less than perfect (one of two cores doesn't work, power or frequency targets aren't hit, etc.).

    Samsung, in THEORY, could do this; but in practice they are as undisciplined as every other CE manufacturer, using a random collection of chips across a constantly mutating product line, with zero thought given to these sorts of holistic issues.

    The only comparable company I see is Intel, which likewise designs its product line for maximal value from somewhat defective devices and maximal reuse of IP.
  • Swamp - Wednesday, September 12, 2012 - link

    You do realize this is a phone, right??

    If it were a tablet, yeah, I'd want a fast CPU/GPU. But it's just a phone.

    Nothing special...

    Same boring design as the first iPhone. Can't even tell them apart.
  • name99 - Wednesday, September 12, 2012 - link

    Amazing how the SAME people who insist that Apple buyers are driven by fashion seem to care a lot more about the appearance of the iPhone than its ACTUAL buyers.

    And if you can't tell the difference between the original iPhone and the iPhone 5, you're an idiot.
  • miskol - Saturday, September 15, 2012 - link

    "Same boring design as the first iphone. Cant even tell them apart."

    woww... iPhone 2G and iPhone 5 have "same boring design"???

    pure idiot. just shutup, and go talk to a wall...
  • lowlymarine - Wednesday, September 12, 2012 - link

    "The best they have is still the OMAP 4470. Pathetic."

    Funny you should say that, because my first thought on Apple's announcement was "huh, that sounds EXACTLY like the OMAP 4470." Think about it: 2x Cortex-A9 at 1.5+ GHz, and an SGX544 MP2 for graphics. That would provide almost exactly twice the current A5's performance in both areas.
  • gunblade - Wednesday, September 12, 2012 - link

    Close in CPU performance, but it would suffer in power consumption.
    Far off in GPU performance: the SGX544MP2 has roughly similar per-clock performance to the SGX543MP2.
    The OMAP 4470 runs its GPU core at 300MHz while the A5 runs at 250MHz, so the OMAP 4470 is only about 20% faster than the A5 in the iPhone 4S configuration and would not come close to the 2x increase claimed for the A6.
  • Aenean144 - Thursday, September 13, 2012 - link

    AnandTech has already reviewed an OMAP 4470-powered device, the Archos 101 XS. It has a single SGX544 core.

    The SGX544 and SGX543 are basically the same computationally per Hz, except the 544 adds DX9 extensions.

    Since the Apple A5 has 2 SGX543 cores, that makes the old A5 SoC 2x as powerful as the GPU in the OMAP 4470. Since TI clocks the SGX544MP1 higher than Apple clocks the A5's SGX543MP2, probably on the order of 50% to 70% higher, they can make up much of the difference.

    With Apple claiming a 2x increase in GPU performance over the A5 SoC, that pretty much definitively says Apple is not using an OMAP 4470, as TI would have to clock the SGX544MP1 at 700-800MHz to achieve that goal. Highly unlikely.
  • Lucian Armasu - Thursday, September 13, 2012 - link

    The OMAP 4470 runs its GPU at 387MHz.
  • TheJian - Wednesday, September 12, 2012 - link

    Yet games still look better on Tegra across the board. I'm more than happy to take performance upgrades, but not if I don't see much for it. I'd rather see software that is optimized to take advantage of those technologies. Apple spends nothing on devs. TegraZone exists to help devs and to highlight the money Nvidia spends on devs to get better graphics on Tegra phones. You can have a great chip with tons of performance, but if nobody uses it to enhance my experience, what is the point?

    If Apple isn't handing out money and Android is activating 4-to-1 vs. Apple, it's hard to argue that a dev would spend more money on Apple's shrinking share of the market rather than on an exploding Android market. If I get no help, I go where the biggest pie is. If Apple doesn't stop this soon, Android will have a ton more devices in the market and devs will always be developing for Apple second (much like they do in games now: Apple is always second and gets the conversion/port, not the platform the games are designed for).

    Apple is making tons of money. No argument there. But devs have to worry about the % of users that can buy their app/game. They don't really care about Apple's bottom line; they care about their own.
    http://news.cnet.com/8301-1035_3-57510994-94/googl...
    The sheer number of potential customers is staggering and continues to grow on Android. If Apple doesn't start writing checks to devs soon, they will be in a world of dev hurt, as devs head to where the masses are sitting (and Nvidia is tossing money at them, exacerbating the issue). If I were writing a game and Nvidia offered a check and support-team help to make their product shine, I'd take the check. It's money you get whether or not your project sells. It's free dev money. It's the same thing with Nvidia taking the money from the Nexus 7/Surface deals, etc. They don't care if Google or Microsoft ever make a dime on them. They've already inked their deal and sold the chips. If the devices sit on the shelf, NV still got the sale and profits go up for their stock/bottom line. The same is true with devs who get checks to write better code for NV-based apps/games easily downloaded via TegraZone. Hurry up Google, get me a 10in tablet based on Wayne :)

    If I were Apple, I'd spend $1 billion on, say, 100 games at $10 million each for devs. It would ensure some top-quality games for the next two years and create a market that currently doesn't really exist on their platform. I'm talking games of Baldur's Gate: Enhanced Edition quality (possibly Wild Blood, Unreal Engine types), at $10-20, not $0.99. It might even help them take over consoles. Heck, just port/update all the great x86 games! Most kids have no idea what Planescape: Torment, Wizardry, Ultima, Baldur's Gate, etc. even are. The games would most likely be ready by the next rev of phones/tablets and would cut into a lot of console sales next year as the new, expensive console hardware debuts.

    The power of cell phones at 1080p is becoming a reality. What do you need a console for if you can get the same performance on a phone via HDMI out to your TV? If I'm weighing devices at Christmas next year, phone vs. new console is a tough argument for parents on a budget. One device to rule them all, so to speak. I can reach my kid on the phone/tablet; they can browse the web, play games, use old Xbox/PS2/PS3 controllers, do some effective homework, etc. A 10in tablet could be output to a larger monitor/keyboard/mouse combo to double as a cheap laptop for the kids (a phone can do this too). Develop a website showing parents (via vids?) exactly how to connect it all, and voilà. Console dead, your phone sells, devs make money on games that are already partially paid for (nixing NV's current advantage in dev goodwill built from years of PC money and continuing with phones/tablets), etc. All good for Apple (or whoever does this and fosters its good health).

    Why didn't Ouya etc. go with Apple-based products? No money/help from the company. Apple could have paid to keep Ouya alive long enough for it to be viable. Apple could fund this game infusion at a billion a year for the next five years and laugh as they kill consoles. They seem to refuse. With $110 billion in the bank, they should spend on long-term future projects. We can ditch their expensive phones/tablets easily today (witness Kindle Fire sales taking 17% of tablets). Everyone has the same thing today. The second a good 10in competitor comes out, I won't even be able to remember the iPad's name (Surface x86? All my old PC games instantly run on it). I have no reason to be loyal (or even become loyal, as I don't own either an Android or iOS based device) to a platform I can't see winning in the fun dept. I'll head to where I think the most/best-looking fun will be, since for word/email etc. every device ends up waiting on me anyway. I never plan to dev on a phone or tablet, so games will draw me in. I'm not sure I'll ever want to run REAL apps (Photoshop, Dreamweaver, etc.) on a puny screen. But I'll want HDMI out to my big-screen TV for some quick fun on the road or at home on the couch :)

    LOL @ this:
    "It's a good time to be an Apple user. As an Android user myself, I only have A9s (or Krait) as an option"

    So the NV/TI chips suck, but you bought one based on some Android-related chip. I call BS. Is this another login for you, Tony Swash? ;) Itanium is great tech, but no dev cares. The software and experience we get from the device are what matter to end users. The chip inside? Only us nerds know ;)
  • steven75 - Thursday, September 13, 2012 - link

    There is so much uninformed opinion in this comment that I have to ask: have you ever read AnandTech before?
  • name99 - Thursday, September 13, 2012 - link

    "But devs have to worry about % of users that can buy their app/game. They don't really care about apple's bottom line, they care about their own.
    http://news.cnet.com/8301-1035_3-57510994-94/googl...
    The sheer number of potential customers is staggering and continues to grow on android. If apple doesn't start writing checks to devs soon, they will be in a world of dev hurt, as they head to where the masses are sitting (and Nvidia is tossing money at them exacerbating the issue)."

    So your solution to the (minuscule) developer problem of iOS fragmentation is to switch to the (massive) problem of Android fragmentation?
    Your solution to developers (supposedly) not making money on iOS is to switch to the Android marketplace where piracy is far more common and customers are willing to spend far less?

    Good luck with this economic plan.
  • Dazex - Thursday, September 13, 2012 - link

    A bit long of a comment. I think your intentions are good and you have some good points, but sometimes they get convoluted with incorrect facts and personal opinion. I am sure you meant well, though.

    For instance...
    Google shared that, as of June 2012, they had 400 million Android devices.

    Apple shared that they closed out last quarter, ending June 2012, with 400 million iOS devices.

    We know Android is growing fast. But looking at those two numbers, it isn't as "staggering" as you thought. One thing I do know: there is a much wider gap between the lowest-end Android devices and the most premium ones than there is on the iOS side. And that counts for more when deciding which platform to develop for than the sheer numbers alone.
  • mrtanner70 - Wednesday, September 12, 2012 - link

    It's a bit more complicated than that, and Samsung clearly said Exynos would be the first A15 SoC on the market. It's not in the Note 2, so.......
  • gunblade - Wednesday, September 12, 2012 - link

    TI was also widely said to have the earliest Cortex A15 silicon out. However, the earliest mass produced and shipped standard licensed Cortex A15 based design is the Apple A6. It is clocked at 1.5Ghz and using Rogue graphic core.
  • tipoo - Wednesday, September 12, 2012 - link

    " However, the earliest mass produced and shipped standard licensed Cortex A15 based design is the Apple A6. It is clocked at 1.5Ghz and using Rogue graphic core."

    Citation needed. Rogue was supposed to be 10-20x faster; they could do much simpler things to get to 2x. It could just be the MP4 shrunk down.
  • gunblade - Wednesday, September 12, 2012 - link

    That was a wild guess. Thinking back, there is no way Apple would take chances on both the CPU and GPU in a chip that is going to ship in this high a volume.
    Therefore, I think it is most likely a Cortex A15 with the SGX543MP2 GPU tweaked to double the clock rate (500MHz).
  • TheJian - Wednesday, September 12, 2012 - link

    You sound so definitive in your previous post "However, the earliest mass produced and shipped standard licensed Cortex A15 based design is the Apple A6"

    Then it becomes what it is..."a wild guess". Nuff said. At least you threw in the words "I think" in your next one :)
  • gunblade - Thursday, September 13, 2012 - link

    I am pretty sure about the Cortex A15 part, since a few of my ex-coworkers joined their SoC validation team a year ago and they were rushing to get silicon working late last year. The Rogue-based assumption is the wild guess, and tipoo rightfully pointed out that it is very unlikely if the performance only increases two-fold. Given that quite a few SGX543/544-based GPUs scale to over 532MHz on the TSMC 28nm process, I think it is quite safe to assume that Apple, if needed, can get good enough yield to hit 500MHz on their new SoC.
  • nathanddrews - Wednesday, September 12, 2012 - link

    I'm really a fan of the live blogging, great stuff! The pictures, too! Keep it up!

    (I probably won't be buying any iProducts though...) :P
  • snoozemode - Wednesday, September 12, 2012 - link

    If it's twice as fast, it seems more likely that it is a quad-core Cortex A9.
  • MrSpadge - Wednesday, September 12, 2012 - link

    Nah, quad core is mostly useless in a phone (now and for some time to come).
  • TheJian - Wednesday, September 12, 2012 - link

    Quad, dual, etc... heck, 100 cores... means nothing (just like the chip name); it's about the experience the device gives and how long you get that experience (battery power). The one with both being the best wins. Unfortunately, both are kind of all over the place with no clear winner yet (will there ever be?... LOL). I'd prefer to see Apple's iPad battery life on my Tegra-based tablet (Wayne?). I really wouldn't mind them slipping in a thicker battery to get it. I don't require paper-thin devices that screw me out of battery life just to be thin.

    If Apple starts spending money on games, they'll win my money early next year or Christmas 2013. I'm still waiting on a good (affordable) 10in. The iPad isn't what I'd call affordable (exact opposite, but the Surface, a 10in Nexus, and the Kindle Fire HD should bring Apple back to reality soon). Nobody can really complain about the experience on any of the latest devices. They may continue to get better (surely), but none will be leaps and bounds above the others, no matter the brand/chip. We're nearing console-like performance. Until the "killer app" comes along, what do we need more power for beyond, say, next Christmas's (2013) devices? You go to your PC for real power. These are just fart-around devices we carry. You don't edit photos etc. for 8 hours at work on an iPad/iPhone. If an employee tried to do that, I'd tell them to get back on their PC/Mac and get some real work done, or they're fired :)
  • Lucian Armasu - Wednesday, September 12, 2012 - link

    Unless Samsung gave Apple their Exynos 5 Dual CPU, which would be pretty shocking, I have a really hard time seeing how this is based on the Cortex A15. I still think it's a quad-core A9, which would also deserve a new A6 name, since the A5X was dual-core. What would they have named it otherwise? A5X2? Sounds a little silly.
  • gunblade - Wednesday, September 12, 2012 - link

    Well, given the 22% shrink in die area, it has to be a different core configuration. Coupled with the fact that a higher clock speed increases dynamic switching energy superlinearly (voltage generally has to rise along with frequency), it is very clear that the A6 has to be a Cortex A15-based design in order to reach the 2x performance claim.
  • tipoo - Wednesday, September 12, 2012 - link

    But it's also on a smaller fabrication process, and the CPU cores are actually a minority part of the SoC die. I think it could still be smaller than the old one with four A9s.
  • Lucian Armasu - Wednesday, September 12, 2012 - link

    Of course it's smaller. A5 was made at 45nm. This one would be made at 32nm
  • Exophase - Wednesday, September 12, 2012 - link

    But Cortex-A15s take up more space than Cortex-A9s - I don't know how much, but with all of the changes it has to be substantial. I would not be that surprised if one takes up twice as much space, excluding L2 cache.
  • name99 - Wednesday, September 12, 2012 - link

    Why do they need Exynos? Why can't Apple get the core from ARM, add on their various other blocks, and just submit the whole thing to a fab?

    People seem to believe Apple can't have become a "real" fabless SoC house; but we know they have bought a bunch of companies in this space; we know it is something they have long wanted to do; we know they have been leading up to this for the past five years, designing and customizing more and more of what goes into their SoCs.
  • iwod - Thursday, September 13, 2012 - link

    While it is true you could still get 2x CPU performance from a quad-core A9, Anand commented that the performance-increase examples aren't multithreaded in nature, which means it has to be either double the frequency or an A15 at a slightly higher frequency than the current A5 SoC.
  • BuddyRich - Wednesday, September 12, 2012 - link

    But the ARM Cortex A15 core was not confirmed, just speculation on Anand's part? I wonder when we'll get confirmation one way or another.
  • tipoo - Wednesday, September 12, 2012 - link

    Just speculation
  • zorxd - Wednesday, September 12, 2012 - link

    It wouldn't be the first time speculation was wrong about the iPhone. Everybody assumed the 4S had 1GB of RAM, since every other iteration had doubled the RAM. It turned out it had only 512MB. Also, everybody assumed that the iPhone 4 had a 1GHz processor like the iPad, while it was really only about 800MHz.
    Apple only said that the CPU was twice as fast. So it could be a dual A9 clocked at 1.6GHz, a dual A15 at an unknown frequency (probably around 1.2GHz), or a quad A9 clocked as low as 800MHz. We will see.
  • name99 - Wednesday, September 12, 2012 - link

    I think it's worth looking at the tasks they showed as sped up. They're all IO related. I suspect PART of what is going on here is flash that is 2x as fast. (And, last time I checked, Apple's current flash is about 2x as fast as the flash used in competitor high-end phones.) So that's part of it, and that's the kind of thing you can use to show 2x speedups even if your per-thread performance isn't 2x.

    But they do specifically say "2x faster" CPU, whatever that means...
    An alternative possibility to A15, which I raise simply for completeness, is that they may have dusted off their old macroscalar patent and applied it to the A9. My interpretation of macroscalar was that it was a form of ISA-visible hyperthreading, and you could argue that, done well, plus say a 50% clock boost, it gets you to a 2x speedup (at least for certain types of well-parallelized code) in the available area.

    Against that argument, we have the problem that SMT works best as part of a full superscalar OoO setup; trying to graft it onto the Pentium-style paired dispatch of the A9 is going to give some boost (mispredicted branches, cache misses) but not that much. On the third hand, you have to start somewhere, and maybe Apple fits it into the A6 with an A9 core as a trial run for when they REALLY care about using it, in the A15, when they can give 4 virtual cores in 2 cores' worth of area and power?
  • tipoo - Wednesday, September 12, 2012 - link

    Could they not have just crammed double the old A9 cores in there, and still had it smaller since it's on a smaller fab process?
  • tipoo - Wednesday, September 12, 2012 - link

    Or even upped the clock speed plus other enhancements. No way of saying right now that it's A15 for sure.
  • ufon68 - Wednesday, September 12, 2012 - link

    To reach the 2x speed increase, the frequency would have to go substantially higher than it currently is, which would probably cause higher power consumption, or about the same at best, even considering the die shrink from the smaller manufacturing process. It doesn't add up.
  • ufon68 - Wednesday, September 12, 2012 - link

    Simply put, I don't think you can double the speed and lower the power consumption at the same time through a die shrink alone.
  • tipoo - Wednesday, September 12, 2012 - link

    The 32nm A5 in the shipping iPad 2 consumes significantly less power. And the extra cores would be power-gated.
  • Lucian Armasu - Wednesday, September 12, 2012 - link

    It's not like it would be the first time Apple overhyped something. Remember when they were saying the A5's graphics were 9x faster than Tegra 2? Sure, that was true in some very specific and light tests, but overall it was only like 2x better or something.

    The "2x increase in performance" claim could come from it being quad-core, too. That's pretty much how Nvidia markets its Tegra 3. Don't read too much into it.
  • Exophase - Wednesday, September 12, 2012 - link

    They said it was 9x faster than the A4 (SGX535), which was a fairly valid assessment. They didn't say anything about Tegra 2.

    The 2x claims are backed up by at least some software figures, so I don't think they're just playing fake marketing numbers on this one. That doesn't rule out quad core, though; I could easily see both quad core and higher clocks.
  • 1008anan - Wednesday, September 12, 2012 - link

    Brian and Anand, what do we know about the A6 specs?
    --Can we confirm that they are all fabricated at 28nm by Samsung?
    --I think we can confirm that there are two Cortex A15 cores?
    --Is there a lower-performance, low-power companion core that performs functions while the two A15 cores are in sleep mode? [Similar to the two Cortex M4 cores in the OMAP 5]
    --What do we know about graphics (are the rumors of Rogue correct)?
    --Single-precision gigaflops?
    --Triangles per second?
    --Max wattage TDP?
    --Idle power wattage?
    --Maximum megapixels for the front and rear cameras supported by the SoC?
    --LTE baseband wattage TDP?
    --Wi-Fi wattage TDP? (Brian, woo-hoo, real Wi-Fi at last!!!)
    --Other SoC performance metrics?

    Apple has matched TI (TXN) as one of the first two vendors to bring the Cortex A15 to market. However, Apple is the first to introduce the A15 in huge volume. More than 30 million before this year is out (maybe more than 40 million). Many millions this month. Amazing.
  • tipoo - Wednesday, September 12, 2012 - link

    Pretty sure it is 32nm, not 28, as they were ramping up 32nm production with the ATV and iPad 2.
  • menting - Wednesday, September 12, 2012 - link

    So "2X faster" means 3x the speed, right?
    Because if that's not the case then, by the same logic, "50% faster" would mean only half the speed.
    Or are they misleading on purpose?
  • mavere - Wednesday, September 12, 2012 - link

    I think your mathematics got stuck somewhere in elementary school.
  • menting - Wednesday, September 12, 2012 - link

    How so?
    0.1X faster = 1.1X the original speed.
    0.5X faster = 1.5X the original speed.
    1X faster = 2X the original speed.
    2X faster = 3X the original speed.

    So I'm just trying to get clarification that when Apple says "2X faster" in their slide, they actually mean 3X the original speed, and not just 2X the original speed, which would be "1X faster".
  • tipoo - Wednesday, September 12, 2012 - link

    No, just look at Apple's history to know what they mean. When they say 2x faster, they mean 200% as fast.
  • menting - Wednesday, September 12, 2012 - link

    Wow... what a history of misleading, then. Their engineers must hate their PR department.

    If one day they say "it is 100% faster!!!", then it really means it's not any faster.
  • Lucian Armasu - Wednesday, September 12, 2012 - link

    That's not how you do it. No one says "1x faster". That expression just doesn't exist - in any language. It's "2x faster".

    I think you're confusing it with saying "the old chip has HALF its performance, therefore the new one is 100% faster, or 2x faster". I don't really know what you meant, but what you're saying makes no sense.
  • lukarak - Wednesday, September 12, 2012 - link

    Actually, you should say two times as fast, not two times faster.
  • mavere - Wednesday, September 12, 2012 - link

    But if the number is 4 or 5, saying "4 times faster" or "5 times faster" would be well understood is day-to-day language.

    Using "twice" is more a matter of being succinct.
  • mavere - Wednesday, September 12, 2012 - link

    in*
  • menting - Wednesday, September 12, 2012 - link

    It's not. Definitely not in high school or higher-level mathematics or engineering-related work.
  • mavere - Thursday, September 13, 2012 - link

    I can walk into any engineering course at my alma mater or my local high school and talk about things being "two times faster" or "three times faster" and everyone would be exactly on the same page.
  • menting - Wednesday, September 12, 2012 - link

    exactly
  • menting - Wednesday, September 12, 2012 - link

    Why would you think "1X faster" does not exist? Does "100% faster" not exist? 100% is exactly 1.0, which is 1X.

    If the old chip has HALF its performance, the new one would be "100% faster", or "2X as fast", definitely not "2X faster".

    I think you're getting yourself confused with the definition.
  • Lucian Armasu - Thursday, September 13, 2012 - link

    Am I?

    Let's use these examples:

    A6 is 50% faster than A5 - 1.5x faster
    A6 is 80% faster than A5 - 1.8x faster
    A6 is 90% faster than A5 - 1.9x faster

    A6 is 100% faster than A5 - 2x faster

    Now tell me how I'm wrong. I think you must be confusing the expression "80% AS fast" with "80% FASTER". They mean completely different things.

    A6 being 80% as fast as A5 = 0.8x faster (or 20% slower)

    A6 being 80% faster than A5 = 1.8x faster

    Expressions like "100% AS fast or 1x faster" are never used in real life, only in very few math problems, and they both mean "equally as fast". But "100% FASTER" means twice as fast.
  • mavere - Wednesday, September 12, 2012 - link

    You're twisting a common English phrasing.

    I realize that English may not be everyone's first language, but I'm not sure how to respond, as the premise of our argument is absurd.
  • mavere - Wednesday, September 12, 2012 - link

    your*
  • lukarak - Wednesday, September 12, 2012 - link

    Is it now?
  • silverblue - Thursday, September 13, 2012 - link

    Definitely correct the first time.
  • MykeM - Thursday, September 13, 2012 - link

    I believe he's correcting the typo on "our argument is absurd" which instead should've read "your argument is absurd". It's not a correction for "You're twisting...".
  • silverblue - Monday, September 17, 2012 - link

    Good catch.
  • lilmoe - Wednesday, September 12, 2012 - link

    50% "faster" = 1.5x "as fast" (one and a half times as fast)
    100% "faster" = 2x "as fast" (two times as fast)

    You're right in your assumption. It's a common mistake.
  • iwod - Thursday, September 13, 2012 - link

    OK, so is it a common mistake? Or does the English phrase actually mean that? Or has the mistake become so common that we just accept it as the correct meaning?

    I once thought that as well: 100% faster means 1 (its original speed) + 1 (100% of its original speed), which is twice as fast, or 2x the speed of the previous SoC, or double the performance.
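
    Written out as a formula, the convention this sub-thread keeps circling (just the standard reading of the phrase, not anything Apple has clarified):

        \text{speed}_{\text{new}} = \left(1 + \tfrac{N}{100}\right)\,\text{speed}_{\text{old}} \qquad \text{for ``}N\%\text{ faster''}

    So "100% faster" works out to twice as fast, while marketing's "2x faster" is almost always intended as "2x as fast" rather than the literal "3x as fast".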
  • nerd1 - Wednesday, September 12, 2012 - link

    The iPhone 4S drove its two A9 cores at 800MHz AFAIK, and it won't be too hard to drive two A9 cores at 1.5GHz (so a 2x performance gain) without spending too much power. The GS3 and GN2 already use four A9 cores at 1.4-1.6GHz with decent battery life, so I think this is a better conjecture than world-first A15 cores.

    And if so, the naming is more likely due to an integrated LTE radio, not A15 cores.
  • NCM - Wednesday, September 12, 2012 - link

    If there's one thing we know for sure about the A6, it's that we won't know what an A6 actually is until someone does surgery on an iPhone 5 to either cut open or X-ray the SoC. And Apple isn't telling. (OK, was that two things? <g>)

    Until then we're just playing with ourselves.
  • syxbit - Wednesday, September 12, 2012 - link

    There is no integrated LTE radio. It's an external Qualcomm chip.
  • kpb321 - Wednesday, September 12, 2012 - link

    Apple hasn't shown any past inclination to be first with something, especially when it comes to the processor in the iPhone. There are plenty of simpler ways they could claim twice the performance of their previous CPU.

    1) A customized, tweaked version of the A9 core, similar to what Qualcomm has done with the Krait cores in the S4s.

    2) Go quad-core over dual-core. There are plenty of quad-core A9 implementations, and it is not too unreasonable to claim a phone with twice the cores is twice as fast, especially if they have worked on how well iOS 6 handles multithreading.

    3) Crank up the clock speed. As others have posted, the iPhone 4S was running at 800MHz. A 1.6GHz dual-core A9 processor is certainly possible.

    4) You could even combine 1 and 3 to get a tweaked design that is faster per clock and runs at a higher clock speed to reach 2x performance. Offhand, this would be my guess: ~50% improvement from the tweaked design, ~50% from clocking it up to 1.2GHz.
  • Moizy - Wednesday, September 12, 2012 - link

    I doubted at first, but I think Anand's right. Besides, he's on twitter right now hinting he knows more than he can tell, but what he knows confirms we're looking at Cortex A15s.

    If Apple used A9s and 543 graphics, then they could have chosen to pursue 2-3 of the 4 accomplishments they claimed today: 2x compute, 2x graphics, 22% die shrink, and maintained to slightly improved battery life. With A9s and 543 graphics, even at 32nm, it would be impossible to achieve all four of those things together.

    With the GS3, Samsung achieved 3 of those things (roughly speaking, 2x compute, 2x graphics, and maintained power efficiency) over the GS2, but does anyone know the die size of the Quad Exynos vs the Dual? I doubt the Quad is 22% smaller.

    Thanks Anand and Brian for the coverage, and I've enjoyed the podcasts, keep up the stellar work.
  • tipoo - Wednesday, September 12, 2012 - link

    " but does anyone know the die size of the Quad Exynos vs the Dual?"

    But the quad and dual Exynos are both on the same fab process, while this went from 45 to 32. But I do think it's A15 since Anand seems eager to tell us outright on twitter.
  • Moizy - Wednesday, September 12, 2012 - link

    Exynos Quad was a 45 to 32 die shrink as well, thus it was able to increase performance without hammering battery life:

    http://www.anandtech.com/show/5786/exynos-4-quad-1...
  • mrtanner70 - Wednesday, September 12, 2012 - link

    If it were quad-core, it would have been a marketing bullet point. Apple does not like to focus on specifics, but they would not hide the dominant marketing phrase in Europe for high-end phones.

    So, it's dual.

    It is thus either a much higher-clocked, die-shrunk, tweaked version of the A5 (so ARM A9), or it's the A15. Given that we know Samsung has started to mass-produce dual A15s, it's a bit hard for me to believe Apple would go another year on the A9. Maybe the timing was perfect for them to be first?

    The GPU is even more interesting.
  • Moizy - Wednesday, September 12, 2012 - link

    GPU is most interesting, agreed. Benchmarks! Benchmarks! Details! Details!
  • tipoo - Wednesday, September 12, 2012 - link

    With 2x the graphics power a shrunk down 543MP4 would easily fit the bill, or maybe a higher clocked MP2.
  • Moizy - Wednesday, September 12, 2012 - link

    543MP4 would give 2x performance, but would not allow, even with 32nm, for a 22% overall die shrink. So a higher clocked 543MP2 would maybe work. My money is on Rogue, but maybe tackling Rogue and Cortex A15s at the same time would be too much, and we'll see Rogue with the next iPad.
  • tipoo - Thursday, September 13, 2012 - link

    Rogue was supposed to have much bigger performance gains than this though, plus I don't think it's finalized yet.
  • lilmoe - Wednesday, September 12, 2012 - link

    Are you declaring the A6 to be Cortex A15 because you know? Because if it's just an assumption, you make it seem as if it were fact... Not very professional.

    If this is a dual-core 800MHz-1GHz Cortex A15, then a dual-core Krait @ 1.5GHz should be faster. I'd say the dual-core Snapdragon S4 Pro might be a better chip... We'll have to wait and see.
  • tipoo - Wednesday, September 12, 2012 - link

    Even with only ~800MHz dual A9s, the 4S was pretty competitive with much faster phones in benchmarks; only phones with huge advantages (Krait, Tegra 3 to an extent) pulled ahead. If it's an A15 at a similar clock rate, that's still good progress, and the low clock rate saves battery life.
  • lilmoe - Wednesday, September 12, 2012 - link

    For the graphics part, yes; for the rest? No.

    And no, SunSpider isn't a measurement of the overall speed of the browser...
  • darkcrayon - Wednesday, September 12, 2012 - link

    SunSpider doesn't have to be a measurement of the overall speed of the browser. All you have to do is swipe or double-tap to zoom on most of them to see how choppy most of them perform compared to the iPhone. But that's more of a software than a hardware comparison.
  • tipoo - Thursday, September 13, 2012 - link

    SunSpider, Peacekeeper, BrowserMark, even the IE9 Test Drive demos (well, I guess that one is GPU): it always does better than you would expect from a dual 800MHz A9.
  • zorxd - Thursday, September 13, 2012 - link

    These are all browser benchmarks, not CPU benchmarks. In real CPU benchmarks, the Apple A5 performs exactly as it should, at 2/3 the speed of higher-clocked Cortex A9s.
  • g1011999 - Thursday, September 13, 2012 - link

    Xcode 4.5 now defaults to building binaries for the armv7/armv7s architectures, which means they have a new CPU core other than the Cortex A9.

    armv7 ==> Cortex A8, Cortex A9
    armv7f ==> Cortex A9MP
    armv7s ==> Cortex A15
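
    As a rough illustration of how that split is visible from the developer side (a hypothetical sketch; the __ARM_ARCH_7A__/__ARM_ARCH_7S__ names are assumed to be the compiler's predefined architecture macros, so check your own toolchain's output of clang -dM -E before relying on them):

        /* Hypothetical compile-time check of which ARM architecture slice this
           translation unit is being built for; the macro names are an assumption
           about the toolchain's predefined macros, not Apple documentation. */
        #include <stdio.h>

        int main(void) {
        #if defined(__ARM_ARCH_7S__)
            puts("Built for armv7s (the new slice Xcode 4.5 adds by default)");
        #elif defined(__ARM_ARCH_7A__)
            puts("Built for plain armv7 (Cortex A8/A9 class)");
        #else
            puts("Built for some other architecture (e.g. the simulator)");
        #endif
            return 0;
        }

    In a fat binary the loader picks the best-matching slice at runtime, which is why shipping both armv7 and armv7s keeps older devices working.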
  • stfual - Wednesday, September 12, 2012 - link

    I think if it were quad-core, Apple would have said so. They only talk about tech specs when their numbers are bigger (like screen resolution). It clearly works: if you look at Google News today, it's surprising how many outlets run "iPhone 5 - 2x performance" headlines with no technical specs to support it. Apple is the best marketing company in the world.
  • surgexx - Wednesday, September 12, 2012 - link

    Sorry Anand, you're wrong.
    The "lightning" connector is USB 2.0 still (proof: http://store.apple.com/us/product/MD818ZM/A/lightn...

    "This USB 2.0 cable connects your iPhone or iPod with Lightning connector to your computer's USB port for syncing and charging or to the Apple USB Power Adapter for convenient charging from a wall outlet"

    A15 arch. is USB3 on the HOST side :)
  • gunblade - Wednesday, September 12, 2012 - link

    The A15 is simply a CPU core, and it has nothing to do with what is hanging off the AXI bus. USB 3.0 IP is quite common, and if Apple desired they could easily license it from multiple vendors (Synopsys, etc.). I think it is more that the extra SerDes power consumption is not justified in this case for the speed increase.
  • surgexx - Thursday, September 13, 2012 - link

    Right, but if it's available and all the new Macs have USB 3.0, why not use it on the SoC?
  • surgexx - Thursday, September 13, 2012 - link

    Especially when the Samsung Exynos 5 *HAS* USB 3.0 on the host side... I just don't think Anand is right here -- either that or he knows for sure and can't reveal his source (which sucks for us), but I just don't think he's right.
  • tipoo - Wednesday, September 12, 2012 - link

    I guess he knows it's A15, just can't tell us yet to protect the source. I'll take his word for it then. That's good, more useful than four A9 cores.
  • Fx1 - Thursday, September 13, 2012 - link

    Your title sounds like you are certain, but your comments sound like you know jack shit.

    Your blog was the only one in the world to claim A15 cores.

    If you're wrong, then you're basically responsible for the biggest piece of misinformation about the A6 in the entire world.

    I find this irresponsible.
  • huh!! - Thursday, September 13, 2012 - link

    You do know this is a blog entry, right?
  • Lucian Armasu - Thursday, September 13, 2012 - link

    I have to agree. Half the people reading not only this post, but also all the other posts that repeated the same thing because "AnandTech said it" (proof: http://www.extremetech.com/mobile/136085-whats-ins... ), will think it has A15 CPUs even AFTER AnandTech and everyone else finds out it's actually still something based on the A9, if they are indeed wrong in their assumption.

    A lot of people will miss the update and will be buying the iPhone thinking it has A15s.
  • tipoo - Thursday, September 13, 2012 - link

    Something tells me Anand is more sure than he can say, but we'll see.
  • zorxd - Thursday, September 13, 2012 - link

    It's a guess. Even if he happens to be right, it's still a guess as of today.
  • steven75 - Thursday, September 13, 2012 - link

    From your comments in this thread, it sounds more to me like you actively want it to be untrue because it will clearly be better than what's shipping in current Android phones, and Android fans DO NOT like to lose in spec wars and checklist bullet-point battles. (Even though, um, rational people agree the only thing that matters is the end result.)
  • thakathinka - Thursday, September 13, 2012 - link

    Agree. This is highly misleading. Anand's comment is being taken verbatim by other sites. There's no data suggesting this is an A15 at all.

    I'd consider this to be at 28nm, for (roughly) a 25% shrink in die area. Making it a quad A9, plus other improvements, brings it to a 22% die shrink.

    Check out:
    http://en.wikipedia.org/wiki/Talk:Apple_A6
  • AngelOfTheAbyss - Thursday, September 13, 2012 - link

    If Apple still uses Samsung as a foundry, the process is most likely 32nm HKMG (down from 45nm in the 4S). The A5X (new iPad) with two A9s and an SGX543MP4 on 45nm was a staggering 165mm2, but on 32nm it would be roughly half that. Replacing the A9 cores with A15 cores would increase the die size to something like 100mm2, which fits the bill since the A5 comes in at 122.2mm2. The comparisons in the presentation are between the A5 (4S) and the A6 (5), not between the A5X (new iPad) and the A6. Should the process be 28nm, it would imply they have gone to TSMC (or, less likely, GloFo). (Putting on my tin foil hat now.) All of a sudden we would have an explanation for why it's so hard to come by an NVIDIA card... (hat off) Then again, does it really matter what secret sauce they have added as long as they meet their claim of 'twice the speed'?
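
    As a back-of-the-envelope check on that "roughly half" figure (assuming ideal area scaling with the square of the feature-size ratio, which real shrinks rarely achieve exactly):

        A_{32\,\mathrm{nm}} \approx A_{45\,\mathrm{nm}} \times \left(\tfrac{32}{45}\right)^{2} \approx 165\,\mathrm{mm}^2 \times 0.51 \approx 83\,\mathrm{mm}^2

    which is roughly where the ~100mm2 guess lands once the larger A15 cores are added back in.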
  • sunnyfpy - Friday, September 14, 2012 - link

    The iPhone 5 launch is a non-event. Outdated design, no removable battery, no microSD slot, no radio, small screen.
  • Commodus - Friday, September 14, 2012 - link

    How's it an outdated design? Changing cosmetics for the sake of saying it looks different is vain. I thought the stereotype was that Apple fans were too superficial. Guess it applies to Android fans, too.

    Not having a removable battery or microSD slot has served Apple just dandily for five years, and normal buyers don't balk at those as limitations. FM radio? Really? What era do we live in again?

    And some of us enjoy using our smartphones with one hand (this is coming from someone who has a Galaxy Nexus, even).
  • blakespot - Friday, September 14, 2012 - link

    Come on... It's a significant step down the road of evolution. You want a screen as large as that cumbersome Galaxy Note?? And ... radio??

    Here's a nice little post for you...
    http://www.macworld.com/article/1168591/iphone_5_a...
  • aegisofrime - Friday, September 14, 2012 - link

    I was looking at pdadb.net; here's their listing for the A6:

    2x ARM Cortex-A15 Harvard Superscalar processor core, 64/32-bit Multi-layer AHB/AXI bus, ARM TrustZone, ARM NEON SIMD engine, LPDDR2 SDRAM interface, NAND flash, moviNAND, SATA, eMMC interface, embedded GPS module, HDMI, triple display controller, 1080p video encode, 1080p video decode, audio subsystem, dual-core PowerVR SGX6200MP2 GPU

    It seems almost too good to be true. Cortex A15 and SGX6200 (a.k.a. Rogue)? It will be a real beast if it turns out to be true.
  • note235 - Friday, September 14, 2012 - link

    If that is true, I'm going to... still get one, haha.
  • jconan - Saturday, September 15, 2012 - link

    Even before the court hearings, Apple and TSMC announced that they were working on the A6. I suspect the A6 could be one of the fruits of that Apple and TSMC work.
