
Thinking about getting a new CPU and motherboard.

Discussion in 'Technical Discussion' started by Hez, May 22, 2012.

  1. Conan Kudo

    「The truth is always one!」 Shinichi Kudo Member
    Hold on, people! AMD may be gunning for the mobile CPU side, but on the desktop side, AMD is still working on it!

    It's just that AMD is no longer building standalone CPUs. The big focus now is on AMD's APU platform. They call it AMD FUSION (or AMD VISION), and it combines a Radeon GPU with a Phenom-class CPU on a single chip that can still be upgraded.

    It's not fair to compare Sandy Bridge to Phenoms because they aren't equivalent parts. To compare performance-per-watt, you should be comparing the AMD Fusion APUs with Sandy Bridge. And in that space, AMD beats Intel: the raw GPU power of the Radeon side far outperforms Intel's integrated graphics.

    AMD A8-3850 (Quad core 2.9GHz with Radeon HD 6550D), $110: http://www.newegg.co...N82E16819103942
    Gigabyte GA-A75M-S2V FM1 motherboard, $65 after MIR: http://www.newegg.co...N82E16813128516

    Compare that with the equivalent:

    Intel Core i5-2300 (Quad core 2.8GHz with Intel HD Graphics 2000), $180: http://www.newegg.co...N82E16819115076
    Gigabyte GA-B75M-D3V LGA 1155 motherboard, $70: http://www.newegg.co...N82E16813128540

    Sure, Intel has more powerful CPU+GPU combos with Sandy Bridge, but they're also a lot more expensive, and the cheaper ones don't measure up performance-wise to AMD's offerings either. And AMD has an ace in the hole, too: you can CrossFireX an AMD APU with dedicated Radeon HD GPU cards. You can never do that with Intel's Sandy Bridge chips. That means you can get triple the performance instead of double, since you can combine two dedicated GPU cards with the GPU on the APU.

    More and more high-end programs are running code on the GPU through a GPGPU platform like OpenCL. This is done for codecs, for A/V processing, and for intense mathematical computation. Having a quality APU and GPU combination will mean more once you take those into account.
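    If you've never seen what "running code on the GPU" actually looks like, here's a rough sketch in C using the OpenCL host API. It just squares an array of floats on whatever GPU the driver reports first; the array size and kernel are made-up placeholders, and all error checking is left out to keep it short:

    /* Minimal OpenCL sketch: square an array of floats on the GPU.
     * Hypothetical example values; error checking omitted for brevity. */
    #include <stdio.h>
    #include <CL/cl.h>

    static const char *src =
        "__kernel void square(__global float *buf) {\n"
        "    size_t i = get_global_id(0);\n"
        "    buf[i] = buf[i] * buf[i];\n"
        "}\n";

    int main(void)
    {
        float data[1024];
        for (int i = 0; i < 1024; i++) data[i] = (float)i;

        /* Pick the first platform/GPU the driver reports. */
        cl_platform_id plat;
        cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* Compile the kernel source at runtime for whatever GPU is present. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "square", NULL);

        /* Copy data to a device buffer, run 1024 work-items, read it back. */
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                    sizeof(data), data, NULL);
        clSetKernelArg(k, 0, sizeof(cl_mem), &buf);

        size_t global = 1024;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

        printf("3 squared on the GPU = %f\n", data[3]);

        clReleaseMemObject(buf);
        clReleaseKernel(k);
        clReleaseProgram(prog);
        clReleaseCommandQueue(q);
        clReleaseContext(ctx);
        return 0;
    }

    Real codecs and A/V filters do the same thing, just with much bigger kernels and with the data staying on the GPU between steps.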

    FYI: For those who want to be able to overclock, AMD has the AMD A8-3870K available for $120: http://www.newegg.com/Product/Product.aspx?Item=N82E16819106001
     
  2. HeartAttack

    is a smug hipster, brah! Member
    You are absolutely right in this regard. In a sense. Grasping at straws, I'd say one big thing that should be taken into account is overclockability. Most of these Intel chips are locked, whereas AMD always has a "Black Edition" with an unlocked multiplier, tweakable to your heart's content. Most of them overclock extremely well, too, on air alone. Like I said before - core for core and clock for clock, Intel wins - but overclock the Phenom II and the Intel advantage shrinks to little or nothing.

    Now, more importantly - the benchmarks between the two chips are hardly apples to apples. For one, you'll notice in the notes underneath each chip that the Intel chip had "turbo enabled", which automatically overclocks the chip under load, while the AMD chip is noted as having no overclocking applied in any way. Also, the Intel chip was tested in 64-bit Windows Vista as well as Windows 7 x64 (which has vastly improved prefetching and scheduling over Vista), while the Phenom II X2 was tested only in the 32-bit version of Vista and not in 7.

    Quite simply: the benchmarks you've given are absolute garbage until we level the playing field a bit.
     
  3. AamirM

    Tech Member
    There is not much truth to that either. Intel is not trying to compete with AMD in the desktop APU market to begin with. Secondly, your comparison isn't fair. Here is a similarly priced Intel desktop combo that will eat your AMD APU combo for lunch...

    Intel Pentium G620 Sandy Bridge $70
    BIOSTAR H61MGC LGA 1155 $50
    MSI Radeon HD6670 $75

    As I said, it is hard to recommend any AMD product these days. It loses on every front to Intel.
     
  4. AamirM

    Tech Member
    Are you serious? Ok then, here is the first search result I got for Core i3-2100 benchmarks. It doesn't compare the 2100 to the Phenom II X2; it compares it against a QUAD-CORE Phenom II X4 955, because... well, even THAT lost to this dual-core processor. And this time Windows and everything else were the same. You're just trying to deny the facts here. Fanboy much?

    If anyone still doesn't believe me, go google it yourself, because I am tired now. :P

    EDIT: Oh, and as for your overclocking argument: instead of buying a $30 cooler to overclock your AMD, you can spend that $30 on a higher factory-clocked Intel processor, so now your overclocked AMD loses again (and that's without voiding your warranty, with lower power draw AND potentially lower noise).
     
  5. Aerosol

    Not here. Moderator
    Hmm. Those AMD APUs look interesting. They seem to be a bit behind when it comes to gaming performance, though it's difficult to find benchmarks for them with setups that take advantage of Fusion's benefits. At least, if I'm right in saying that you won't really see performance gains in a Fusion system until you start crossfiring two Radeons with one of these APUs?

    What I'm trying to say is that most things aren't really optimized for an APU like this, so it's hard to recommend one for a gaming build right now.

    EDIT: And what happens if Intel and Nvidia decide to jump into bed together? :v:
     
  6. HeartAttack

    is a smug hipster, brah! Member
    I won't deny the facts at all - I even said you were right in that regard.

    While Bulldozer is nothing amazing, the chips are dirt cheap and improve upon the Phenom chips. They still don't beat Intel's current offerings clock-for-clock, but they do perform very well in highly threaded applications.

    Also, notice I highlighted the words "this time" in your post above. I wasn't denying facts with the benchmark that YOU posted. That YOU linked to. The fact is that the benchmark comparison YOU linked to the first time was garbage, pure and simple ;)
     
  7. From potentially leaked information, it's looking like Piledriver won't be amazing either. =\
     
  8. Thousand Pancake

    Being a food you put milk on and then eat in the m Member
    No viable alternatives in x86 land make Thousand Pancake a sad panda. :(

    When do you think ARM will become viable enough?
     
  9. Overlord

    Now playable in Smash Bros Ultimate Moderator
    ARM will become viable as a main desktop platform about the same time people stop caring about x86 compatibility.

    IE never =P
     
  10. Sik

    Sik is pronounced as "seek", not as "sick". Tech Member
    Unless Microsoft somehow manages to convince people that Metro is the best thing ever made so they drop their old stuff... (doubtful, but you can never tell, especially not with how confusing Microsoft's marketing is being lately)
     
  11. Conan Kudo

    「The truth is always one!」 Shinichi Kudo Member
    Well, it isn't like it isn't possible to dynamically recompile CISC x86 instructions to RISC ARM instructions on the fly. QEMU's emulation technology works by doing that. A more specialized form could be applied to Windows if Microsoft wanted to preserve Win32/Win64. Apple was rather famous for using such a technology for the PowerPC to x86 transition. It also already exists in Android and desktop Linux. But the truth is, Microsoft wants to drop the legacy APIs because they want to change the underpinnings of Windows yet again. The Midori project should be nearly ready for commercialization midway through Windows 8's lifecycle, which is just in time to consider switching out the internals of Windows for Windows 9. Microsoft wants to make the transition to a fully managed environment for its operating systems. Even the XBOX 720 is being prepped for a fully managed environment. Native code programming is a dying art in Microsoft land.
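    (If the dynamic recompilation part sounds like magic, here's a toy sketch in C of the general idea, using a completely made-up three-instruction "guest" ISA: the guest code is decoded once into a cached block of host operations, and that cached block is what gets executed from then on. A real translator like QEMU's emits actual native machine code, ARM or otherwise, instead of C function pointers.)

    /* Toy dynamic binary translation sketch. The guest ISA, opcodes and
     * block layout are all hypothetical; this only illustrates the
     * "translate once, run the cached block many times" idea. */
    #include <stdio.h>

    enum { OP_ADD_IMM, OP_MUL_IMM, OP_HALT };       /* made-up guest opcodes    */

    typedef struct { int op; int imm; } GuestInsn;  /* a fake guest instruction */
    typedef void (*HostOp)(long *reg, int imm);     /* one translated host step */

    static void host_add(long *reg, int imm) { *reg += imm; }
    static void host_mul(long *reg, int imm) { *reg *= imm; }

    /* A translated block: a real JIT would hold emitted machine code here. */
    typedef struct { HostOp fn[16]; int imm[16]; int len; } Block;

    static void translate(const GuestInsn *code, Block *blk)
    {
        blk->len = 0;
        for (int i = 0; code[i].op != OP_HALT; i++) {
            blk->fn[blk->len]  = (code[i].op == OP_ADD_IMM) ? host_add : host_mul;
            blk->imm[blk->len] = code[i].imm;
            blk->len++;
        }
    }

    static void run(const Block *blk, long *reg)
    {
        for (int i = 0; i < blk->len; i++)
            blk->fn[i](reg, blk->imm[i]);
    }

    int main(void)
    {
        GuestInsn guest[] = { {OP_ADD_IMM, 5}, {OP_MUL_IMM, 3}, {OP_HALT, 0} };
        Block cache;                  /* the "translation cache", one entry */
        long reg = 1;                 /* a single fake guest register       */

        translate(guest, &cache);     /* pay the translation cost once...   */
        for (int i = 0; i < 3; i++)   /* ...then reuse the cached block     */
            run(&cache, &reg);

        printf("guest register after 3 runs: %ld\n", reg);  /* prints 222 */
        return 0;
    }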


    Well, the APU itself is quite powerful. It's a Radeon HD 6000 series GPU. It's miles better than the GPU included on Sandy Bridge and Ivy Bridge. The major benefit is that if you install a dedicated Radeon HD card as well, you can CrossFire it to get far better performance. While I personally think anything more than two GPUs is overkill, I've seen people do 11 GPU CrossFireX combos (CrossFire with five Radeon HD 6870 X2 cards and an AMD APU). Supercomputer performance for under $2000 is pretty awesome, but obviously overkill.

    Because of the way that AMD structures CrossFire, pretty much all games can take advantage of it, and several non-games can too. But if you don't need CrossFire, the APU's GPU is powerful enough that you can leave the PCI Express slot open for future expansion.

    As for an Intel+NVIDIA tie-up? Not likely to happen. Aside from the fact that U.S. antitrust law would kick in and the Department of Justice would jump down Intel's throat for it, NVIDIA and Intel are actually on the outs. They have been nominal partners at best for the last few years, ever since NVIDIA decided it wanted to develop its own CPUs. Briefly, NVIDIA worked on designs for x86 CPUs, but then it jumped headfirst into ARM with Tegra. It also took some time before Intel approved NVIDIA chipsets for Intel motherboards after the latest socket change. The relationship between the two companies is quite strained.

    The likelihood of Intel buying out NVIDIA is low, and if such a thing came to be, the U.S. government would immediately block it, since Intel already has a graphics platform of its own. Intel doesn't actually need NVIDIA for anything. AMD bought ATI to create APUs, solve the IGP problem, and compete with Intel on the integrated-platform front. The fact is, Intel does not care much about its graphics platform, which is why it suffers so much compared to all of its competitors in the graphics field.
     
  12. Hez

    Oldbie
    so I have a PNY GeForce GTX 560 video card.....aaaand I just decided to buy another one and SLI that bitch. Any objections or helpful tips?
     
  13. Aerosol

    Not here. Moderator
    All I know is that SLI is not as scalable as CrossFireX. That is to say, two Nvidia cards in SLI will net you less of a performance gain than two equivalent AMD cards in CrossFireX. Besides that, I don't see any reason for you not to.
     
  14. Hez

    Oldbie
    I've heard that. I haven't heard many negative things about either, though, besides CrossFireX scaling a bit better.

    I also think I am going to stick with AMD because of the price. It's just not worth paying 4X as much for only slightly better performance.