Ways to enhance framerate? Having issues.



#41 DarkPulse

    Ghost Liner

  • Members
  • 2,243 posts
  • Location: Buffalo, NY, USA

Posted November 13 2012 - 11:19 AM

OneMoar, on November 13 2012 - 08:45 AM, said:

The APU will run this game just fine (based on the curve I have seen in the benches). Are you gonna have 100 FPS? Nope, but it will play at around 50 FPS.
You don't need a 2500K to run UT3; it's got `enough` graphics grunt to run it, and the AMD quad cores are by no means slow. Are they as fast as an Intel chip? Nope, not by a mile. Are they more than enough for pretty much any game/CPU/GPU combo on the planet? Yep...
50 FPS is not enough. 50 FPS is an average, which means that when dips hit, they'll pull you down toward 30 FPS, and most gamers prefer to avoid that.
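To make the "average hides the dips" point concrete, here's a minimal sketch (hypothetical frame times, not measurements from this thread) showing how a run that averages about 50 FPS can still spend its worst frames down around 30 FPS:

#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical frame times in milliseconds: mostly ~18 ms (~55 FPS),
    // with every tenth frame spiking to ~33 ms (~30 FPS) during heavy action.
    std::vector<double> frame_ms(100, 18.0);
    for (std::size_t i = 0; i < frame_ms.size(); i += 10) frame_ms[i] = 33.0;

    double total = 0.0;
    for (double t : frame_ms) total += t;
    double avg_fps = 1000.0 * frame_ms.size() / total;

    // Worst-case frame (the slowest 1% of this 100-frame sample).
    double worst_ms = *std::max_element(frame_ms.begin(), frame_ms.end());
    double dip_fps = 1000.0 / worst_ms;

    std::printf("average: %.1f FPS, dips: %.1f FPS\n", avg_fps, dip_fps);  // ~51 vs ~30
}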

Also, since it seems you missed that part, that was not on a 2600K, that was on a Core 2 Duo E6750. Your best at that time was a first-gen Phenom (not a Phenom II), which did worse, and obviously had no such thing as an APU for you to crutch on.

The Phenom II brought things roughly up to Core 2 Duo or Core 2 Quad levels of performance. Then Intel came out with the Nehalem i7s, and it's been all Intel since. It's pretty bad when it took the recent FX-8350 to finally gain performance over AMD's long-time best gaming performer... a Phenom II X4 965, released in August (140W TDP!) or November (125W TDP) of 2009. So three to three-and-a-half years to finally gain performance in games... yeah, AMD isn't a choice for gaming, APU or not.

Lastly, guess which CPUs are having problems running Planetside 2, a game notorious for being both a CPU and a GPU hog? That's right, AMD processors! Some people there are lucky to get 30 FPS even when paired with a real videocard, and that Fusion you're touting wouldn't have a hope in bunny hell of running that game even on low with satisfying performance. Meanwhile, some people on there took our advice to swap out their AMD processors for an Intel, and they got IMMEDIATE 20-30 FPS boosts!

There's a saying I like to use: "Speed costs money. How fast do you want to go?" You pay $350, you get $350 performance. It'll run, but it won't run smoothly or consistently: expect lots of frame spikes, and heavier action will quickly bog it down.

On the other hand, you pay $500 or so for a good Intel CPU (i5-3550, $209.99) and a good videocard (GeForce 660 Ti, $309.99) and I can guarantee you that any gamer would be pretty damn happy with their performance. To me, that's worth the extra $150, easy. Maybe it's not to you - that's fine, but I'm not going to settle for half-baked performance based on a good IGP being stuck inside a ho-hum processor, and I'm going to do my best to point out that for a little extra money, they'll get considerably more performance. Some will go for it, some won't. That's their choice, but presenting an A10-5800K as the "be-all, end-all, cheap solution for your gaming needs" isn't going to fly, precisely because it's not a be-all, end-all, cheap solution for your gaming needs. It's certainly cheap, but that's about the only thing it is.

I don't want to act like an Intel fanboy, and indeed, I have owned two AMD systems before. I had an original T-Bird 1 GHz, and I upgraded from that to an Athlon XP 3000+ back in 2004. That was back when AMD was giving far better performance for the money compared to Intel, making less heat and running games very competently. However, times have changed, Intel refocused, and while AMD could manage to keep up okay through the Core 2 era, since the i-series the gaming-value pick has been Intel. You pay a bit more up front, but you also get a lot more mileage out of it, and never have to worry "will it run games in a year or two?" You know it will.

I go for the bang-for-buck metric. The AMD is cheaper, but for a little more money, the Intel CPU gives you way more bang for your buck. Therefore, of the two, it's the better choice even though it costs more.

OneMoar, on November 13 2012 - 08:45 AM, said:

You can't argue 100 FPS vs. 125 FPS when the 100 FPS machine costs ~$200 less than the 125 FPS machine.
Now, setting the APU aside, for ~$450 you could have a 6-"thread" CPU and a middle-of-the-road GPU and be able to play pretty much anything at 60 FPS or better.
Most games don't use more than four threads, so those hex-core or octo-core AMD CPUs are a waste of money unless you do heavy 3D modeling, video encoding, or the like. If you're just gaming, you don't need more than four, period. So those are right out.

Again, $500 will get you a good Intel CPU, and a good nVidia GPU. They're not the fastest in the line but they're more than enough. Obviously, this doesn't factor in other things (RAM, Mobo, case, PSU, etc.) but if all you need to do is replace CPU, Mobo, and Case, you can do it for about the same price, and get way more performance. If you need to upgrade the whole system, then it's more in the $750 range, but again, the performance gains are worth the extra money, I feel, as is the peace of mind in knowing that it will continue to play games for several years.

My CPU is already about 18 months old. I don't plan to upgrade until at least Broadwell in 2014, maybe even Skylake in 2015. Can you say you'll get 4 years of top-level, all-things-on-and-maxed, and (eventually once I buy the monitor) 2560x1600 gameplay out of that Fusion? Obviously not. And obviously for the latter, you'd need a pretty good videocard, too - a REAL videocard, not that currently-two-generation-old Radeon in the A10-5800K.

OneMoar, on November 13 2012 - 08:45 AM, said:

Also, comparing a dual core to a quad core... lolwut. 70 FPS at 1024x768, awesome. I was running at over 90 FPS @ 720p on a Phenom II in... 2009. Maybe if you hadn't blown all your money on an Intel CPU you could have had a halfway decent GPU AND a quad core.
Yeah, except that you failed to notice again that I wasn't playing at 720p - I was playing at 2048x1152. Yes, I played above 1080p, and I still got 75 FPS.

Go ahead, check the picture resolution. I'll wait here. I've got nothing to hide.

Your "over 90 FPS @ 720p" is worthless, because I bet if I would've toned it down to that resolution, I probably would've been doing something like 120. Though then again, I've also learned that over 60 FPS is useless on LCDs, so while I'll never go for framerates that high, I'll have much smoother gameplay.

Are you still going to crow about a 15 FPS higher framerate when my monitor pushed basically two and a half times more pixels than you did (on an old GTX 285 that's now retired, at that) for that slight loss, which still left me well north of 60 FPS? I doubt it.

OneMoar, on November 13 2012 - 08:45 AM, said:

This was my 2009-era rig, IIRC:
AMD Phenom II 940, OC'd to... 3.4 GHz IIRC
4 GB DDR3-1333
ATI 4870
And back then, I had a Core 2 Duo E8400 (which was an upgrade from the E6750 - had wanted a Q9450 but my motherboard couldn't do the 45nm quads, it could do the 45nm duals though), 4 GB of DDR2-800, and a GeForce GTX 285. The 4 GB was already there from when I had to get new RAM (one stick of my 2x1 GB sticks died so I got 2x2 GB); the CPU + GPU upgrades cost me about, I don't know, $450 or so. CPU was like $189, and I think the videocard was about $269.

Yours ran UT3 at 90 FPS at 1280x720, on a quad-core CPU, presumably at max details. Mine ran it at 75 FPS at 2048x1152, on a dual-core CPU, at max details. I pushed 2 1/2 times more pixels than you did, on a CPU with half the cores of yours that ran 400 MHz slower, for a net loss of 15 FPS compared to you.

1280x720, max details, 90 FPS vs. 2048x1152, max details, 75 FPS.

Which of those do you think most PC gamers, in 2009, would want?
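For anyone who wants to check the "2 1/2 times more pixels" arithmetic, a throwaway snippet:

#include <cstdio>

int main() {
    long mine  = 2048L * 1152;   // 2,359,296 pixels per frame
    long yours = 1280L * 720;    //   921,600 pixels per frame
    std::printf("%.2fx the pixels\n", static_cast<double>(mine) / yours);  // ~2.56x
}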

OneMoar, on November 13 2012 - 08:48 AM, said:

I am about done with this thread; it's turned into an Intel vs. AMD flamewar.
At the end of the day AMD can still game just as well as Intel, and that's all that matters.
It's also rather unfortunate that the devs picked UDK... instead of something like Unity.
Unreal 3 will always be a rather slow, lumbering, clunky giant with really pretty lighting.
I wouldn't call it slow or clunky. The real issue is that right now these builds aren't exactly focused on optimization. I'm pretty sure that if I ran almost any other UE3 game, it would easily stick near 60 FPS for me and for a lot of other people too, or at least be high enough to be enjoyable.

Though, obviously, you have a better chance on an nVidia card (it is a TWIMTBP title after all) and an Intel CPU, for reasons I've already said.

Edited by DarkPulse, November 13 2012 - 11:27 AM.

Reason as my minor ego, and opposite my desire to be a murderer.
A coagulated, gloomy thinking in the intelligence, as my major ego.
An antinomian theorem of behaviorism, in all of my thinkings.
It's what we call "The Inversion Impulse."

#42 Elix

    Good Guy Elix

  • Members
  • 4,228 posts
  • Location: Fred's cockpit

Posted November 13 2012 - 11:56 AM

I think this thread has officially gone past the point of no return into off-topic territory.
HAWKEN Community Values (updated!)

ETA for $feature_you_want to be added to Hawken Open Beta: Imminent™
See someone breaking the rules? Don't reply, just hit Report. I am a player, not staff.
Drinking game: Check the daily stats. If I'm not the top, DRINK! (I'm joking!)

#43 DarkPulse

    Ghost Liner

  • Members
  • 2,243 posts
  • Location: Buffalo, NY, USA

Posted November 13 2012 - 04:47 PM

Probably, and partially my fault.

The long and short of it is that the devs have yet to fully optimize, but even when they do, your best bets for framerate don't lie with AMD. :P
Reason as my minor ego, and opposite my desire to be a murderer.
A coagulated, gloomy thinking in the intelligence, as my major ego.
An antinomian theorem of behaviorism, in all of my thinkings.
It's what we call "The Inversion Impulse."

#44 OneMoar

    Advanced Member

  • Members
  • 66 posts
  • Location: Batavia, NY

Posted November 13 2012 - 08:42 PM

We aren't talking about OTHER games; we are talking about THIS game, a UT3 'mod'.
I know full well that the APU isn't a gaming chip, but it's enough for this and whatever other UT3-based games someone might wanna play.
You were also running in DX9; I was running in DX10 with probably like 4x AA / 16x AF. Your 285 was also faster than my 4870.
And your math is off: $209 + $309 + $120 for a decent motherboard.
Whereas you could get:
an FX-6300,
8 GB of RAM,
and a 7870 [2GB] (that's faster than the 660 Ti, btw), even moar so with AMD's next driver overhaul: 20-30% gains across the board, FTW.
AMD: http://cl.ly/Ksie/Im...11.38.29 PM.png

Intel: http://cl.ly/KspF/Im...11.40.50 PM.png
The 7870 GHz edition is faster than the 660 Ti.
The Intel is $130 moar.
Are you willing to pay 130 bucks for an extra, circumstantial 15-20 FPS (in cases where you are CPU-limited, and that's pretty much never nowadays...)? I'd call you crazy if you were.
Above about 70 FPS it's all moot, and both systems WILL deliver that consistently, period...
And in cases where game developers actually know what they are doing and can use the extra threads + AVX instead of slower x87 math (which the AMD chips ARE weaker at due to the lack of FPUs), the AMD chips pull right alongside Intel's top offerings. In fact, the difference was so great that some benchmark sites had to run the tests again because the gap was so huge, and that massive gain is just a compiler flag away... well, a bit more than that, but not a whole lotta effort.
http://www.tomshardw...0fx,3043-5.html

so go with the "cheaper" option
/thread

Edited by OneMoar, November 13 2012 - 08:57 PM.


#45 DarkPulse

    Ghost Liner

  • Members
  • 2,243 posts
  • Location: Buffalo, NY, USA

Posted November 14 2012 - 01:37 AM

OneMoar, on November 13 2012 - 08:42 PM, said:

We aren't talking about OTHER games; we are talking about THIS game, a UT3 'mod'.
I know full well that the APU isn't a gaming chip, but it's enough for this and whatever other UT3-based games someone might wanna play.
You were also running in DX9; I was running in DX10 with probably like 4x AA / 16x AF. Your 285 was also faster than my 4870.
...Now you're just talking out your ass. UT3 didn't even have a DX10 mode.

I know for a fact that I ran AF at 16x. AA would be pointless on UT3, since you'd have to force MSAA for it to work, and that would tank performance and not even work 100% correctly due to deferred lighting. There's no way you're honestly going to get me to believe that.

OneMoar, on November 13 2012 - 08:42 PM, said:

And your math is off: $209 + $309 + $120 for a decent motherboard.
Whereas you could get:
an FX-6300,
8 GB of RAM,
and a 7870 [2GB] (that's faster than the 660 Ti, btw), even moar so with AMD's next driver overhaul: 20-30% gains across the board, FTW.
AMD: http://cl.ly/Ksie/Im...11.38.29 PM.png

Intel: http://cl.ly/KspF/Im...11.40.50 PM.png
The 7870 GHz edition is faster than the 660 Ti.
The Intel is $130 moar.
Are you willing to pay 130 bucks for an extra, circumstantial 15-20 FPS (in cases where you are CPU-limited, and that's pretty much never nowadays...)? I'd call you crazy if you were.
You'll note I didn't mention a motherboard, so my math "isn't off." You're also not mentioning the motherboard in yours, either, for what it's worth - then again, I'm not mentioning the RAM, and you're not mentioning you need a 64-bit OS to make use of the 8 GB, so let's not nitpick, shall we?

PS: As for your "7870 is faster than the 660 Ti"? This site disagrees. I'll just paste their summary here: "In most games, the GeForce GTX 660 Ti was between 12% and 20% faster than its main competitor, the Radeon HD 7870 GHz Edition. There were some games in which both achieved the same performance, and on one, the GeForce GTX 660 Ti was 47% faster. There was no scenario where the GeForce GTX 660 Ti was slower than its competitor."

OneMoar, on November 13 2012 - 08:42 PM, said:

Above about 70 FPS it's all moot, and both systems WILL deliver that consistently, period...
And in cases where game developers actually know what they are doing and can use the extra threads + AVX instead of slower x87 math (which the AMD chips ARE weaker at due to the lack of FPUs), the AMD chips pull right alongside Intel's top offerings. In fact, the difference was so great that some benchmark sites had to run the tests again because the gap was so huge, and that massive gain is just a compiler flag away... well, a bit more than that, but not a whole lotta effort.
http://www.tomshardw...0fx,3043-5.html
Devs aren't going to use AVX. The number of processors that even have it is small as hell, and it's more code they have to debug. If anything, they're probably only just barely starting to move to SSE4. Then again, I could be wrong and it might use AVX all along; only they know what they compiled the EXE with. For all I know, they might be trying to make sure it reaches all the computers it can, in which case it might top out at SSSE3 or even SSE2.
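For what it's worth, the usual way a game handles this is to build the hot math twice and pick a path at runtime from what CPUID reports, rather than requiring AVX outright. A minimal sketch of that idea (the function names are made up, and the GCC/Clang builtin is assumed; MSVC would use __cpuid from <intrin.h> instead):

#include <cstdio>

// Stand-ins for two builds of the same routine: a baseline path that runs
// on any x86-64 CPU, and a path that is only safe if the CPU/OS support AVX.
void update_physics_baseline() { std::puts("baseline SSE2 path"); }
void update_physics_avx()      { std::puts("AVX path"); }

int main() {
    // __builtin_cpu_supports (GCC/Clang) checks the CPUID feature bits at runtime.
    if (__builtin_cpu_supports("avx"))
        update_physics_avx();
    else
        update_physics_baseline();
}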

OneMoar, on November 13 2012 - 08:42 PM, said:

so go with the "cheaper" option
/thread
And if you do, then don't wonder why, when you try another game, it doesn't run so well. Until AMD can get their act together, I'm going to call for spending the extra money, because most gamers don't want to play just UE3 games. They want to play other games too, and in those other games, AMD is not an option. There's a reason the number of AMD processors on Steam is falling every month...

Edited by DarkPulse, November 14 2012 - 01:38 AM.

Reason as my minor ego, and opposite my desire to be a murderer.
A coagulated, gloomy thinking in the intelligence, as my major ego.
An antinomian theorem of behaviorism, in all of my thinkings.
It's what we call "The Inversion Impulse."

#46 Saint_The_Judge

    Advanced Member

  • Members
  • 368 posts
  • Location: Third World

Posted November 14 2012 - 04:10 AM

Well, after a painful CB2, way worse than the previous phases (in terms of low FPS), I do hope they work a lot on the optimization, because 12.12 is near.
Once a girl asked me in a chat: "ASL?" I answered: "Very old, impotent, third world." And she left the room.

#47 OneMoar

    Advanced Member

  • Members
  • 66 posts
  • Location: Batavia, NY

Posted November 14 2012 - 08:45 AM

1. UT3 most certainly had DX10. You had to enable it via a config switch, but it did have it, "officially" supported in one of the later patches, where it enabled TRUE-HDR:
AllowD3D10=True

All Sandy Bridge+ and AMD Bulldozer+ chips have AVX w/ FMA3. And you are right: otherwise they can fall back to SSE4A. It's just a matter of flicking a compiler flag and making the binary; if devs aren't willing to go that far then they need to uninstall, and it's not more code to debug, lol. NOW who's talking out their anal cavity?

3. There is no game on this planet that's going to be unplayable, or have issues, or suffer game-impacting performance because you have an AMD proc. That's just plain old FUD and stupidity at its finest, and something I take issue with.

4. I'll grant you the point on the 660 Ti; I was looking at the wrong benchmark. My fault for arguing with a n00b/fanboy at 2 AM. The 7870 is about ~10 FPS slower on average w/ the Cat 12.11 drivers... http://www.techpower...ormance/13.html

5. And if you think Intel's CPUs are expensive now, wait until AMD goes under...

Edited by OneMoar, November 14 2012 - 08:47 AM.


#48 Raident

    Member

  • Members
  • 15 posts
  • Location: Philippines

Posted November 14 2012 - 02:43 PM

IIRC, the 660 Ti is a direct competitor to the 7950... same goes for the 660 non-Ti and the 7870, respectively.

Main Rig : i5 2500k 4.0 @ 1.248 | Water 2.0 Performer | 7950 3gb flashed | G.Skill Ripjaws 4x2 1600 | Coolermaster GX650w Bronze | asus p8z68-M PRO | NZXT Source 210


#49 DarkPulse

    Ghost Liner

  • Members
  • 2,243 posts
  • Location: Buffalo, NY, USA

Posted November 14 2012 - 04:27 PM

OneMoar, on November 14 2012 - 08:45 AM, said:

1. UT3 most certainly had DX10. You had to enable it via a config switch, but it did have it, "officially" supported in one of the later patches, where it enabled TRUE-HDR:
AllowD3D10=True
While the switch is there, there was very minimal difference between DX10 and DX9. All DX10 let you do was force AA (which, again, didn't even work half the time) and no official support for DX10 was ever extended, or patched in. All that existed was an INI tweak that put the engine on a mostly-unsupported rendering path. I used to (and actually, technically still do) do pieces for one of the big Unreal news sites, BeyondUnreal; I was one of the people who got access to beta patches for that game and even the Titan Pack content well before it ever was public. Believe me, I'd remember if it happened.

Therefore, no, I do not consider it to run in DX10. UT2004 had a DX9 renderer and even a 64-bit patch, but needless to say, both had compatibility issues (and the DX9 renderer was later removed from future patches, then another patch broke it completely even if you still had it from older patches). If the devs wanted it to have a DX9 or DX10 option, they'd include it in the options menu, or have the game detect what you're capable of and auto-run down that engine path accordingly. Hawken has a DX11 INI option, but unless the devs actually support it (please do so, devs ♥) it's not going to count as supported. There are plenty of Unreal Engine games where flipping that kind of switch will cause strange, glitchy behavior, or even crashes, so the mere presence of the switch does not automatically mean "supported and working."

That said, admittedly, at the time I was still on Windows XP (as I refused to upgrade to Vista until SP1 came out and benchmarks proved the gaming disparity had been nullified). It's very well possible I would have had a better framerate if I had been on a DX10-level system at the time, but even with my "penalties," the fact that I still passed 60 FPS, and did it at 2 1/2 times your resolution, means mine would be the superior choice for gamers. Smoothness is important, but nothing wows a gamer like massively high-def resolution, especially when it's over 1080p.

PS: "True HDR" also works in DX9. Or are you forgetting games like Elder Scrolls IV: Oblivion, Half-Life 2: Lost Coast, etc...

OneMoar, on November 14 2012 - 08:45 AM, said:

All Sandy Bridge+ and AMD Bulldozer+ chips have AVX w/ FMA3. And you are right: otherwise they can fall back to SSE4A. It's just a matter of flicking a compiler flag and making the binary; if devs aren't willing to go that far then they need to uninstall, and it's not more code to debug, lol. NOW who's talking out their anal cavity?
Wrong; Intel does not have FMA3 yet. (That will be supported starting with Haswell, next year.) Therefore, AMD has the theoretical edge here for now. However, just because a chip supports the newest instruction sets doesn't mean they magically translate to better performance. The compiler used also has to support them, naturally, and we don't know what the devs are compiling with (I'd bet VS2010, though). Furthermore, Haswell will also support AVX2 - which AMD currently does not.

The simple fact is, though, a lot of people are still on CPUs that support none of these. A game either has to have fallbacks or refuse to run. Fallbacks mean reduced performance, and there are still a lot of people on old Phenom IIs or Core 2 Duos, and a few on even older hardware like Athlon 64s or Pentium Dual-Cores. The problem is that while all of them can upgrade to gain performance, Intel will always gain some with each generation, while AMD has essentially been stagnant since the Phenom II X4s. (Again, the first chip to perform better than the later-era Phenom II X4s was the FX-8350... and that doesn't come with an integrated Radeon.) Your solution might help the people on older hardware, but for Phenom II X4 users it helps nobody at all unless they slap in an FX-8350... and in that case, it becomes an extremely minor speed boost - which still lags woefully behind Intel.
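To illustrate the "just a compiler flag away" trade-off: if you build with -mavx (GCC/Clang) or /arch:AVX (MSVC), the compiler defines __AVX__ and may emit AVX instructions anywhere it likes, so that binary simply won't start on the older chips above. Shipping one executable therefore means either targeting the lowest common denominator or branching at build time, roughly like this minimal sketch (the kernel is just an illustrative sum, nothing Hawken- or UE3-specific):

#include <cstdio>
#if defined(__AVX__)
#include <immintrin.h>
#endif

// Sum eight floats: hand-written AVX when the binary is built with an AVX
// flag, otherwise plain scalar code that runs on any x86 CPU.
float sum8(const float* v) {
#if defined(__AVX__)
    __m256 x  = _mm256_loadu_ps(v);            // load all 8 floats at once
    __m128 lo = _mm256_castps256_ps128(x);     // lower 4 lanes
    __m128 hi = _mm256_extractf128_ps(x, 1);   // upper 4 lanes
    __m128 s  = _mm_add_ps(lo, hi);
    s = _mm_hadd_ps(s, s);
    s = _mm_hadd_ps(s, s);
    return _mm_cvtss_f32(s);
#else
    float s = 0.0f;
    for (int i = 0; i < 8; ++i) s += v[i];
    return s;
#endif
}

int main() {
    float v[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    std::printf("sum = %.1f\n", sum8(v));      // 36.0 on either path
}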

The only people your solution really appeals to are the absolutely broke gamers, but as I pointed out, for a bit more money, you get a lot more performance. You feel it doesn't matter, but the fact of the matter is most people want their money to last as long as possible. Therefore, while that A10-5800K might run UE3 well, I don't see it doing very well in Crysis, or Planetside 2, or pretty much any game that requires DX10+ as a minimum and has no DX9 fallback, as that's when graphical fidelity takes a big step up. In short, within 18 months they'd need to upgrade again to get acceptable performance in a new game - so what's the difference between spending $130 twice in 36 months, and $300 or so once and having it last 3-4 years? At worst, a $40 loss, which is nothing; at best, something that'll last quite a while and be even stronger if you buy a second card down the line and SLI it.

OneMoar, on November 14 2012 - 08:45 AM, said:

3. There is no game on this planet that's going to be unplayable, or have issues, or suffer game-impacting performance because you have an AMD proc. That's just plain old FUD and stupidity at its finest, and something I take issue with.
Never said it'd be unplayable, I said it'd perform worse - there's a difference. And yes, they do perform worse. Numbers and benchmarks do not lie; it's their job to sort out performance differences and point them out.

They definitely give less performance for the money, so I feel they're bad buys, especially with the minimal performance gain from generation to generation compared to Intel.

OneMoar, on November 14 2012 - 08:45 AM, said:

4. I'll grant you the point on the 660 Ti; I was looking at the wrong benchmark. My fault for arguing with a n00b/fanboy at 2 AM. The 7870 is about ~10 FPS slower on average w/ the Cat 12.11 drivers... http://www.techpower...ormance/13.html
Ahh, Metro 2033... one of the few games that brings even my rig to its knees. (Though maybe not if I double up on 690s, like I eventually might...)

Also, it's not me being a noob/fanboy, so thanks for the passive-aggressive remark. I've been building computers for the last ten years and I've always gone where the money is. When Intel and nVidia were weak (Pentium 4/GeForce 5000s), I went with an AMD/ATI system. (That Athlon XP 3000+ had a 9650 XT in it, IIRC, which eventually got displaced by an nVidia 6800 GT.) Now that AMD is weak in CPUs and GPUs, are you suggesting I should not point out that people who spend a little more money can get significantly higher performance? Your solution is fine; I'm not saying it's horrible - just saying that for a little bit more money, people can have something much better, without it "breaking the bank," so to speak.

One more thing to note: That game's benchmark, as their test bed shows, was on a Core i7-3770K. I can guarantee you numbers would be down across the board on any AMD processor.

Lastly, while I do think Radeon cards are pretty competitive once AMD pushes out updates, I also don't like how long it takes AMD to put out those updates. Compared to their rivals, it's absolutely molasses-slow, and Radeon owners often have to wait months for a game to have proper Crossfire and Eyefinity support, whereas an nVidia owner has a much higher chance of having SLI and 3D Surround on day one. (And yes, this will include Hawken - their latest beta drivers added both of those.)

OneMoar, on November 14 2012 - 08:45 AM, said:

5. And if you think Intel's CPUs are expensive now, wait until AMD goes under...
Ha, you obviously weren't around for the days when a mid-range CPU cost you upwards of $350. I was.

Needless to say, when I remember those days, $329.99 for a top-of-the-line processor is a damn steal. Six years ago, the minimum gaming CPUs cost about $250; you can now get that for about $200 or even a bit less (and yes, they can still be Intels as I pointed out).

AMD has the market cornered on lower prices in general, and that's a good thing, but the simple fact is that when it comes to their direct competitors, AMD only wins a handful of scenarios:
  • Anything extremely heavily multithreaded, where the singlethread performance is much less of a factor.
  • Integrated graphics. (Though I bet if Intel ever absorbed nVidia - not that I think they will - this would be a very, very interesting race. That said, Intel's constantly working to improve their IGPs, too, so we'll see where this goes.)
  • The extreme low end, which is irrelevant to gamers.
AMD is refocusing itself to push more into the mobile market as well as get back into servers (you don't see Opteron thrown around much anymore, do you?), so while they won't leave the home PC business, it not being their focus means that inevitably there just aren't going to be as many brains behind it. Intel, on the other hand, has the opposite problem - they've been trying to get into the mobile business (they're pretty established in servers) but have failed due to the overwhelming presence and improved efficiency of ARM.

Do you see the parallels here between ARM/Intel in mobile, and Intel/AMD on desktop? It shouldn't take too much thinking.

AMD will, unless they make some serious breakthroughs, go back to the role they occupied since the late 80s/early 90s - a "second banana" to Intel. They'll still sell processors, and obviously now they have name recognition like Intel (as opposed to stuff like, say, Centaur, VIA, etc. that you'd only recognize if you were a PC geek), but the simple fact is that eventually, people buy what they feel will suit them best. You see Intel processors proudly touted in all sorts of laptops and new desktops. You don't see that with AMD.

And that's pretty much the problem - AMD simply isn't competitive enough. Ergo, they're going to be a lower-range option, but for anyone who wants performance and knows their stuff, I fail to see any reason why they'd pick an AMD now, and for the foreseeable future I don't see that changing one bit.

Edited by DarkPulse, November 14 2012 - 04:31 PM.

Reason as my minor ego, and opposite my desire to be a murderer.
A coagulated, gloomy thinking in the intelligence, as my major ego.
An antinomian theorem of behaviorism, in all of my thinkings.
It's what we call "The Inversion Impulse."

#50 OneMoar

    Advanced Member

  • Members
  • 66 posts
  • Location: Batavia, NY

Posted November 14 2012 - 10:00 PM

needs moar wall of text
/me continues trolling
AMD has stated it themselves: they have no intention of competing with Intel.
As for the rest of your wall of text, I'm bored with picking your posts apart one section at a time.

Edited by OneMoar, November 14 2012 - 10:05 PM.


#51 DarkPulse

    Ghost Liner

  • Members
  • 2,243 posts
  • Location: Buffalo, NY, USA

Posted November 17 2012 - 07:57 AM

OneMoar, on November 14 2012 - 10:00 PM, said:

needs moar wall of text
/me continues trolling
AMD has stated it themselves: they have no intention of competing with Intel.
As for the rest of your wall of text, I'm bored with picking your posts apart one section at a time.
Precisely, and the fact they don't want to compete with Intel is why any serious gamer - the kind who will be buying a CPU and a GPU - should not be buying an AMD chip.

The only reason you ever should is if you're extremely, extremely broke. Otherwise, it's better to spend a little more for a lot more power. Of course, if your PC is just for general use, AMDs are great buys. But once you get into gaming or power use, it's not really a contest unless you're absolutely dirt broke or dirt cheap.
Reason as my minor ego, and opposite my desire to be a murderer.
A coagulated, gloomy thinking in the intelligence, as my major ego.
An antinomian theorem of behaviorism, in all of my thinkings.
It's what we call "The Inversion Impulse."

#52 Moderator03

    Community Moderator

  • Moderators
  • 148 posts

Posted November 18 2012 - 08:16 PM

Please keep posts constructive and inviting.

You can refer to HAWKEN Rules and Guidelines >>>HERE<<<

Thank you,
The HAWKEN Moderation Team



