In defence of multi-core

Discussion in 'Article Discussion' started by WilHarris, 5 Nov 2006.

  1. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    I'm speaking from a gamer's perspective here and I'm not addressing 2D application usage at the moment...

    Personally, I think the performance increases from multi-core (à la Valve's methodology) will mean a more immersive gaming experience. Single-core users will have what we've got now, while dual- and quad-core users can expect more particles, more physics-related effects (Google for the Alan Wake IDF video to see what I mean), generally more realistic AI (take F.E.A.R. as a starting point here) and more realistic physics in general.

    I imagine that we could use one of the cores to handle an in-game weather system (like the one in Alan Wake) - something needs to calculate exactly when and how the weather changes in a realistic manner. Realism isn't just about 'cinematic graphics'; it's about everything else that comes with it. That's why we've seen Ageia try to penetrate the market with the PhysX PPU - unfortunately, though, there just isn't the content out there at the moment... :)
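
    To make that concrete, here's a minimal C++11 sketch of the idea - a subsystem running on its own thread, which the OS is then free to schedule onto a spare core. (The WeatherSystem type and its step() method are made-up placeholders, not anything from an actual engine.)

        #include <atomic>
        #include <chrono>
        #include <thread>

        // Hypothetical subsystem: advances a weather simulation one tick.
        struct WeatherSystem {
            void step() { /* update wind, clouds, rain intensity... */ }
        };

        int main() {
            WeatherSystem weather;
            std::atomic<bool> running{true};

            // Weather gets its own thread; on a multi-core CPU the OS can
            // run it alongside the main loop instead of interleaving them.
            std::thread weatherThread([&] {
                while (running) {
                    weather.step();
                    std::this_thread::sleep_for(std::chrono::milliseconds(100));
                }
            });

            for (int frame = 0; frame < 600; ++frame) {
                // main game loop: input, AI, rendering...
            }

            running = false;
            weatherThread.join();
        }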
     
  2. p3n

    p3n What's a Dremel?

    Joined:
    31 Jan 2002
    Posts:
    778
    Likes Received:
    1
    Anyone who has had any form of programming training will have heard of concurrency. It is the best solution, end of story.
     
  3. Fozzy

    Fozzy What's a Dremel?

    Joined:
    25 Jan 2005
    Posts:
    1,413
    Likes Received:
    2
    My only issue with development is its direction. Yes, both Intel and AMD are making faster chips. We all know this. I'm worried about power consumption. The fact that only two years ago a solid 300W PSU could run a top-of-the-line gaming system just baffles me. Now, if you want all of the bells and whistles, you need at least a 750W PSU, and preferably a kilowatt PSU. I'd like to see architecture that makes sense and reduces power consumption. I don't mind buying a bigger PSU, but what I do mind is graphics cards jumping from needing 65W to 165W in less than two years. I just think things are getting a little out of control.
     
  4. Warrior_Rocker

    Warrior_Rocker Holder of the sacred iron

    Joined:
    26 Jun 2005
    Posts:
    938
    Likes Received:
    1
    I once owned a Compaq ProLiant 1850R which ran two 400MHz Pentium II processors. In raw benchmarks it could sometimes outperform a 1GHz Pentium III-based system.

    The point is that multi-tasking benefits from multi-core technology: every doubling of physical cores roughly doubles the number of instructions that can be executed at once. Intel tried to pass off its Hyper-Threading technology long ago, but two virtual cores are no substitute for two physical ones.

    The whole argument is that instead of trying to make chips faster and faster, we take the fast chips and put several onto a single package. And it is clear in the benchmarks that this has been one of the best moves by both chip makers to increase performance in recent years.

    What makes multi-core technology great, in my opinion, is that it can scale with the application. If for some reason your application needs to execute eight instructions at the same instant, in the near future it could.
     
  5. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Graphics cards are going to get a bit better after this next generation - the shrink to 65nm and then 45nm is going to help with power consumption.
     
  6. Ringold

    Ringold What's a Dremel?

    Joined:
    27 Jul 2006
    Posts:
    48
    Likes Received:
    0
    On 2GHz 'by definition' being twice as fast as 1GHz on the same architecture: I'd challenge you to test it, as I did indirectly while undervolting my laptop. Pick an application such as WinRAR, or a benchmark such as SuperPi, and test your CPU at stock, then again with the multiplier dropped by half. The results for me weren't 50% of the original performance. Playing with the FSB instead leads to even more skewed results.
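
    If you want to repeat that experiment, the measurement side is simple - time a fixed CPU-bound workload once at stock and once at the lower clock, then compare. A rough C++ sketch (the prime-counting loop is just an arbitrary stand-in workload):

        #include <chrono>
        #include <iostream>

        // Fixed CPU-bound workload: count primes below n by trial division.
        long countPrimes(long n) {
            long count = 0;
            for (long i = 2; i < n; ++i) {
                bool prime = true;
                for (long d = 2; d * d <= i; ++d)
                    if (i % d == 0) { prime = false; break; }
                if (prime) ++count;
            }
            return count;
        }

        int main() {
            auto start = std::chrono::steady_clock::now();
            long primes = countPrimes(2000000);
            auto end = std::chrono::steady_clock::now();
            // Run once at stock and once at half the multiplier; if clock
            // speed were all that mattered, the time would exactly double.
            std::cout << primes << " primes in "
                      << std::chrono::duration<double>(end - start).count()
                      << " s\n";
        }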

    Power-consumption-wise, I don't see the raw numbers mattering at all. Gas used to be 10c/gallon, yet today, depending on the day's headlines, it can be $2 to $3 a gallon - is that out of control? Not really, since as a percentage of personal income it constitutes a smaller portion of expenses than it used to. Likewise, once the long-term trend of 'more for less' restores itself in the energy markets (post-fossil-fuel era), I don't see a problem with a one-megawatt computer, as long as it doesn't cost much more in real terms than one does today. Why should it? Maybe I miss that argument entirely :\

    I think this whole "it's not native quad-core!" thing, though, is just anti-Intel sentiment. Yes, it looked bad when the X2s were native while Intel lagged with non-native dual-core, but Intel is first to market with a product that simply works and is simply better than anything anyone else can hope to offer at this point in time, and can therefore serve that market exclusively - how can anyone complain? There are two options if someone needs or wants quad-core: praise it as the performance king that it is, or don't buy it. How can something be bashed for being first to market? People who need it benefit by not having to wait another one, two or three quarters for a slightly better product, and Intel benefits by profiting from them. Beautiful arrangement, imho. :)

    Edit: As a follow-up to my first paragraph: power consumption also did not fall in a linear fashion. For both points, though, don't ask me why - I'm an economics major because all that engineering and physics stuff scared me - but I can't argue with my own test results.
     
  7. smoguzbenjamin

    smoguzbenjamin "That guy"

    Joined:
    6 Sep 2004
    Posts:
    1,097
    Likes Received:
    1
    Yes, point taken. However, as I understand it, the software needs to be coded to take advantage of this. As far as I'm aware, this is difficult to do and, moreover, isn't being done very often yet. Adobe's Photoshop has a multi-core adaptation, but aside from future games I haven't heard of much expansion yet. Does WinXP have a multi-core extension? I'm sure Vista must have some form of multi-core implementation, but I read that only the enthusiasts are going to be buying into that anytime soon anyway.

    I wonder when developers will realise that people are using multi-core CPUs and code for them. As for the technology itself, I believe it is an interesting way forward; the idea of performing physics calculations, AI calculations, etc. on separate cores is quite entertaining. But then I hit a mental wall again by thinking, "but it needs to be coded to do that".
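
    For the simplest kind of workload, "coded to take advantage" can mean as little as splitting a loop between two threads. A minimal C++11 sketch - the split only works because each element is independent of the others:

        #include <cstddef>
        #include <thread>
        #include <vector>

        // Apply a per-element transform using two threads instead of one.
        void process(std::vector<float>& data) {
            auto work = [&](std::size_t begin, std::size_t end) {
                for (std::size_t i = begin; i < end; ++i)
                    data[i] = data[i] * 0.5f + 1.0f;  // independent per element
            };
            std::size_t mid = data.size() / 2;
            std::thread half(work, std::size_t{0}, mid);  // first half, second core
            work(mid, data.size());                       // second half, this core
            half.join();
        }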
     
  8. Ringold

    Ringold What's a Dremel?

    Joined:
    27 Jul 2006
    Posts:
    48
    Likes Received:
    0
    From a pure, virgin, clean boot of Windows, looking at Task Manager, all of those active modules do different things for the OS. Likewise, they can all be punted to different cores. Some of them might be multi-threaded in such a way as to use two or more cores themselves, but even if not, that's one type of multi-threading described in the Valve article. Vista would be no different, but if I'm not mistaken, more work did indeed go into making it multi-core friendly, not to mention the innovation of offloading GUI tasks to the processor that does graphics best: the video card.
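
    The scheduler spreads those processes around on its own, but a program can also ask how much parallel hardware is available and spawn one worker per hardware thread. A minimal C++11 sketch:

        #include <iostream>
        #include <thread>
        #include <vector>

        int main() {
            // Hardware threads the OS reports (cores, or cores x SMT);
            // may return 0 if unknown, so fall back to 1.
            unsigned n = std::thread::hardware_concurrency();
            if (n == 0) n = 1;
            std::cout << "hardware threads: " << n << "\n";

            std::vector<std::thread> workers;
            for (unsigned i = 0; i < n; ++i)
                workers.emplace_back([] { /* per-worker task; the OS picks the core */ });
            for (auto& w : workers) w.join();
        }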
     
  9. Kipman725

    Kipman725 When did I get a custom title!?!

    Joined:
    1 Nov 2004
    Posts:
    1,753
    Likes Received:
    0
    I'm sorry, but I don't agree. Multi-core is great for apps like F@H and for multi-tasking to some extent (although I multi-task fine with much more modest hardware). But the software that has in recent times taken advantage of new hardware first - games - just doesn't scale well to more than one or two cores, as most of the code needs to be computed in a particular order, and it's an absolute nightmare to calculate bits on different cores and get it all in the right place at the right time. (Sorry if I'm not very eloquent here; I'm a big noob at programming.) Until some thought has been put into making it easier to write multi-threaded software (e.g. some kind of uber-compiler that can convert single-threaded code into multi-threaded code at the click of a button, with a performance improvement), I don't think they will be a worthwhile purchase. I say this because even years after release, the fastest single-core CPUs are almost as fast as their dual- and quad-core brethren, despite the more-than-fourfold theoretical increase in power.
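
    The ordering problem in a nutshell: some loops carry a result from one iteration into the next and can't be split across cores, while others can. A small C++ illustration (both functions are made-up examples):

        #include <cstddef>
        #include <vector>

        // Inherently serial: each step needs the previous result first,
        // so there is no way to hand iterations to different cores.
        double simulate(double state, int steps) {
            for (int i = 0; i < steps; ++i)
                state = state * 1.0001 + 0.5;  // depends on the last iteration
            return state;
        }

        // Parallelisable: iterations are independent, so halves of the
        // range could run on different cores at the same time.
        void scale(std::vector<double>& v) {
            for (std::size_t i = 0; i < v.size(); ++i)
                v[i] *= 2.0;  // no dependence between iterations
        }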
     
  10. aggies11

    aggies11 What's a Dremel?

    Joined:
    4 Jul 2006
    Posts:
    180
    Likes Received:
    1
    Well, a CPU that is 2x as "fast" or "slow" (ignoring MHz for the moment) does not directly translate into 2x overall computer performance. Remember, the CPU is only one piece of the puzzle. Time is also spent on disk access (hard drives), memory access (memory speeds), etc. A myriad of other factors/components contribute to overall task speed. To slow the entire computer down by 2x, every single component would have to be slowed down by 2x as well.

    Which is why it becomes hard to isolate the impact of improvements, and why specific benchmarks have to be created for just that sort of thing - ones that try to eliminate every component except the CPU, for example. But it's tricky.

    So the WinRAR results in particular, which definitely involve both disk access and memory access, are not necessarily going to be the best example, in either direction. You'd have to know exactly how much of each it requires. That doesn't necessarily disprove your point; it just makes things a lot less cut and dried :(
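
    This is essentially Amdahl's law: if a fraction p of the total runtime belongs to the component you speed up by a factor k, the overall speedup is 1 / ((1 - p) + p/k). A quick check in C++ (the 60% figure is an assumed illustration, not a measured WinRAR profile):

        #include <iostream>

        // Amdahl's law: overall speedup when a fraction p of the runtime
        // is accelerated by a factor k.
        double amdahl(double p, double k) { return 1.0 / ((1.0 - p) + p / k); }

        int main() {
            // If, say, 60% of a WinRAR run is CPU time and the CPU doubles
            // in speed, the whole run gets only ~1.43x faster, not 2x.
            std::cout << amdahl(0.6, 2.0) << "\n";  // prints ~1.42857
        }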

    "I wonder when developers will realise that people are using multi-core CPUs and code for them."

    Alas, if only it were as simple as that. It's not merely a case of "code your program to support multiple threads and you instantly have all the benefits of multi-core". Ignoring the fact that it's rather difficult to make code concurrent/parallel, it's a lot of work, conceptually difficult, and very easy to make serious mistakes in. It's often *impossible*. That's a harsh word, of course, and it would probably be better put as "not worth the minimal benefit your application would see". But the reality is that certain (a lot of) tasks simply do not lend themselves well, or at all, to the concurrent/parallel strategy. You simply can't break them up into smaller chunks that can be completed at the same time. As Wil put it in a previous article, they are inherently "serial". The traditional game/3D rendering tasks are largely like that, and so are a great many more.

    Not to get into the nitty-gritty, but it's very often the case that it's not simply a matter of "having your programmers write/design multi-threaded". It's that it doesn't actually apply at all.

    Aggies
     
  11. smoguzbenjamin

    smoguzbenjamin "That guy"

    Joined:
    6 Sep 2004
    Posts:
    1,097
    Likes Received:
    1
    This gets more confusing every time I think about it. If most computer programs are of a serial nature, why is everyone so hyped up about all this? Sure, running physics calculations on an extra core is a hell of a lot cheaper than buying a PhysX PPU (which has little to no support games-wise), but apparently no-one is going to use this technology because it is a serious challenge software/concept-wise. That, and I haven't seen any major game except for GRAW and UT2007 make use of the PPU tech.

    Right, that settles it for me: I am personally opting for the fast single-core CPU. I don't convert AVIs to MPGs while playing games and recording stuff off the TV anyway.
     
  12. Ringold

    Ringold What's a Dremel?

    Joined:
    27 Jul 2006
    Posts:
    48
    Likes Received:
    0
    But.. but..

    Okay, future-proofing: a single core apparently won't be able to touch a game like Alan Wake, and would probably have a rough time in a game like Supreme Commander, so that's coming in a time frame that should really be considered for a purchase made today - especially since, at most price points, you have to *try* to avoid buying dual-core.

    But outside the gaming realm... except for Word, I can't think of a category of application that doesn't have a product on the market today that takes advantage of dual core:

    Compression: WinRAR, 7z
    Video work: What *doesn't*?
    Audio work: Again, what doesn't use it?
    Image work: Photoshop does
    Programming: Most compilers are
    Burners/Rippers: Again, most are

    There are other categories that use CPU time which I'm sure I'm coming up short on (QuickPar or IceECC, DOSBox, Maya, POV-Ray), but again, if you look, I'd bet an app exists in each that uses SMP/SMT. And likewise, I bet it's also freeware, heh.

    And the gains aren't trivial... I wouldn't go quad-core, certainly, but on AMD's chips the benefit from that second core is especially tangible - heck, even when just navigating the Start menu.

    You're climbing a pretty steep mountain trying to pick on dual-core rather than quad-core, unless your budget is extremely constrained. Extremely.
     
  13. tank_rider

    tank_rider What's a Dremel?

    Joined:
    3 Feb 2005
    Posts:
    1,090
    Likes Received:
    6
    I for one have seen a huge improvement in performance when moving to a dual-core system. I'm a fairly minority case, though: multi-CPU/cluster computing was one of the main considerations when the software I use was being written. I'm talking about most engineering analysis software - both the FEA and CFD packages I use are multi-threaded, because the majority of real engineering work runs on either dual-CPU machines or clusters of computers.
     
  14. aggies11

    aggies11 What's a Dremel?

    Joined:
    4 Jul 2006
    Posts:
    180
    Likes Received:
    1
    First off, to be fair, I'm only covering the criticisms of multi-core and ignoring the benefits (there are some, *gasp*!), because they've already been covered to death. My only beef is that there's a bit of a "rose-coloured glasses" thing going on when people talk about multi-core: they see all the benefit but seem to miss that there are *costs* to doing it this way.

    The desktop is going multi-core, so the hype machine has started. It is being hyped up because AMD and Intel won't sell many chips with a slogan like "Multi-core - better than single-core, sometimes"®. They have to extol its virtues, because it's their bottom line.

    We are seeing these chips because they are cheaper and easier to make/develop. Plain and simple. An AMD X2 3800+ costs $100 (CAD) more than an AMD A64 3200+ (2x2GHz vs 1x2GHz). So if the 2x2GHz = 4GHz math held true, you'd be getting a 4GHz chip for only $100 more. Doubling the performance! At such low prices! The world just isn't that wonderful a place: when it giveth with one hand, it taketh with the other ;). Multi-core chips take advantage of (require!) the ability to do two things at the same time. That's twice the performance - as long as the thing you are doing can be broken up so that two (or more) subsections of it run at the same time. If your computing problem (program) can be broken up in that sort of way, you're laughing. If not (and a large chunk of tasks fall into this category), then the benefits of multi-core are largely lost on you.

    All this being said, I have a dual-core chip on the way (X2 4400+). There are benefits - an extra core never hurts :) But I know where it is realistic to expect improvements, and where it's not. n slower cores are not the same as a single faster core. But they typically are cheaper :)

    Aggies
     
  15. Fozzy

    Fozzy What's a Dremel?

    Joined:
    25 Jan 2005
    Posts:
    1,413
    Likes Received:
    2
    The reason there are "rose-coloured glasses" going around is that we have such a bright horizon in front of us. Until recently, gains have been fairly linear in terms of performance. Now we are seeing our gains doubled by adding cores instead of GHz - BUT only when programs are coded for the extra cores. That is why quad-core is so interesting. To programmers it didn't matter whether they coded for dual-core - why would it, when so many users were still on single cores? Now, with Intel's new release, developers have been told what they need to do to make their programs run better. They are being told to get ready for multiple cores, which is where the biggest gains will come from. We are already seeing the gains in other products as well.

    Take the Xbox 360, which has three cores running at something like 3GHz each. A single Xbox that costs only $300 can run games with graphics similar to a $2,000 computer's. Multiple cores are the future, and Intel is forcing the development on software and hardware designers so that everyone can catch up.

    I personally don't think a quad-core will game much better than my 3800 X2, but when it does, you can bet I will upgrade. Hopefully, by the time they figure out multi-core gaming, somebody will have released an 8-core super-processor that only needs 65W to run. That will make this young man smile for days.
     
  16. Skutbag

    Skutbag What's a Dremel?

    Joined:
    17 Dec 2003
    Posts:
    317
    Likes Received:
    6
    Not to say it won't be useful eventually, but it's also a marketer's wet dream.

    'Let's see, what would sell more than a dual-core chip? A quad-core?! That's like, a whole TWO more!'

    Insert comedy ka-ching eyes here
     
  17. specofdust

    specofdust Banned

    Joined:
    26 Feb 2005
    Posts:
    9,571
    Likes Received:
    168
    It is indeed a marketer's dream. But as was pointed out in Wil's linked videocast thingy, there'll be a point at which controlling the chip, the limited FSB and the other limited components get in the way, and it just wouldn't be worth adding more cores. I can't see that, with today's architecture, 20 cores on one chip could be useful; the infrastructure that supports that many individual cores just isn't there. That's going to be the real limitation, in my opinion.

    Intel say they can do 80 cores on one chip in five years, and I wouldn't be at all surprised. What I will be surprised at is if they can make the systems those 80 cores have to interact with support 80 individual cores doing 80 individual things with 80 separate needs, all going to the same RAM across the same FSB through the same northbridge. We're going to need processors just to control and process our CPUs' requests. That's the hardware wall, I think, and that's what will slow this beast down in the medium term, I reckon.
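
    You can already sketch that wall in code: give every core its own slice of a big array and the work is perfectly parallel, yet all the cores share one path to RAM, so the scaling flattens out once the bus saturates. A rough C++11 illustration - where it flattens depends entirely on the machine, and the array size is arbitrary (shrink it if RAM is tight):

        #include <chrono>
        #include <cstddef>
        #include <iostream>
        #include <thread>
        #include <vector>

        int main() {
            // Each thread streams over its own slice: trivially parallel
            // work, but every core contends for the same memory bus.
            std::vector<double> data(64 * 1024 * 1024, 1.0);  // ~512 MB
            for (unsigned threads = 1; threads <= 8; threads *= 2) {
                auto start = std::chrono::steady_clock::now();
                std::vector<std::thread> pool;
                std::size_t chunk = data.size() / threads;
                for (unsigned t = 0; t < threads; ++t)
                    pool.emplace_back([&, t] {
                        for (std::size_t i = t * chunk; i < (t + 1) * chunk; ++i)
                            data[i] *= 1.000001;  // bandwidth-bound, not compute-bound
                    });
                for (auto& th : pool) th.join();
                std::cout << threads << " thread(s): "
                          << std::chrono::duration<double>(
                                 std::chrono::steady_clock::now() - start).count()
                          << " s\n";
            }
        }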

    As for the whole programs-in-serial thing: I guess that's the case for now. But the world has been programming mostly in serial for pretty much the whole history of PCs. Given the speed at which the computer industry and technology advance, I don't think we should expect to wait long before we see some real advances - perhaps programming languages designed with the express intent of making parallelism work, or just a greater understanding of what can be split up and how to make things work cohesively.
     
  18. scarecrow

    scarecrow What's a Dremel?

    Joined:
    4 Mar 2006
    Posts:
    110
    Likes Received:
    0
    Guys, multi-core is nothing new, though - it's just in a nice small package. Supercomputers have always had multiple cores with a **** load of RAM; it's just that now it doesn't take a room to fit these computers, and in the future a supercomputer with 80 machines running in parallel will fit in the palm of your hand. Sound familiar? Like the old computers that filled a room just to multiply 10 by 10, while today's cell phones are more powerful than they were. It's exactly as has been said: the high end brings everything up with it, so as the high end gets better, the norm gets better.
     
  19. Ringold

    Ringold What's a Dremel?

    Joined:
    27 Jul 2006
    Posts:
    48
    Likes Received:
    0
    I agree: 20 cores on a 1066MHz FSB would just mean a bunch of useless cores under heavy load, but that assumes technology will advance in the number of cores and nowhere else.

    Already, AMD has left behind the traditional FSB that Intel still uses; likewise, it has an integrated memory controller. Two little steps, but steps that show changing parts of the overall architecture isn't impossible. I can imagine that 80-core chip five years out using something like photons to shuttle data around, with some equally bizarre yet already-demonstrated technology like holographic memory for cache. In five years perhaps we'll also be starting to see affordable, reliable solid-state disks... within 10 years or so at least. And even further out past that 80-core chip, quantum computing gets advanced a wee bit with each passing year.

    So I suppose my thought is that the entire platform will grow up with these parts. Competition between AMD and Intel assures that technology will be advanced at all costs, because if one abandons some dusty old hanger-on like electrons and starts using photons to significant advantage, the other would assuredly be punished in the market. And like I said a moment ago, all those technologies have been professors' playthings in university labs for months and years, and some crazy things are well on their way to commercialisation.

    Of course, the preferred path for the future would be to ask La Forge how the hell the Enterprise-E's computer cores operate, preferably complete with schematics. I'd say to ask Scotty, but unfortunately, he's in the Big Risa in The Sky, so to speak. Could ask, say, Tucker, but I'd rather that whole crew just go away and never have happened. He can keep his computer and 'phase pistol' tech to himself..

    Edit: Though Linda Park can tell me all she wants about communications any time.
     
  20. Tyinsar

    Tyinsar 6 screens 1 card since Nov 17 2007

    Joined:
    26 Jul 2006
    Posts:
    2,287
    Likes Received:
    28
    Good article (as usual) :thumb:

    For some strange reason I'm reminded of the introduction of the automobile: when cars first came out they were very noisy, you had to be your own mechanic because they were unreliable, the roads weren't really built for them... Quite frankly, the average person was better off with horses. The same is true today: by the time quad-core is practical there will be new CPUs and motherboards, and today's will be outdated - just as the original automobiles were outdated by the time cars became practical for the average person.

    Don't get me wrong: I, and many of the people here, would get quad-cores if we could afford them - just as many of us might have been early adopters of automobiles had we been there then - but I currently wouldn't recommend quad-core to most people, nor would I ever (well, seldom, anyway) recommend buying "for the future". CDs and DVDs were once "things of the future", but those who waited got better players with more features at less than half the price of the simpler early models.

    So here I sit with my dual-core 64-bit CPU, running 32-bit software that mostly uses only one core. My problem with pushing quad-core is not the quad-core itself but the untapped potential of the system I already have.
     