
News Intel says video encoding belongs on the CPU

Discussion in 'Article Discussion' started by CardJoe, 4 Jun 2008.

  1. CardJoe

    CardJoe Freelance Journalist

    Joined:
    3 Apr 2007
    Posts:
    11,346
    Likes Received:
    316
  2. bowman

    bowman Minimodder

    Joined:
    7 Apr 2008
    Posts:
    363
    Likes Received:
    10
    So, does someone want to test this and see if it holds any water? Is the quality really better?

    I have an AMD CPU, and one thing that makes me cringe is other AMD users defending themselves from Intel fans by claiming that AMD 'feels smoother'. It's patently ridiculous. Until I see some tangible proof, this claim of Intel's looks like more of the same, just from a different perspective.
     
  3. amacieli

    amacieli What's a Dremel?

    Joined:
    14 Feb 2008
    Posts:
    93
    Likes Received:
    1
    nVidia is clearly encroaching on one of the key Intel marketing messages for its new multi-core procs. Of course Intel is going to hit back, even if what they say doesn't make 100% sense (although I believe that you'll still need a good processor to keep the GPU fed).

    With Photoshop soon to be nVidia-powered, and (gasp!) even a Folding@Home client on the way, let's judge the actual results.

    But think about it - Adobe's Photoshop is one of the best in its class, used by image pros everywhere - why would Adobe include nVidia acceleration if it produced poor-quality pics? Can anyone tell me why a Premiere accelerator would be any different?
     
  4. MrMonroe

    MrMonroe What's a Dremel?

    Joined:
    27 Dec 2007
    Posts:
    195
    Likes Received:
    0
    They are way scared.

    Intel's strategy throughout this entire fight has been to compare current nVidia and ATI technology to future Intel tech (and to misrepresent current GPU tech to begin with). When they finally do release Larrabee, CUDA will have even more force behind it, and even ATI cards will have jumped radically ahead of what Intel has projected.

    Long story short, Intel will succeed in making their products better at doing the things they are already good at, and they'll get nowhere near pushing nVidia or ATI out of their established markets. Too bad they will have wasted time trying.
     
  5. amacieli

    amacieli What's a Dremel?

    Joined:
    14 Feb 2008
    Posts:
    93
    Likes Received:
    1
    @MrMonroe: I don't think they're wasting their time - the vast majority of PC users use integrated graphics, and Larrabee will probably have the most impact in that segment. Why not bandy about comparisons with super high-tech to make yours look great ("hey, little guy, the stuff we're pumping out compares well with what the hard-core gamers use!")? But like you say, the day Intel starts to overtake nVidia for regular gamers will be the day a certain Biblical place becomes a magnet for skiing, if you catch my drift (no pun intended). Unless Intel just ups and buys nVidia (once it's had a chance to swallow its pride).
     
  6. chicorasia

    chicorasia What's a Dremel?

    Joined:
    8 Jan 2008
    Posts:
    84
    Likes Received:
    0
    Exactly. It is like claiming that current cars are nothing compared to future flying cars. Has anyone seen Larrabee-accelerated Photoshop? Or Premiere? Or Final Cut?

    Anyway, videos are made of pixels AND frames. If splitting the frame into pixels yields "poor" results, why not send a whole frame to each stream processor? A toy sketch of the idea:
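    (A minimal illustration of frame-level parallelism, assuming frames really could be encoded independently - true for intra-only coding, but inter-frame prediction would reintroduce the cross-frame dependencies Intel is talking about. The quantise-plus-zlib "encoder" is a stand-in, not any real codec.)

```python
# Toy frame-parallel encode: one whole frame per worker.
import zlib
from multiprocessing import Pool

import numpy as np

def encode_frame(frame: np.ndarray) -> bytes:
    # Stand-in for per-frame compression: coarse quantisation + zlib.
    quantised = (frame // 16).astype(np.uint8)
    return zlib.compress(quantised.tobytes())

if __name__ == "__main__":
    # 30 fake 64x64 greyscale frames in place of a real video.
    frames = [np.random.randint(0, 256, (64, 64), dtype=np.uint8)
              for _ in range(30)]
    with Pool() as pool:
        encoded = pool.map(encode_frame, frames)
    print(f"encoded {len(encoded)} frames independently")
```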
     
  7. salesman

    salesman Minimodder

    Joined:
    29 Apr 2007
    Posts:
    234
    Likes Received:
    3
    Intel had better do something amazing to back up statements like this.
     
  8. mclean007

    mclean007 Officious Bystander

    Joined:
    22 May 2003
    Posts:
    2,035
    Likes Received:
    15
    Photoshop editing is a different animal to video encoding. Applying a Photoshop transform or filter performs a specific mathematical calculation on each pixel. This is exactly what GPUs are excellent at doing. As stated in the article, video encoding involves two stages, because it is a lossy compression process - first, you need to analyse the moving image to determine how best to deploy your available "budget" of bits per frame. Then you use that "budget" to apply various mathematical models to pixels. In other words, the different parts of the image interact more with one another than in a Photoshop operation, and Intel is suggesting that this makes the CPU a superior platform for encoding.

    That said, there is no reason why you can't use the CPU for the analysis and the GPU for the brute force required to run the compression algorithms afterwards. I don't see how Intel can justify a statement that the image quality of GPU encoded video will necessarily be inferior.
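    (To make that two-stage split concrete, here is a toy sketch: stage 1 looks at the whole frame and shares a fixed bit budget across 8x8 blocks, stage 2 then quantises each block independently. Stage 2 is the embarrassingly parallel part a GPU could take over; stage 1 needs a global view of the frame. The variance-based budget is purely illustrative, not any real encoder's method.)

```python
import numpy as np

BLOCK = 8

def analyse(frame: np.ndarray, total_bits: int) -> np.ndarray:
    """Stage 1: global analysis - busier (higher-variance) blocks
    get a larger share of the frame's bit budget."""
    h, w = frame.shape
    blocks = frame.reshape(h // BLOCK, BLOCK, w // BLOCK, BLOCK)
    variance = blocks.var(axis=(1, 3)) + 1e-6
    return total_bits * variance / variance.sum()   # bits per block

def quantise(frame: np.ndarray, budget: np.ndarray) -> np.ndarray:
    """Stage 2: independent per-block maths, the GPU-friendly part."""
    out = frame.astype(np.float32)
    h, w = frame.shape
    for by in range(h // BLOCK):
        for bx in range(w // BLOCK):
            bpp = budget[by, bx] / (BLOCK * BLOCK)     # bits per pixel
            levels = int(np.clip(2.0 ** bpp, 2, 256))  # grey levels kept
            step = 256.0 / levels
            sl = (slice(by * BLOCK, (by + 1) * BLOCK),
                  slice(bx * BLOCK, (bx + 1) * BLOCK))
            out[sl] = np.round(out[sl] / step) * step
    return np.clip(out, 0, 255).astype(np.uint8)

frame = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
encoded = quantise(frame, analyse(frame, total_bits=8192))
```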
     
  9. badders

    badders Neuken in de Keuken

    Joined:
    4 Dec 2007
    Posts:
    2,642
    Likes Received:
    74
    Some clever dick will apply this well: a multi-threaded process runs on a multi-core CPU and decides which parts of the frame need what bitrates. The encoding is then done in CUDA on the GPU, leaving the CPU free to analyse the next part of the frame.
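    (A minimal sketch of that pipeline: the CPU analyses the next frame while the previous one is still being encoded. gpu_encode here is a hypothetical stand-in running in a worker thread, not a real CUDA launch.)

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def cpu_analyse(frame: np.ndarray) -> np.ndarray:
    # CPU stage: per-block variance stands in for the bitrate decision.
    blocks = frame.reshape(8, 8, 8, 8)
    return blocks.var(axis=(1, 3))

def gpu_encode(frame: np.ndarray, rates: np.ndarray) -> bytes:
    # Hypothetical stand-in for a CUDA kernel launch: just quantises.
    return (frame // 16).tobytes()

frames = [np.random.randint(0, 256, (64, 64), dtype=np.uint8)
          for _ in range(8)]

with ThreadPoolExecutor(max_workers=1) as gpu:
    pending, encoded = None, []
    for frame in frames:
        rates = cpu_analyse(frame)            # CPU analyses this frame...
        if pending is not None:
            encoded.append(pending.result())  # ...while the last one encodes
        pending = gpu.submit(gpu_encode, frame, rates)
    encoded.append(pending.result())

print(f"pipelined {len(encoded)} frames")
```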
     
  10. Zurechial

    Zurechial Elitist

    Joined:
    21 Mar 2007
    Posts:
    2,045
    Likes Received:
    99
    Not every developer knows C?
    I'm nothing more than an amateur coder, but I personally wouldn't imagine there are many professional developers who don't have a grasp of C...
     
  11. amacieli

    amacieli What's a Dremel?

    Joined:
    14 Feb 2008
    Posts:
    93
    Likes Received:
    1
    @mclean007 - your second paragraph is certainly what I had in mind - it hits the issue on the head - and like I said, the CPU will certainly be required too. On the other side of the coin, nVidia is also spinning way more than it should about GPU capabilities. The GPU excels at many (but not all) math problems, but nobody's saying that the CPU is going to die in favor of the GPU.

    badders +1
     
  12. Bluephoenix

    Bluephoenix Spoon? What spoon?

    Joined:
    3 Dec 2006
    Posts:
    968
    Likes Received:
    1
    badders +2
     
  13. Tyinsar

    Tyinsar 6 screens 1 card since Nov 17 2007

    Joined:
    26 Jul 2006
    Posts:
    2,287
    Likes Received:
    28
    Shades of Windows 3 vs. OS/2 - tons of F.U.D. and little substance - I hope it doesn't turn out the same.
     
  14. Icy EyeG

    Icy EyeG Controlled by Eyebrow Powers™

    Joined:
    23 Jul 2007
    Posts:
    517
    Likes Received:
    3
    badders +3, indeed! :idea:

    How about GPULib, a library of mathematical functions for Very High Level Languages (Java, Python, MATLAB, IDL)?

    My guess is that more examples like this will follow, allowing the use of CUDA from Very High Level Languages (I use Perl a lot, so I would definitely make use of a "CUDA extension"...).
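    (One real instance of the idea already exists for Python: PyCUDA compiles and launches CUDA kernels from Python code. A minimal example, assuming an nVidia GPU and the pycuda package are available - the trivial kernel here has nothing to do with GPULib itself.)

```python
import numpy as np
import pycuda.autoinit                  # initialises a CUDA context
import pycuda.driver as cuda
from pycuda.compiler import SourceModule

# Compile a tiny CUDA kernel from Python at runtime.
mod = SourceModule("""
__global__ void scale(float *a, float s)
{
    int i = threadIdx.x + blockIdx.x * blockDim.x;
    a[i] *= s;
}
""")
scale = mod.get_function("scale")

a = np.arange(256, dtype=np.float32)
# InOut copies the array to the GPU and back around the kernel launch.
scale(cuda.InOut(a), np.float32(2.0), block=(256, 1, 1), grid=(1, 1))
print(a[:4])   # [0. 2. 4. 6.]
```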

    Moreover, here's a crazy thought: couldn't nVIDIA develop a Larrabee-like GPU using cores with an ARM or Power architecture (since they don't have an x86 license)?
     
  15. Tyinsar

    Tyinsar 6 screens 1 card since Nov 17 2007

    Joined:
    26 Jul 2006
    Posts:
    2,287
    Likes Received:
    28
    or buy Via :naughty:
     
  16. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
    lol, why can't we all get along and have the CPU and GPU work together instead of fighting each other? lol
     
  17. DXR_13KE

    DXR_13KE BananaModder

    Joined:
    14 Sep 2005
    Posts:
    9,139
    Likes Received:
    382
    Is it just me, or does Intel have a bull's anus for a mouth? They're always talking BS...
     
  18. Haltech

    Haltech What's a Dremel?

    Joined:
    13 May 2008
    Posts:
    9
    Likes Received:
    0
    Summary of both companies:
    Intel - jack of all trades, master of none.
    Nvidia - master of graphics and parallel processing, doesn't do anything else.

    Why can't there be a middle ground?
     
  19. Mentai

    Mentai What's a Dremel?

    Joined:
    11 Nov 2007
    Posts:
    758
    Likes Received:
    1
    So Intel are losing a couple of major apps to GPUs. Big deal. People are still going to want/need fairly decent CPUs; Intel probably won't sell any fewer, it just means nVidia will sell more. Given that Intel are directly invading nVidia's market space, and not the other way round, I don't see why they're always bitching about everything. Sheesh.
     
  20. C0nKer

    C0nKer What's a Dremel?

    Joined:
    25 Dec 2005
    Posts:
    329
    Likes Received:
    2
    Exactly. Engineers of all disciplines, heck, even high school kids in India know C.
     