
News Intel says video encoding belongs on the CPU

Discussion in 'Article Discussion' started by CardJoe, 4 Jun 2008.

  1. desertstalker

    desertstalker What's a Dremel?

    Joined:
    7 Feb 2004
    Posts:
    73
    Likes Received:
    0
    Don't be so sure. My uni just changed its first-year CS material to Python...
     
  2. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    interesting info!
     
  3. Cthippo

    Cthippo Can't mod my way out of a paper bag

    Joined:
    7 Aug 2005
    Posts:
    6,783
    Likes Received:
    102
    It's going to take nVidia about a week to come up with a CUDA video encoder which will make Intel look pretty stupid. The bit about quality is pure FUD because that's going to come down to how the software is written.
     
  4. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,509
    Likes Received:
    503
    Come on! C is the most basic language; how can any computer, embedded-systems, or hardware engineer not know C? How would they survive?

    The accuracy point is well taken, though. BOINC distributed computing doesn't use the GPU because, they say, it's not accurate enough to produce reliable results.
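
    Here's a toy C snippet of my own (illustrative only, nothing to do with BOINC's actual validation code) showing the kind of single-precision drift that worries them; GPU arithmetic of the day is 32-bit float only:

    [code]
    /* Toy illustration: long accumulation in 32-bit float (all that early
       GPUs offer) drifts badly, while the CPU's 64-bit double holds up. */
    #include <stdio.h>

    int main(void)
    {
        float  sum_f = 0.0f;   /* single precision */
        double sum_d = 0.0;    /* double precision */

        for (int i = 0; i < 10000000; i++) {
            sum_f += 0.1f;
            sum_d += 0.1;
        }

        /* The float total lands visibly away from the expected 1000000;
           the double total is correct to several decimal places. */
        printf("float:  %f\n", sum_f);
        printf("double: %f\n", sum_d);
        return 0;
    }
    [/code]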
     
  5. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,213
    Likes Received:
    1,286
    Not knowing C? Over here we don't get to learn C until the 5th semester at the earliest. First comes Scheme, then Java. If you're interested you can learn all that web-based stuff as well, but the main language we're taught is Java.
    [/offtopic]

    I think this whole "battle of encoding" is pure marketing. We'll see who'll come out as a winner in this one but I have a feeling that both are right... somewhere... a bit... maybe. :)

    Oh, and badders +4. ;)
     
  6. yakyb

    yakyb i hate the person above me

    Joined:
    10 Oct 2006
    Posts:
    2,064
    Likes Received:
    36
    AFAIC, CUDA and video encoding via CUDA are at an early stage of development compared to the seasoned encoders we have for the CPU. Heck, the DivX codec is ten years old, and people have been writing encoders for it all that time.

    As HDD sizes increase, people will be asking for greater quality (and therefore less compression). Could that be a realm for the GPU or the CPU? Who knows. I personally believe the GPU is the next major leap, and as soon as a few freeware encoding apps (along the lines of StaxRip or Handbrake) are released that utilise CUDA, Nvidia will start to win out.
     
  7. ecktt

    ecktt What's a Dremel?

    Joined:
    10 Jun 2008
    Posts:
    1
    Likes Received:
    0
    Sigh...
    If the Nvidia GPU has all that branching power, then they're wasting silicon that would have been better utilized for pushing pixels or massively parallel calculations. Intel is right in that a general-purpose processor is much better suited to branching. How much of that is required for video encoding is beyond me, because I've not written any code for that type of application. I do understand Intel's point about the IQ of the video, as I have some understanding of what goes on during video encoding/compression. That said, any programmer worth his salt knows ANSI C.
    Anyway, a hybrid approach to video encoding utilizing both types of processor is probably the best: the CPU for the branching and the GPU for the parallel calculations. Someone asked why not have a hybrid processor. Well, you have to ask how much silicon would have to be partitioned for matrix-type calculations and how much for branching, and for every type of task the answer would be different. As is, CPUs have some degree of Single Instruction Multiple Data (SIMD) capability: Intel calls it SSE/2/3/4/..., and AMD called theirs 3DNow!, though I think AMD adopted SSE a while ago. My point here is that while Intel might be full of marketing, what they're saying isn't necessarily BS, and their processors have been capable of doing similar types of calculation to a GPU for quite some time (even before GPUs were marketed, although nowhere near as fast).
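
    To make the SSE point concrete, here's a minimal sketch of my own (not anything Intel or AMD ship): four floats processed per instruction, with a data-dependent "branch" handled the way wide parallel hardware has to handle it, by computing both sides and blending through a mask:

    [code]
    /* CPU SIMD via SSE: one instruction operates on four floats at once.
       Note there is no real branch below -- "if (x < 0) x = -x" becomes
       compute-both-and-mask, which is exactly why branch-heavy encoder
       logic maps poorly onto wide parallel units. */
    #include <stdio.h>
    #include <xmmintrin.h>   /* SSE intrinsics */

    int main(void)
    {
        float in[4] = { 1.0f, -2.0f, 3.0f, -4.0f };
        float out[4];

        __m128 x    = _mm_loadu_ps(in);
        __m128 zero = _mm_setzero_ps();

        __m128 mask = _mm_cmplt_ps(x, zero);         /* lanes where x < 0 */
        __m128 neg  = _mm_sub_ps(zero, x);           /* -x in every lane  */
        __m128 abs  = _mm_or_ps(_mm_and_ps(mask, neg),
                                _mm_andnot_ps(mask, x));

        _mm_storeu_ps(out, abs);
        printf("%f %f %f %f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }
    [/code]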
     