
Blogs Nvidia: Remember who got you where you are today

Discussion in 'Article Discussion' started by Cutter McJ1b, 12 Jan 2010.

  1. Guest-16

    Guest-16 Guest

    Where do you think every GT200b in the last few months has gone? Tesla cards. Or, so I hear from a few industry peeps.

    Nvidia Stereoscopic 3D is a gimmick. It encourages people to use poor-quality TN panels and gets people onboard through the novelty effect. You can't game for hours in it; it gives everyone who has tried it in the office a headache. There is a Stereoscopic 3D kit sitting in the lab and no one ever uses it :p

    I'm not saying 3D as a general rule is arse - people love Avatar, for example - it's just that I strongly don't believe Nvidia's locked-down proprietary solution is right for our "open" PC market.

    AMD's lazier approach is not the right answer either, but whatever happened to interoperability?

    SNiiPE_DoGG: I hear you and I agree, but what I'm saying is that the change is unceremonious and feels somewhat underhanded, which is a shame. I feel sorry for the companies that have grown up around Nvidia to support them.
     
  2. SNiiPE_DoGG

    SNiiPE_DoGG Engineering The Extreme

    Joined:
    14 Apr 2009
    Posts:
    533
    Likes Received:
    39
    I don't agree with that at all (DB) - the money is not in the high-end GPU market anymore. That's a fact.

    I meant there are two clear paths here:

    Tegra: spend a lot on R&D --> manufacture for relatively small cost --> sell in unit volumes of tens of millions for good profit.

    High-end consumer GPU: spend a lot on R&D --> manufacture for high cost --> sell in unit volumes of less than 5 million for a slim profit margin.

    It's pretty clear from this simple comparison that the money is not in high-end VGA at all anymore. Not to mention the points I listed in my previous post.

    EDIT: Bindi - yes, I know, it really is a shame that they hang them out to dry like that :\ especially after so many years of passing all of the CS and RMA work off to them so they could handle the idiot masses
     
  3. Guest-16

    Guest-16 Guest

    Oh I agree totally. But by this argument how long before performance PC gaming is in the *******?

    In the end we could have a scenario where Nvidia makes an architecture for consoles, then ships the same derivative for the PC market and keeps it until a new console arrives. Microsoft/Sony pay for the development, and the PC market is farmed off as an afterthought that will always buy something?

    Also, I do believe I successfully provoked a discussion today: Win me :cooldude:
     
  4. DarthBeavis

    DarthBeavis New Member

    Joined:
    29 Aug 2008
    Posts:
    480
    Likes Received:
    30
    Did I say the money was in the high-end market? Read my post again. I said "The casual gamer is not even the demographic where the dollars go", meaning the dollars are not in the enthusiast or casual market. The threshold is even lower than casual: most PCs sold have GPUs that don't even meet OUR definition of a casual gamer's. That is not to say the high end is dead, as Nvidia also has its powerhouses and cash cows, Tegra and Tesla. The advancements in those two markets will transfer to the GeForce line, granted not immediately.
    Bindibadgi: I have probably come across many more people than you have in terms of using this technology, so I will go with my experiences on this one. The factor holding back the technology has been the limited diversity of 3D displays. That is changing this year.
     
  5. Guest-16

    Guest-16 Guest

    Fair enough mate, but I still don't agree that the fundamental 3D technology will change things for the general public for many years yet. TVs for movies, maybe, but gaming? I strongly doubt it.

    Personally I find it a fad and would invest in a quality, larger panel instead; however, I realise that many people like the "idea" that something is better - like a cheap HDTV - when it actually isn't. Marketing, not quality, sells products at the end of the day.

    I've yet to play a "3D" title that looks like anything more than layered 2D with one trick: throwing things out of the screen.

    Also: will it be the case that we stop evaluating PC components on their own and in comparison to one another, given the push to proprietary standards? We already have an opening scenario where you must purchase one type of graphics card to play one game in full: Batman for PhysX, Avatar for Stereoscopic 3D and, in some respects, STALKER for DX11 - although that will change soon.

    For all Intel's or AMD's wrongs, at least they got it right with things like PCI, AGP, PCI Express and HyperTransport.

    Also: Tegra is taking products from the GeForce line, not the other way around ;)
     
  6. DarthBeavis

    DarthBeavis New Member

    Joined:
    29 Aug 2008
    Posts:
    480
    Likes Received:
    30
    Bindibadgi, I understand the Batman deal, and the genesis of that issue is debatable (I have heard different people describe what happened, with differing explanations as to who caused the issue). You are dead wrong about the Avatar game, though. I have also worked with iZ3D (in fact I worked with them while I worked with Nvidia on a project BEFORE Nvidia did their own 3D, and almost had iZ3D lined up as the 3D component - then Nvidia had me go another route, for obvious reasons). iZ3D told me both they AND Nvidia have been included in the 3D development for Avatar. NO proprietary action there. Both iZ3D and Nvidia will be at PDXLAN 15 this weekend and will have 3D demos up and running.
    I love competition and have done projects for Intel and AMD/ATI - I support all three companies.
    I think it does not really matter which direction the technology flows in the vertical Nvidia chain, so long as it flows in whatever direction benefits the end user.

    I agree 3D is not the be-all and end-all; it is a value-added feature. All the monitor vendors and even DirecTV back me up on that assessment ;)
     
  7. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    3D TVs using IPS panels still have the same issues as the TN panels, namely severe headaches and even nausea. I was speaking off the record to a number of execs from TV manufacturers during CES, and the vast majority agreed with me that 3D is a fad - but it's here, and so they've got to push it. They're just hoping that, with a united push, consumers will buy into it, because it'll increase ASPs for both hardware and content (I'm mainly talking TVs/3D Blu-ray here rather than games).

    Every seasoned journalist I speak to says the same, and it's not as if I'm new to this game either.

    3D works in the cinema because the big screen fills your peripheral vision. A 50in TV quite simply does not, and if you combine that with glasses that refresh at 60Hz, it's a recipe for unpleasantness. It's like using a 60Hz monitor all day, only worse, because the sense of depth around the edge of the screen is completely fubared and you end up getting a headache, feeling disorientated and looking like a tw*t. ;)
     
  8. SNiiPE_DoGG

    SNiiPE_DoGG Engineering The Extreme

    Joined:
    14 Apr 2009
    Posts:
    533
    Likes Received:
    39
    You hit the nail on the headache.
     
  9. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    If it's a value-added feature, why are some television manufacturers only bundling one pair of active shutter glasses? Pay more for the TV (because it's got 3D), then pay more again for an extra three pairs of glasses (Nvidia's are £115 a pop) because your wife and two kids want to sit and watch TV with you. Sounds like a value-add to me.

    DirecTV will say it's a value-add because they can (and probably will) charge more for their subscription service in 3D.
     
  10. Guest-16

    Guest-16 Guest

    Because they've got to sell something now that LCD prices are plunging and you get 24" monitors free in cereal packets these days.

    I didn't see this the first time:

    You are financially invested in it, so yes, I think you will try to push it as a benefit ;):)

    When I can watch 3D TV without having to sit there with a headache or stupid, expensive glasses, I'll be the first in the queue.
     
  11. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
  12. retrogamer1990

    retrogamer1990 What does 'Stock' Mean?

    Joined:
    12 Jan 2009
    Posts:
    319
    Likes Received:
    7
    Okay, so having read this article and discussion, it seems like nVidia have perhaps changed direction and aren't focusing on performance graphics anymore. Instead, they are focusing on their GPGPU systems and mass-market consumer products, while putting gaming on the back burner.

    On the one hand, this is quite possibly a genius move from the company's perspective; for several reasons it definitely makes sense. With the rise of GPGPU applications predicted (albeit by nVidia themselves), the users already demanding high-performance/server computing could switch to nVidia's Tegra platform if they can engineer a high-performing product. I believe this is a rather large market.
    Also, if nVidia continue to develop GPGPU programming, a la CUDA, they could expand this market into other fields, possibly breaking into general, everyday areas of computing. This would give nVidia significant advantages over competitors if they own the rights to the technology that becomes mainstream.
    Obviously they are also targeting the more mainstream mass market (casual gamers) too - look at the high volume of GT250 GPUs, as outlined in the article. Am I right in thinking these are the cheapest CUDA-enabled GPUs too? This could work well for nVidia if they take advantage of the above theory. Don't forget about consoles either: the next-gen consoles will be capable of far more than gaming, and my opinion is that they will turn into HTPCs with gaming as an add-on.

    On the other hand... it creates a distortion in 'our' market: enthusiasts.
    If nVidia, quite rightly, go where the real money is and leave AMD/ATI to gaming graphics, where is the incentive for progress? We may never play Crysis at 60FPS, people! Competition is clearly an issue for companies, but for consumers it is highly beneficial: innovation, progress and ultimately lower prices are all driven by competition. If nVidia reduce their presence in the market, or ATI even follow suit, where does this leave us? We may be left with a few sub-par, expensive gaming cards to play with.

    Well, that's my two cents anyway. I'm off to bed.
     
  13. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
    As long as people come back to the PC route (sadly most don't think that way) it's fine. That said, I miss the days of 2004-2007, when you had GPU battles. People could actually take sides, as every GPU broke the limits of the previous generation by quite a bit.
     
  14. Horizon

    Horizon Dremel Worthy

    Joined:
    30 May 2008
    Posts:
    765
    Likes Received:
    10
    Ughh :shudders: I need to go lie down now.
     
  15. ragman

    ragman New Member

    Joined:
    21 Sep 2005
    Posts:
    2
    Likes Received:
    0

    We are already there with software/games, hardware is just a matter of time.
     
  16. SNiiPE_DoGG

    SNiiPE_DoGG Engineering The Extreme

    Joined:
    14 Apr 2009
    Posts:
    533
    Likes Received:
    39
    While I love PC gaming as much as all of you - and I am not now, but once was, extremely hardcore about it - I'm going to play devil's advocate here.

    What if the future of gaming is not high-res, moddable games that require a $3,000 computer to play? What about a unified resolution between consoles and PC - 1920x1080 for now, though of course it could be raised in the future (2560x1440)?

    I can see quite a few benefits to this model (for argument's sake I won't put in the drawbacks): it could make our machines draw less power, run cooler and ultimately be cheaper as we reach that threshold of performance (we are already there for this generation); developers might put more games out on the PC; and PC gaming could become more accessible, not necessarily to idiots but to lower income brackets.

    I see a few benefits there that would be good. Not that it's the outcome I want, but IMO it's better than the death of gaming computers altogether.
     
  17. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
    The only problem is that console developers know that making a new console now is unprofitable.

    If we could take your middle road, I wouldn't mind at all, seeing as everyone wins - well, except people who like 16:10.
     
  18. PureSilver

    PureSilver E-tailer Tailor

    Joined:
    16 Dec 2008
    Posts:
    3,152
    Likes Received:
    235
    Aren't you based in London? Tell ya what, I'll swing by at Easter when I get back, you can give it to me, and I'll try it for a couple of months. I'll even indemnify you against claims for damages and write you a review at the end of it.

    Whaddya say? :D
     
  19. barndoor101

    barndoor101 Bring back the demote thread!

    Joined:
    25 Oct 2009
    Posts:
    1,694
    Likes Received:
    110
    The whole 3D thing is a gimmick - I saw Avatar recently and thought 'my eyes hurt'; it didn't even seem as though 3D was vital to the film. I saw Coraline a while back and there were only TWO 3D effects in the whole film.

    I did laugh when Nvidia pushed their proprietary 3D format as a standard. It made me think about the other times they have pushed 'standards' which only benefit themselves and hurt consumers, i.e. CUDA and PhysX. In the case of PhysX they even blocked using an Nvidia card alongside an ATi primary (which was probably the only time I was going to give money to Nvidia).

    This also made me laugh:

    Some cash cow it is - MS have sold millions of Zune HDs, after all ;)

    Nvidia are looking more and more likely to pull out of our beloved enthusiast market, and personally I don't trust ATi in a one-horse race after the HD5000 pricing (although, to be fair, it wasn't all their fault).
     
  20. SNiiPE_DoGG

    SNiiPE_DoGG Engineering The Extreme

    Joined:
    14 Apr 2009
    Posts:
    533
    Likes Received:
    39
    Lol, you're calling ATI's 5K series pricing bad? Someone clearly doesn't remember the launch of the GTX 2XX series :laugh: $800+ for the watercooled GTX 280!
     