
Other What's the most we should expect from computing in the next five years?

Discussion in 'Hardware' started by rainbowbridge, 25 Sep 2009.

  1. rainbowbridge

    rainbowbridge Minimodder

    Joined:
    26 Apr 2009
    Posts:
    3,171
    Likes Received:
    69
    Everything is always rolling along very nicely, thank you very much, with small increases in power, capacity and overall niceness.

    But it's not a Star Trek holodeck, is it?


    What do you think are fair expectations of the computing market over, say, the next five years? Do you expect to see real-time ray tracing of massive scenes? Do you expect to be able to buy a GPU-to-mind transfer unit for VR? Would you like to see an overall size reduction of a computer system down to a box which does what you would expect it to do in five years' time?


    What do you expect from computer systems in five years?


    Feel free to be blunt about what you expect and what you would prefer to see, i.e. what could be possible if certain companies worked together, putting profit to the side (heaven forbid).


    Large steps in progress are possible if we dream. I hope they dream.

     
  2. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    I just hope nvidia is still around in 5 years XD - with the way they are going, it's not looking too good.
     
  3. Diosjenin

    Diosjenin Thinker, Tweaker, Et Cetera

    Joined:
    14 Jul 2008
    Posts:
    777
    Likes Received:
    54
    Yeah, I hope so too. I want the competition in the marketplace, and it looks right now like Larrabee won't compete on the level I first thought it would (not that I really want Intel getting itself into one more hardware market anyway). That leaves GT300, and who knows what's going on with that. Plus I have a friend who just graduated last spring who's working there now, and it would suck for him to lose his job so soon after he got it.

    I guess I can see them declaring bankruptcy, maybe a leadership shake-up, but a company like nVidia going under I don't think I can see happening. Heck, maybe they can just go to the feds and say "we go under, 55% of the computers out there have no means of support for their graphics chips" and get a bailout. :hehe:


    Huh. By the end of 2014/early 2015...


    - Graphics: I think we may very well see the first raytracing-capable cards, but hopefully not discrete cards. My guess is that, since GPUs have been moving towards greater functionality anyway (up until now it's been things like tessellation, limited x86 programming capabilities, etc.), we'll first see it popularized by GPUs that can both rasterize and ray-trace. Whether there will be a generation or two around already by 2014/15 or whether they're just starting to come out by then will be an interesting call; the first thing that will need to happen is a unified software architecture - either integrated with or separate from DirectX and/or OpenGL - so the first step is that MS needs to get on the ball on that front.

    In either case, that would leave developers free to at least start writing games with raytracing engines that they know would then be compatible with anyone who owned a dual rasterizing/raytracing-capable card. I think that's the only way you're going to see raytracing move forward on a large scale - particularly the dual-capable GPU step. You need that backwards compatibility. Otherwise, you'll see third-party companies jumping in and a mess of hardware and software incompatibility the way it was in the early days of GPUs. Hopefully we're above allowing such an ugly technology shift by now.

    I know Larrabee is already capable, but software compatibility is still in question, and hardware compatibility is an issue regardless. If people can only raytrace on a Larrabee, you won't see one game that will use it. Whether nVidia will still be around in five years, I don't know - I hope they will be - but you can bet ATI will be, and when everyone except Intel has cornered virtually all of the gaming market regardless, nobody with a brain in their head will write a game that can only play on a product with <1% market share, much less 10% market share. But if ATI/nVidia jumps on board, and the software calls are compatible across the board, yeah, you'll at least see that some developers will start taking advantage of the new capabilities.

    EDIT: It occurred to me that you'll really see it get kickstarted when a next-gen console has a raytracing-capable GPU - especially if that console is Microsoft-made, since they seem to have the greatest number of PC ports. Whether that will happen with the next generation of consoles or the one after is harder to tell.
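
    To put a bit of substance on "raytracing-capable": the core primitive such hardware has to accelerate, millions or billions of times per frame, is the ray/object intersection test. A minimal ray/sphere version looks something like the sketch below - plain C++ purely for illustration, with names of my own choosing; real hardware would run this massively in parallel.

    ```cpp
    #include <cmath>

    struct Vec3 { double x, y, z; };

    static double dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Returns true if a ray (origin o, unit direction d) hits a sphere of radius r
    // centred at c; on a hit, t is the distance along the ray to the nearest point.
    bool hitSphere(const Vec3& o, const Vec3& d, const Vec3& c, double r, double& t) {
        Vec3 oc = { o.x - c.x, o.y - c.y, o.z - c.z };
        double b    = dot(oc, d);                      // half of the usual 'b' term (d is unit length)
        double disc = b * b - (dot(oc, oc) - r * r);   // quadratic discriminant
        if (disc < 0.0) return false;                  // ray misses the sphere entirely
        double t0 = -b - std::sqrt(disc);              // nearer of the two intersection distances
        if (t0 < 0.0) return false;                    // sphere is behind the ray origin
        t = t0;
        return true;
    }
    ```

    A rasterizer never needs that test, while a raytracer does little else - which is part of why a dual rasterizing/raytracing GPU is more than a driver tweak.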


    - CPU: The first 8-cores will be here in two years, that's not even a question. 16-cores, almost certainly within 5 years. 32? Harder to say. Both Intel and AMD are putting a lot of energy into integrating graphics chipsets onto their CPUs, so that might slow down the core arms race long enough to delay 32-core CPUs beyond five years.

    One thing that will have to come out soon is a tool of some kind that allows software engineers an easier way to program multiple threads. It's not so bad when you're up to four, or when you're running a task that's fairly parallel anyway (like rendering). But balancing a workload between eight, sixteen, thirty-two cores manually? It gets insane.

    This is what I really see as the next big jump in software engineering. First there was the jump from assembly to higher-level languages, then there was the jump to object-oriented programming, and now (I think) the next big one is going to be automated or semi-automated multithreading. That has to come in some form or another within the next ten years, because the alternative is to manually balance upwards of 32, 64, 128(!) threads. At some point, a human just can't do that anymore - and the core race becomes utterly meaningless if someone can't find a way for a machine to do it for them.
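
    For a taste of what "semi-automated multithreading" already looks like, here's a minimal OpenMP-style sketch (the loop and data are purely illustrative): the programmer marks the loop, and the compiler and runtime split it across however many cores the machine happens to have, instead of anyone hand-balancing 32 threads.

    ```cpp
    #include <cstdio>
    #include <vector>

    int main() {
        const int N = 1000000;
        std::vector<double> data(N, 1.0);
        double sum = 0.0;

        // One pragma: the runtime divides the iterations among all available cores
        // and combines the per-thread partial sums at the end - no manual thread
        // creation, no locks. (Build with OpenMP enabled, e.g. g++ -fopenmp.)
        #pragma omp parallel for reduction(+:sum)
        for (int i = 0; i < N; ++i) {
            sum += data[i] * data[i];
        }

        std::printf("sum = %f\n", sum);
        return 0;
    }
    ```

    Whether the eventual answer is OpenMP-like pragmas, smarter compilers, or something else entirely is anyone's guess, but that's the general shape of it.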

    Architectural specifics are a harder thing to predict. Beyond Westmere on Intel's side (like Sandy Bridge) and whatever the new non-Fusion cores are on AMD's side (like Bulldozer), we don't really know much of anything about the specifics of what's coming beyond when the next die shrinks are supposed to hit. The big pushes will be in integration and power usage, though - all a result of more people continuing to move to laptops, which means they need to push for more battery life, etc.


    - Storage: A 250GB SSD you can actually afford. :hehe: No, seriously...

    Well, we certainly will start to see affordable SSDs, that's for sure. Given that they've (roughly) halved in price for a given capacity per year or so, by 2014/15 it's the 1TB SSDs that will be in the $350 range. So no, they still won't be at current HDD price per gigabyte levels, but they have well over an order of magnitude to traverse in that area - the fact that they can almost get there in five years is an achievement in its own right.
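
    (The arithmetic behind that is just compounding - after n years of halving, you're paying 1/2^n of today's price per gigabyte, whatever today's figure happens to be. A trivial sketch:)

    ```cpp
    #include <cstdio>
    #include <cmath>

    int main() {
        // "Prices halve every year" compounds quickly: after five years the cost
        // per gigabyte is 1/32 of today's - roughly an order and a half of magnitude.
        for (int year = 1; year <= 5; ++year) {
            std::printf("after %d year(s): 1/%.0f of today's price per GB\n",
                        year, std::pow(2.0, year));
        }
        return 0;
    }
    ```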

    My guess is that you'll start seeing them integrated in "Best Buy machines" on a large scale within two years, and beyond three or four years they'll be nearly ubiquitous. Beyond five? Frankly, it's hard to say if traditional hard drives will still be around in any respectable capacity at all.

    The march of SSDs towards their future as the storage medium of choice is inevitable. The more interesting question is where the major HDD manufacturers will be by then. Seagate and Western Digital have both purchased some SSD players (and Samsung is a player), so they'll still be around, but I think they'll probably be relegated to relative obscurity in the field unless they a) strike a deal with OEMs, or b) make their own controllers, and make them competitive with Intel. Fujitsu and Hitachi are both conglomerates, so they may exit the market, but it's not like they're going under...


    - Other (just for fun):

    > USB 3.0 will be ubiquitous. People will have already started complaining about the transfer speed.
    > Dell's 24" 16:10 monitors with IPS panels that everyone so lusts after will still cost upwards of $700.
    > Popular OEM cases will have started coming out in series, just like AAA game franchises. Five years from now, for example, Antec will have the 903 and 904, the 302 and 303, and the P184 and P185.
    > The new game consoles will be mature:
    >> If Sony has no death wish, then the PS4 will have a launch cost of $400 max. They will be obsessed with motion control after getting their ass handed to them so badly by Nintendo, but they'll never really get their developers to use it as thoroughly as they'd like. It won't matter, though, because nobody cares about motion control on a non-Nintendo console anyway.
    >> Microsoft's new Xbox will not be titled the 'Xbox 2Pi,' which will make me a sad panda. It will launch with a new Halo game (which Bungie will again swear is their last). They will be obsessed with motion control after getting their ass handed to them so badly by Nintendo, but they'll never really get their developers to use it as thoroughly as they'd like. It won't matter, though, because nobody cares about motion control on a non-Nintendo console anyway.
    >> Nintendo will come up with something very unique and a little weird, everyone will wonder if it will sell, and it will sell like hotcakes, because they're Nintendo, darn it!


    - Diosjenin -
     
    Last edited: 25 Sep 2009
  4. she'shighvoltage!

    she'shighvoltage! What's a Dremel?

    Joined:
    24 Sep 2009
    Posts:
    74
    Likes Received:
    0
    ^No one can top that.
    +rep

    Also I want good-looking, functional single-slot coolers.
     
  5. she'shighvoltage!

    she'shighvoltage! What's a Dremel?

    Joined:
    24 Sep 2009
    Posts:
    74
    Likes Received:
    0

    FFF, doublepost.
     
    Last edited: 25 Sep 2009
  6. Journeyer

    Journeyer Minimodder

    Joined:
    31 Aug 2006
    Posts:
    3,039
    Likes Received:
    99
    But the question is: will we have PC cup holders that won't break under the weight of the cup by then?
     
  7. Burnout21

    Burnout21 Mmmm biscuits

    Joined:
    9 Sep 2005
    Posts:
    8,616
    Likes Received:
    197
    5 years is a little too short a time frame for big changes.

    Graphically we are near to photorealism, so I would expect this to come around very soon.

    SSD storage will become more mainstream as vendors increase production, which drives down the cost - much like how SD memory has tumbled in price.

    I think the consumer-level CPU will stop at 8 cores; I would be very surprised if it ever went any higher. Although I can't see many normal households requiring 8 cores to surf the net and use MS Word!

    Nettops will be everywhere, with 1080p playback thanks to the likes of Nvidia with ION, and AMD's own development of low-power CPUs to rival the Atom. But AMD will offer the whole package at a greatly reduced cost.

    I hope Blu-ray comes down in price much like DVDs did, so our little media centres - now mostly AMD-based - will have full uncompressed audio and hi-def video playback at little to no extra cost.

    Home networking is already booming, media servers are already appearing from the likes of HP, and there's further development with the likes of FreeNAS.

    So our desktops will have small SSDs for boot and apps, but the main bulk of our files will sit on a central household server - a couple of terabytes.
     
  8. Diosjenin

    Diosjenin Thinker, Tweaker, Et Cetera

    Joined:
    14 Jul 2008
    Posts:
    777
    Likes Received:
    54
    But it has to go higher, or neither Intel nor AMD has a future. That's why we have multicore consumer CPUs at all - you can only get so much performance out of a single core. I'm not at all saying the average consumer will need them, but the CPU makers have to push something, or they go under.
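
    (The caveat for the "nobody needs them" side of the argument is Amdahl's law: extra cores only speed up the fraction of the work that actually runs in parallel. A quick back-of-the-envelope - the 75% parallel figure below is just an assumption for illustration:)

    ```cpp
    #include <cstdio>

    // Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel
    // fraction of the workload and n is the number of cores.
    double amdahl(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }

    int main() {
        const double p = 0.75;                  // assumed: 75% of the work parallelises
        const int coreCounts[] = {2, 4, 8, 16, 32};
        for (int n : coreCounts) {
            std::printf("%2d cores -> %.2fx speedup\n", n, amdahl(p, n));
        }
        return 0;                               // tops out at 4x no matter how many cores
    }
    ```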
     
  9. Horizon

    Horizon Dremel Worthy

    Joined:
    30 May 2008
    Posts:
    765
    Likes Received:
    10
    Low-power computing is pretty much where it's going to be at in 5 years - powerful CPUs that sip power with a pinky in the air.

    Hopefully Atom CPUs will be a drop-in solution and not a CPU/mobo affair.

    Either nettops will be in full swing, or DFI will have caught on to the next big thing.

    With AMD/Intel moving the IGP onto the CPU, Nvidia's chipset business will be in shambles; ION(X) will go up in smoke and be listed as an endangered species.
     
  10. smc8788

    smc8788 Multimodder

    Joined:
    23 Apr 2009
    Posts:
    5,974
    Likes Received:
    272
    In 5 years I expect the same sort of progress as has been made since 2004.

    You only have to look at Intel's roadmap to see where the CPU market is going, but I also expect more evidence of a move towards cloud-based computing, although we're still a long way off that at the moment.

    As for the graphics market, that seems a bit more volatile at the moment so I'm not sure. I don't think you can say ATI are safe and nVidia are in trouble, because that would just be going on the current situation and ignoring past evidence. If anything, I could see ATI going down the pan before nVidia do, especially with AMD seriously on the back foot and losing an increasing proportion of market share to Intel.
     
  11. Dosvedagna

    Dosvedagna Justice!

    Joined:
    22 Sep 2009
    Posts:
    65
    Likes Received:
    2
    I think, at the rate that technology is consistently changing and improving, we could see some pretty cool features come into play.

    I mean, I've heard rumours that scientists have successfully generated fully immersive holographic and 3D images that project off a screen behind a projector - one day that tech's going to be mainstream, right?

    How amazing would that be?

    Realistically, though, I see home entertainment merging into the computer; people are lazy and they like all their eggs in one basket. I'm sure you may have seen the prototype builds for those coffee table computers that Microsoft has been developing, and they look impressive.
    I can quite easily see a computer in a decade's time (maybe not 5 years) that will be everything you need - your TV, phone line and internet all running through it, all in your living room.

    There are also some interesting developments going on into how people interact with computers. I mean, think how conventional it is to use a keyboard and mouse; there are people out there trying to conceive other ways to use a computer - and some of them are really cool.
    One which I saw on BT Yahoo news a couple of months back was a holographic projection of a sphere that detected your hand movements by how you were disrupting the light from its emitter, and moved the mouse on the screen accordingly.
    For example, if you squeezed the hologram, it was the equivalent of a click.

    There's some seriously cool stuff being developed out there - I can't really guess at which direction it will head in.

    I'm just a technology nerd who's excited at the prospect of it all. The stuff I've read so far here is immensely cool to think about, wouldn't you say?
     
