
Alienware Releases Area 51 and Graphics Amplifier

Discussion in 'Article Discussion' started by Combatus, 29 Oct 2014.

  1. Combatus

    Combatus Bit-tech Modding + hardware reviews Lover of bit-tech Super Moderator

    Joined:
    16 Feb 2009
    Posts:
    2,761
    Likes Received:
    89
  2. Guinevere

    Guinevere Mega Mom

    Joined:
    8 May 2010
    Posts:
    2,484
    Likes Received:
    176
    I love the amplifier idea, but wouldn't it be wonderful if the proprietary connector wasn't proprietary. After a few darker years it sure looks like PC gaming is here for the long haul. It's about time there was a universal solution to connect the two opposing 'wants' of computing.

    A) A lightweight laptop with long battery life.
    B) A rig capable of gaming with decent res/detail.

    Tech like this could be it. A laptop like a 13" Air with 12hr mobile battery life, paired with a decent GPU at the desk to drive that 4K screen.

    My laptop has more than enough CPU grunt to game at silly resolutions, but for many titles the mobile GPU won't push things to the max. My desktop has a mid-range GPU and a five-year-old i7 and is still more than good enough.

    I want to glue them both together with gaffa tape and velcro... maybe it'll be possible one day.

    But my laptops are always Apples, so this will never happen.
     
  3. Icy EyeG

    Icy EyeG Controlled by Eyebrow Powers™

    Joined:
    23 Jul 2007
    Posts:
    517
    Likes Received:
    3
    Exactly. A standard is really needed for this. Maybe USB 3.1 will change the landscape; does anyone know if we'll be able to use it to drive GPUs, or PCIe cards in general?
     
    Last edited: 29 Oct 2014
  4. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,062
    Likes Received:
    970
    Thunderbolt -> ExpressCard
    ExpressCard -> PCIe adapter
    Boot Camp to use Windows

    Far from the prettiest solution, but possible
     
  5. Hustler

    Hustler Minimodder

    Joined:
    8 Aug 2005
    Posts:
    1,039
    Likes Received:
    41
    3 graphics cards...don't know about that.

    I thought SLI and CrossFire really start to get quite troublesome once you go past two cards?
     
  6. Dogbert666

    Dogbert666 *Fewer Lover of bit-tech Administrator

    Joined:
    17 Jan 2010
    Posts:
    1,678
    Likes Received:
    181
    The situation has improved, but three cards are even more reliant on driver optimisations, and the scaling won't be as good as it is going from one card to two.
     
  7. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    That kind of got me thinking (painful): if the Alienware Graphics Amplifier is nothing more than a PSU and a case to house a graphics card, would it be possible to drop the need for a separate PSU by piggybacking a graphics card on the supply feeding a monitor?

    IDK what voltage your average LCD uses, or even if they all use the same voltage internally.
    I just have a picture in my head of a simple housing on the back of monitors that can have a graphics card installed.
     
  8. Guinevere

    Guinevere Mega Mom

    Joined:
    8 May 2010
    Posts:
    2,484
    Likes Received:
    176
  9. Guinevere

    Guinevere Mega Mom

    Joined:
    8 May 2010
    Posts:
    2,484
    Likes Received:
    176
    Any display's internal PSU will be big enough to drive the display and any integrated USB hub, but nothing as beefy as a GPU. It would theoretically be possible to run a display and a GPU off the same PSU, but it would be custom all the way. It would be easier (much easier!) to run two PSUs, and probably no less efficient if the GPU didn't always need to be on.
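
    To put some rough numbers on that, a back-of-the-envelope sketch in Python. The wattages are ballpark assumptions for a typical desktop display, not specs for any particular model; the GPU figure is a GTX 980's rated TDP:

    # Rough power-budget check: could a monitor's internal PSU also feed a GPU?
    # All figures below are assumed typical values, not measurements.
    DISPLAY_DRAW_W = 40   # typical 24" LCD under load (assumption)
    USB_HUB_W = 10        # a couple of powered USB ports (assumption)
    MONITOR_PSU_W = 60    # typical internal PSU rating (assumption)
    GPU_TDP_W = 165       # GTX 980 rated TDP, for comparison

    headroom = MONITOR_PSU_W - (DISPLAY_DRAW_W + USB_HUB_W)
    print(f"spare capacity: {headroom} W vs GPU demand: {GPU_TDP_W} W")
    # spare capacity: 10 W vs GPU demand: 165 W -- hence "custom all the way"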
     
    Last edited: 29 Oct 2014
  10. Icy EyeG

    Icy EyeG Controlled by Eyebrow Powers™

    Joined:
    23 Jul 2007
    Posts:
    517
    Likes Received:
    3
  11. xaser04

    xaser04 Ba Ba Ba BANANA!

    Joined:
    27 Jun 2008
    Posts:
    2,550
    Likes Received:
    467
    As I said when I saw this linked on Overclockers, it is a great idea in principle, but once again the actual execution is flawed.

    At least it is better than the MSI attempt, which required the laptop to be plugged in on top of the external GPU "box" (making an external monitor, keyboard and mouse essential).
     
  12. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,062
    Likes Received:
    970
    Nvidia has made massive advances in power efficiency with Maxwell; it probably wouldn't take too much crazy engineering to integrate a GTX 960 (which will hopefully come soon) into a monitor.
    Of course it would need to be easy to replace. I'm thinking along the lines of a simple flap that opens with a single screw, kind of like the one used to access the RAM slots in a laptop. To keep size and compatibility in check there would need to be a PCB size standard as well; I'd say a PCB size like this card has should work.

    The real trickery would then be in the software on the laptop, which would have to correctly identify and control that GPU, switching it on and off depending on load...
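
    Something like this minimal Python sketch is what I mean. nvidia-smi and its query flags are real; switch_render_device() is a made-up placeholder for a driver hook that doesn't exist today:

    import subprocess
    import time

    UTIL_THRESHOLD = 60  # % load above which the external GPU earns its keep (arbitrary)

    def gpu_utilisation():
        # Real tool, real flags: prints e.g. "42" for 42% load.
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"])
        return int(out.decode().split()[0])

    def switch_render_device(target):
        # Hypothetical driver hook -- this is the hard part nobody has shipped.
        print(f"would switch rendering to the {target} GPU")

    while True:
        if gpu_utilisation() > UTIL_THRESHOLD:
            switch_render_device("external")    # monitor/dock GPU
        else:
            switch_render_device("integrated")  # save power
        time.sleep(5)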
     
  13. fluxtatic

    fluxtatic What's a Dremel?

    Joined:
    25 Aug 2010
    Posts:
    507
    Likes Received:
    5
    Would you want to game on a laptop keyboard without a mouse?

    It doesn't much matter to me, as I have no interest in laptops anyway, but I have a hard time understanding why it's only in the past year or two that some manufacturers have tried to get this sort of thing together. I mean, if there's enough of a market out there for AMD and Nvidia to release video cards for $1500, surely there's a larger market that wants what the Graphics Amplifier is offering.

    Maybe AMD could pull it off if it were in a better financial position, being the only all-in-one company that might have an interest in doing this, and if it could get the OEMs on board.

    The interface would be the trick, and Nvidia and Intel might not play nicely enough together to make it happen. Intel doesn't really have a compelling interest in it, and Nvidia likely couldn't do it alone. Or maybe Thunderbolt is the way forward, if it ever comes to be that the interface and peripherals don't command a stupid premium.
     
  14. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I'm not sure you're looking far enough into the future fluxtatic.

    IMHO there will come a time when mobile phones and/or tablets are powerful enough for most people's needs; some already are. The problem is that mobile devices come with limitations such as smaller screens, limited power, and often no full-size K+M.

    What people want (IMHO) is the best of both worlds: a device that can be carried around with them and at the same time be supercharged by plugging it into a bigger screen with a full-size K+M and extra graphical processing power.

    People would no longer need a full-sized desktop PC; they could plug their mobile device into a system that gives them a bigger screen, a full-size K+M, and more processing power.
     
  15. xaser04

    xaser04 Ba Ba Ba BANANA!

    Joined:
    27 Jun 2008
    Posts:
    2,550
    Likes Received:
    467
    I just plug a mouse into the side of my gaming laptop if I want to game on it...
     
  16. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,062
    Likes Received:
    970
    Don't really see an alternative to using Thunderbolt (btw, version 3, with doubled bandwidth, is due to arrive with Intel's Skylake next year).

    What would be required is a Thunderbolt input on the monitor, then on the inside a Thunderbolt -> PCIe adapter, with an option to pass the signal coming from the laptop straight through to the monitor when the GPU in the PCIe slot is not in use; when it is in use, the monitor needs to get its signal from the DisplayPort on the GPU.

    Then the laptop needs to switch between its own GPU and the Thunderbolt-connected one automatically depending on needs, which could end up very tricky to achieve depending on how cooperative Intel, Nvidia and AMD would be with each other on the topic :D
     
  17. jon

    jon Chief Phrenologist

    Joined:
    26 Aug 2009
    Posts:
    163
    Likes Received:
    3
    Easier to just make a docking station (dare I say it, a *universal* docking station), complete with KVM connections, that also houses a GPU. The data connection needs enough bandwidth to handle the traffic between the CPU and the GPU, with the video then pumped out from the docking station to the monitor. That means the laptop needs that data connector (T-bolt 3?) as well. The graphics card can act like any other graphics card and output DVI/HDMI straight to the monitor.

    And while we're dreaming (might as well dream big, right?), put spare HDDs in there and make them an always-on feature (not the GPU, just the drives): the dock acts like a NAS when the laptop is unplugged, but the drives become second/third HDDs while the laptop is docked.
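
    A sketch of that drive behaviour in Python, purely to illustrate the idea. Everything here is hypothetical: no such dock API exists and all the names are invented:

    class DockStorage:
        # Hypothetical dock controller: NAS duty when the laptop is away,
        # plain local disks when it's docked.

        def __init__(self):
            self.mode = "nas"

        def on_laptop_docked(self):
            self.stop_network_shares()   # hand the disks over to the laptop
            self.mode = "direct-attached"

        def on_laptop_undocked(self):
            self.start_network_shares()  # e.g. SMB/NFS exports
            self.mode = "nas"

        def stop_network_shares(self):
            print("shares down; disks exposed over the dock link")

        def start_network_shares(self):
            print("shares up; dock behaves like a small NAS")

    dock = DockStorage()
    dock.on_laptop_docked()    # drives appear as second/third HDDs
    dock.on_laptop_undocked()  # back to NAS duty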

    What other features do we want? And which manufacturer is listening? Because this is where mobile and desktop merge.

    -J
     
  18. Teknokid

    Teknokid Minimodder

    Joined:
    6 Dec 2007
    Posts:
    1,095
    Likes Received:
    26
    @ Jon - why not just have a decent NAS?

    If speed is a concern, then with a decent NAS it shouldn't be. I get over 100MB/s from my Synology NAS, which is pretty close to, if not the same as, the speeds I get from drives mounted inside my PC...
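
    That figure lines up with what gigabit Ethernet can actually deliver; a quick sanity check in Python (the overhead factor is a rough assumption):

    # Why ~100MB/s is about as good as a gigabit-connected NAS gets.
    link_mbit = 1000               # gigabit Ethernet line rate
    theoretical = link_mbit / 8    # 125 MB/s before any overhead
    realistic = theoretical * 0.9  # rough allowance for TCP/SMB overhead (assumption)
    print(f"ceiling ~{theoretical:.0f} MB/s, realistic ~{realistic:.0f} MB/s")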
     
  19. Splynncryth

    Splynncryth 0x665E3FF6,0x46CC,...

    Joined:
    31 Dec 2002
    Posts:
    1,510
    Likes Received:
    18
    Thunderbolt is, in essence, a PCIe x4 link. If you could bond four of them together, you could get a x16 slot's worth of bandwidth, provided the silicon used to convert PCIe to TB would allow it.
    External GPU enclosures for TB have been demoed, but the reasons for not bringing a product to market range from hot-plug issues to lack of support on laptops (though this last one is starting to become less of an issue).

    I would not be surprised if the Graphics Amplifier is basically a Thunderbolt enclosure, and perhaps a clever hardware hacker can get it running on a generic Thunderbolt port.
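
    The arithmetic behind the bonding idea, in Python. The per-lane rate is the published PCIe 2.0 figure, which is roughly what Thunderbolt 2 tunnels:

    # PCIe 2.0: 5 GT/s per lane with 8b/10b encoding = 4 Gbit/s of data per lane.
    pcie2_lane_gbit = 4.0
    x4 = 4 * pcie2_lane_gbit    # 16 Gbit/s -- about one TB2 port (20 Gbit/s link)
    x16 = 16 * pcie2_lane_gbit  # 64 Gbit/s -- what four bonded ports would add up to
    print(f"PCIe 2.0 x4: {x4:.0f} Gbit/s, x16: {x16:.0f} Gbit/s")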
     
  20. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,062
    Likes Received:
    970
    No need for it: http://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/21.html

    PCIe x4 is sufficient (and TB will double its bandwidth next year, so it will be the equivalent of PCIe x8).
     