Discussion in 'Article Discussion' started by bit-tech, 12 Feb 2018.
That’s exactly what I’m waiting for... a new laptop/ultrabook.
Aye, I'd be very interested to see how the 35W parts fare - still quite thirsty for a laptop, but let's hope the performance per watt is worth it.
I use my PC for DAW work, and DAWs have quite a comprehensive GUI. Your old hardware mixing console is replaced with a virtual one, with hundreds of dials and faders. It's a pain in the b... to work with if there is input lag or stuttering when you turn these virtual knobs. And in the editing phase it's even worse, because a tool you use constantly is the zoom tool. If you have, say, 32 tracks of audio, all represented graphically, and the GPU/iGPU is not up to the task, zooming in and out is slooow, stuttering and choppy.
Could you add Cubase or a similar application to the benchmark suite, to see how APUs fare in that regard?
Or is there an existing test of yours that would be equivalent?
I have an i7 3770 now - any idea how the CPU part of the 2400G performs compared to that?
Performance-wise it still trucks along, especially recording TV, Plex and gaming with the 750 Ti currently in the system. It feels like processor performance has stagnated while power draw has been the big area of development. From what I can gather, an i3 5xxx was starting to become faster than the Phenom II 955 in multithreading, but certainly not worth the platform costs (DDR3/4, motherboard and processor). With second-hand parts you could make the argument, but why change when the Phenom II still works, and works well? SATA 3 is the only thing I can think of that is really needed platform-wise.
Great progress but I just want a little more!
Ideal sweet spot would be around 1050 Ti performance at 1080p - then I'll be telling AMD to take my money.
Imagine that level of performance in an STX package!
Speaking of more performance, is hybrid crossfire still a thing or has it died/been left to developers to implement?
Then you'll pretty much need to wait for the Intel + Vega collab chip: since that includes a stack of HBM2, the GPU doesn't have to rely on system RAM. Pretty safe to assume it will find its way into products like the Zotac Zbox, Gigabyte Brix and such.
1050 Ti is ~100W itself. I guess if they wanted to make a 140W APU...
I've been wondering about that also - how do these chips fare when used with DX12 or Vulkan's explicit multi-GPU support and a low-end dGPU in something like AotS?
I thought the 1050 Ti was 75W, hence why you generally don't need additional power on the card?
I appreciate it's only 25W less, but that still doesn't stop me lusting after my holy grail of a small, cheap, quiet box that does OK at 1080p.
I think that's the vanilla 1050, non-Ti
According to Nvidia, Ti is 75W
Also, only two game tests in an APU review? Why not test the type of games these chips are actually aimed at (CS:GO, WoW, Fortnite)?
Bought a 2200G and an MSI B350M Mortar motherboard. Of course it arrived with a BIOS from March 2017 (version 1.1). Luckily I had an R7 1700 system, so I could pop in that CPU to flash it, but this must be super annoying for someone without that option.
There you go... Roman’s delid is up on YouTube.
I told you it wouldn’t be long.
Other sites are speculating that the lower performance with a discrete card is down to the x8 PCIe interface. When testing with a standard Ryzen clocked the same, the GPUs perform faster, indicating the cut from x16 to x8 could be hurting discrete performance.
The comments in the initial pages of the article mention it as unlikely, the testing shows it may be happening, but there is no comment on the point in the conclusion. What is your final opinion on the idea?
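For rough context on what the x16-to-x8 cut actually costs in raw bus bandwidth, here's a back-of-envelope sketch (assuming a PCIe 3.0 link with 128b/130b encoding - theoretical one-way numbers, not a measurement of any real game workload):

```python
# Back-of-envelope PCIe 3.0 bandwidth: 8 GT/s per lane, 128b/130b encoding.
GT_PER_LANE = 8e9          # transfers per second, per lane
ENCODING = 128 / 130       # usable payload fraction after line encoding

def bandwidth_gbs(lanes):
    """Theoretical one-way bandwidth in GB/s for a PCIe 3.0 link."""
    return GT_PER_LANE * ENCODING * lanes / 8 / 1e9  # bits -> bytes -> GB

print(f"x8:  {bandwidth_gbs(8):.2f} GB/s")   # ~7.88 GB/s
print(f"x16: {bandwidth_gbs(16):.2f} GB/s")  # ~15.75 GB/s
```

So x8 halves the ceiling to roughly 7.9 GB/s; whether that shows up in frame rates depends on how much a given game streams over the bus each frame, which is why the effect varies between titles.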
There isn't much of an impact: