bit-tech.net Forums

bit-tech.net Forums (http://forums.bit-tech.net/index.php)
-   Gaming (http://forums.bit-tech.net/forumdisplay.php?f=8)
-   -   Playstation 4 (http://forums.bit-tech.net/showthread.php?t=252653)

Jasio 9th Dec 2012 18:50

Playstation 4
 
So it seems that Sony is currently referring to its PS3 successor as 'Orbis' - they have not adopted the Playstation moniker yet (though maybe they won't).

The second-revision dev units have apparently been shipped out and are said to reflect the final hardware fairly closely. They are based around the AMD A10 APU - likely the 5800K - so it is reasonably fair to assume that the next PS will have a quad-core CPU clocked at ~3.8GHz, though I wouldn't be surprised if they scaled it back a bit for additional power savings.

What is more interesting is that the A10 has a Radeon 7660D built into the APU, which is fairly snazzy - and should have fairly good performance. The new standard for Sony is apparently 1920x1080/60p [or 120i 3D] and the 7660D should probably be able to handle that. I believe Anandtech tested the A10's GPU with Crysis just under 1080p and managed to get about 70fps. Considering Crysis is not optimized for the platform, it seems to be a fairly reasonable bit of hardware.

There are two flavors apparently kicking around - one with 8GB of DDR3 RAM and another with 16GB. Not sure if they plan to have 2 variants, or if it's just for internal testing to see whether they can get by with 8GB to save money.

Those familiar with the PS3, Wii + Wii U, and Xbox 360 will know that they use IBM PowerPC CPUs. These tend to run quite hot, sit on an older 40nm process, and are expensive.

The move to an x86 based platform by Sony would actually be an extremely smart move.

- Sony seems to have learned that expensive bespoke hardware is not the way to make money off a console.
- The A10-5800K SoC is cheap at retail - so in large batches AMD will benefit from sales and Sony will benefit from low costs.
- The concept of 'porting' might disappear as developers will only be optimizing for resolution, rather than PowerPC. This is actually good for PC gamers.
- Using fairly stock hardware we should see a cheaper retail price.
- Blu-Ray is still onboard; so optical media will stick around for awhile.
- WiFi + Wired is still onboard. Not sure about Bluetooth, but given backwards compatibility it'll likely still be around.
- Based off x86 hardware, packing an A10 SoC, and with plenty of RAM, we could see this developing into a very neat HTPC, as you might be able to throw Linux on there with XBMC, or even Windows 7/8.

Storage seems to still be HDD - 500GB. But that's generally the least important design decision- it could change at any time. Honestly seeing a 128GB or 256GB SSD variant would be better.

The final production revision is expected May-June 2013.

No pictures- Sony sends them out in regular desktop PC cases.

Parge 9th Dec 2012 19:07

Where are you getting the above info?

Jasio 9th Dec 2012 20:04

Quote:

Originally Posted by Parge (Post 3235975)
Where are you getting the above info?

http://oyster.ignimgs.com/wordpress/...oChristmas.jpg

law99 9th Dec 2012 20:10

Urrr... ?

Neogumbercules 9th Dec 2012 20:15

There's no denying the 5800k/7660D is a cheap, smart and efficient option for Sony. Also no doubt it's much more powerful than the GPU in the PS3, which I believe is some kind of cut-down nvidia 7800.

That said, it's not nearly as powerful as having a dedicated GPU in there. Last I saw, rumors were flying that the PS4 was packing a Sea Islands dedicated GPU, which would make it a formidable machine.

I just don't want to see them put in weak-ish hardware and have a situation 7 years from now where they stretch out the generation far longer than the hardware can cope with. Though I will admit that devs have done some amazing stuff with the hardware they're stuck with. Uncharted 3, God of War 3, Halo 4, etc.

GoodBytes 9th Dec 2012 20:27

Assuming it's all true, boys, say bye bye to Sony then.
The company is bleeding money like no tomorrow - worse than Nokia and AMD. They can't afford another $500 console. The problem with the PS3 is that while it is selling, people are buying it for the Blu-ray part, not the games, so Sony is losing a lot of money. The PS Vita isn't doing well either. It's way too pricey, and they still lose money on it. And people don't buy a lot of games for it, due to the lack of quality content. If Sony was smart, this is what they should have done:

1- Replicate the WiiU specs.
2- Every penny saved by not having the gamepad could be transferred into getting a better CPU, and hence a faster GPU (the GPU will automatically be much faster, as the console will have to render only 1 screen, not 2). Now they'd have a $350 console that beats the WiiU in all aspects in terms of performance and graphics. You are done. Plus, you can push the PS Vita fully as a gamepad alternative, to compete with Xbox SmartGlass and the WiiU

Microsoft is going all Kinect and Media Center on the next Xbox. So expect most of the money to be spent on PVR-like features and the Kinect system. The console will of course be better than the WiiU in terms of performance - like the PS4, even the one I suggest they do - as they can get better hardware 1-2 years from now, and they can also afford to lose money, as Microsoft makes a massive amount with their subscription online multiplayer service

Jasio 9th Dec 2012 20:32

Quote:

Originally Posted by Neogumbercules (Post 3236024)
There's no denying the 5800k/7660D is a cheap, smart and efficient option for Sony. Also no doubt it's much more powerful than the GPU in the PS3, which I believe is some kind of cut-down nvidia 7800.

That said, it's not nearly as powerful as having a dedicated GPU in there. Last I saw, rumors were flying that the PS4 was packing a Sea Islands dedicated GPU, which would make it a formidable machine.

I just don't want to see them put in weak-ish hardware and have a situation 7 years from now where they stretch out the generation far longer than the hardware can cope with. Though I will admit that devs have done some amazing stuff with the hardware they're stuck with. Uncharted 3, God of War 3, Halo 4, etc.

The APU has several benefits, and it is important to keep in mind that "HD" gaming is 1920x1080 - there is no need for overpowered GPUs designed to push multiple displays, or anything really beyond 1080p. You need a consistently performing platform. The A10 seems to fit the bill quite nicely.

There is the cost aspect. Sony learned the hard way. Seeing a dedicated GPU is unlikely.

Heat: a system with multiple components, rather than an SoC, also draws more power, which you need to dissipate somewhere. My 60GB PS3 - as much as I love it - is a giant, heavy, hot brick. Downsizing is smart in general.

Given what a console is generally expected to do, I am quite happy with the A10 being used. Most importantly it's x86.

Avoiding a dedicated GPU also means Sony can upgrade the "PS4" by merely swapping the A10 SoC out (since it just sits in a single socket) for something newer when AMD releases it - such as a die shrink, or even an improved GPU (as they are packaged together). The PS3 has never had its GPU upgraded; it has, however, undergone three die shrinks on the CPU.

Seeing 8GB on the base unit is a good sign. No more 256MB XDR memory as with the PS3- which when released was already well behind the curve.

All things taken into consideration, its hardware is on par with a current decent everyday gaming PC for non-extreme users.

Quote:

Originally Posted by GoodBytes (Post 3236033)
1- Replicate the WiiU specs.
2- Every penny saved by not having the gamepad could be transferred into getting a better CPU, and hence a faster GPU (the GPU will automatically be much faster, as the console will have to render only 1 screen, not 2). Now they'd have a $350 console that beats the WiiU in all aspects in terms of performance and graphics. You are done. Plus, you can push the PS Vita fully as a gamepad alternative, to compete with Xbox SmartGlass and the WiiU

You're kidding right? The Wii U is a terrible piece of hardware.

It's using a triple-core PowerPC-750 clocked at 550MHz, based off the GameCube; which coincidentally also used the PowerPC-750. It has a Radeon 4000/5000-based GPU clocked at 649MHz.

Backwards is not the way to go forwards. If you expect Sony to save money, going x86 and running away from PowerPC is the *best* thing to do. It means porting games no longer takes years, because it is the same platform as the PC. There is a lot of time and money saved in that.

Using an off-the-shelf APU that isn't an expensive, slow PowerPC is also very smart. It means Sony isn't wasting money on R&D and having to adapt their SDKs to a different architecture. Cost savings.

The entire premise of my initial post *is* about cost savings. The Wii-U's lethargic processor will never push the 1080/60p + 3D content Sony is aiming for from the start. It also has a longer development cycle due to porting.

faugusztin 9th Dec 2012 20:42

Quote:

Originally Posted by Jasio (Post 3235965)
What is more interesting is that the A10 has a Radeon 7660D built into the APU, which is fairly snazzy - and should have fairly good performance. The new standard for Sony is apparently 1920x1080/60p [or 120i 3D] and the 7660D should probably be able to handle that. I believe Anandtech tested the A10's GPU with Crysis just under 1080p and managed to get about 70fps. Considering Crysis is not optimized for the platform, it seems to be a fairly reasonable bit of hardware.

LOL. For starters, Anandtech didn't test the A10-5800K with the 7660D, but with a discrete graphics card.

Performance is much, much worse :
http://www.bit-tech.net/hardware/cpu...d-a10-review/7

With the 7660D at 1920x1080, we're talking about 9.4FPS in Crysis 2:
http://www.tomshardware.com/reviews/...ce,3304-7.html

Parge 9th Dec 2012 20:51

Quote:

Originally Posted by Jasio (Post 3236013)
[IMG]snip[/IMG]

I'm sorry, that image doesn't in any way answer my question, so everything you've said so far might as well be made up.

I heard it's using a 3930K, triple 690s and 32GB of RAM. Sure, less likely, but my source is as valid as yours.

rollo 9th Dec 2012 20:54

Wiiu specs are pretty awful.

Lack of a dedicated GPU will be a problem. You can't just upgrade the CPU in a PS4 when AMD releases something faster, as you will have issues with games not running on launch hardware and people being pissed off.

The best thing about a console for most is the ease of use: if it says PS3 and you have a PS3, you can be sure it will work. If you have 2 different versions of the PS4, issues will arise.

Jasio 9th Dec 2012 20:57

Quote:

Originally Posted by Parge (Post 3236050)
I'm sorry, that image doesn't in any way answer my question, so everything you've said so far might as well be made up.

I heard it's using a 3930K, triple 690s and 32GB of RAM. Sure, less likely, but my source is as valid as yours.

That's fine. The only difference is- your example is bull. :hehe:

GoodBytes 9th Dec 2012 20:59

Quote:

Originally Posted by Jasio (Post 3236036)
You're kidding right? The Wii U is a terrible piece of hardware.

No it isn't. Just because the launch titles were not amazing (due to being rushed out), and the developers don't know how to use the GPGPU abilities of the system (new to game consoles) or anything about optimizing content for it as it's brand new, doesn't mean the console has terrible hardware. Early Xbox 360 games looked like Xbox 1 games. It was terrible.

Quote:

It's using a triple-core PowerPC-750 clocked at 550MHz, based off the GameCube; which coincidentally also used the PowerPC-750. It has a Radeon 4000/5000-based GPU clocked at 649MHz.
Wrong. The WiiU uses a 1.2GHz triple-core CPU by IBM; while very similar to the Xbox 360 processor, it uses a modified architecture, and the processor is on the same die as the GPU, with direct communication between the two. This allows a massive performance increase - we saw this when Intel moved their GPU from the northbridge onto the CPU: just doing that nearly doubled its performance. The WiiU uses a Wii emulator to play Wii games. The Wii didn't need any emulator to play GameCube games, as its processor was identical to the GameCube one, just a much faster version of it.

The GPU is a powerful Radeon 5000-series part, DirectX 10 ready (though DirectX won't be used, as that needs a Windows OS; it has the OpenGL equivalent).

The GPU supports GPGPU (AMD's version of Nvidia CUDA), so developers can offload part or all of the physics and AI onto the GPU if CPU performance is needed. Seeing how no console games so far, and only a select few on the PC, even use this feature, it is normal that developers currently have a hard time using it, let alone know how to optimize for it.

Quote:

Backwards, is not the means to go forwards. If you expect Sony to save money, going x86 and running away from PowerPC is the *best* thing to do. It means porting games no longer takes years because it is the same platform as the PC. There is a lot of time and money saved in that.
Wrong again. Developers didn't initially support the PS3 as its market share was excessively low. Sony had to buy A LOT of exclusive titles, push for second-party titles, and do massive advertisement to increase its share. Most games are done on consoles then ported to PC. Also, most game engines already support multiple consoles, so porting a game costs under a million dollars - it's mostly testing and optimizing for the console.

x86 processors are really not efficient processors, and are very costly.

Quote:

Using an off-the-shelf APU that isn't an expensive, slow, PowerPC is also very smart. It means Sony isnt' wasting money on R&D and having to adapt their SDK's to a different architecture. Cost savings.
This is true.

Jasio 9th Dec 2012 21:02

Quote:

Originally Posted by faugusztin (Post 3236042)
LOL. For starters, Anandtech didn't test the A10-5800K with the 7660D, but with a discrete graphics card.

Performance is much, much worse :
http://www.bit-tech.net/hardware/cpu...d-a10-review/7

With the 7660D at 1920x1080, we're talking about 9.4FPS in Crysis 2:
http://www.tomshardware.com/reviews/...ce,3304-7.html

http://www.anandtech.com/show/6332/a...eview-part-1/2

Wrong as usual faug :)

Jasio 9th Dec 2012 21:11

Quote:

Originally Posted by GoodBytes (Post 3236060)
Wrong. The WiiU uses a 1.2GHz triple-core CPU by IBM; while very similar to the Xbox 360 processor, it uses a modified architecture, and the processor is on the same die as the GPU, with direct communication between the two. This allows a massive performance increase - we saw this when Intel moved their GPU from the northbridge onto the CPU: just doing that nearly doubled its performance. The WiiU uses a Wii emulator to play Wii games. The Wii didn't need any emulator to play GameCube games, as its processor was identical to the GameCube one, just a much faster version of it.
.

My mistake - it is a 1.2GHz triple core. It is PowerPC, and it only has one thread per core.

The processor is not on the same die as the GPU. It is on the same package, not the same die.

http://images.anandtech.com/reviews/...o/WiiU/MCM.jpg

Quote:

Originally Posted by GoodBytes (Post 3236060)
Wrong again. Developer didn't initially support the PS3 as market share was excessively low. Sony had to buy A LOT of exclusive titles, and push for second party titles, and massive advertisement to increase its share. Most games are done on consoles than ported to PC. Also, Most game engine are already multi-console support. So porting a game cost under a million dollars. It's mostly testing and optimizing to the console.

x86 processor are really not efficient processors, and very costly.

Nope. I don't see why you are trying to compare the PS3 to the hardware of the PS4. It has nothing to do with market share - the product isn't out yet. Don't use a previous-generation console to try and support a yet-unreleased console.

x86 efficiency is irrelevant, it always comes down to cost. IBM has less volume, on an older process. x86 is cheaper and mass produced in consumer hardware, I am not quite sure how you managed to get a higher cost out of a high volume, high yield, budget product like the A10 which sells for $115.

Mr Happy 9th Dec 2012 21:17

But can it run Crysis....................just kiddin. How much will they be asking for games for the next-gen consoles???

Unicorn 9th Dec 2012 21:37

Quote:

Originally Posted by Parge (Post 3235975)
Where are you getting the above info?

Yeah, I can see just how necessary it was to quote the entire OP (wall of text that it is) in the first reply to the thread. Totally sensible.

:rolleyes:

law99 9th Dec 2012 21:55

There is no way in hell they'd use the onboard 7660. They did sign some sort of agreement with AMD, iirc. This will surely only be for a GPU...

I really think, if they've learnt anything, it will be to have a more flexible memory architecture for simpler cross platform titles.

GoodBytes 9th Dec 2012 22:31

Quote:

Originally Posted by Jasio (Post 3236072)
The processor is not on the same die as the GPU. It is on the same package, not the same die.

Yes, my mistake - I meant package. In any case, it still has a direct link to it, and the two are incredibly close to each other. The closer things are, the better - hence why your system memory is so close to your CPU, and hence why processor manufacturers integrate more and more things into the processor. Before, on the PC side, a motherboard made some major performance difference; today, any motherboard will give you very similar performance, and motherboard manufacturers have much less in their hands to differentiate themselves.

Quote:

Nope. I don't see why you are trying to compare the PS3 to the hardware of the PS4. It has nothing to do with market share - the product isn't out yet. Don't use a previous-generation console to try and support a yet-unreleased console.
It's speculation - that is how it works. You don't know anything either. At least I use past experience, not magic.

Quote:

x86 efficiency is irrelevant, it always comes down to cost. IBM has less volume, on an older process. x86 is cheaper and mass produced in consumer hardware, I am not quite sure how you managed to get a higher cost out of a high volume, high yield, budget product like the A10 which sells for $115.
If the x86 architecture was so great, believe me, it would be used - ESPECIALLY by Nintendo. Nintendo is all about: how can we take inexpensive yet solid products (processors, sensors, etc.) and create something new and completely innovative, all at an affordable price? Nintendo is a toy manufacturer - always was and always will be. Their goal is to always innovate; not in the sense of creating something that never existed, but in using what exists to provide something that was not thought of... well, mostly. Sometimes they really innovate, and every time they do, it tends to change the gaming industry. It has never been about the most powerful console, and never will be.
If you ever get your hopes up for a powerful console from Nintendo, you'll always be disappointed - from the NES (Family Computer) to the WiiU. Yet Nintendo usually has a huge success, which, for some reason I don't get, tends to piss people off and make them start bashing things. I don't get it. Just buy 2 consoles, or don't. Why get angry and bash things? Anyway, this is off topic.

Microsoft tried the x86 approach. It was called the Xbox. It used a Pentium 3 at 733MHz and an Nvidia GPU. Why did Microsoft change? And consider that the original Xbox OS was based on Windows NT, which they had to heavily modify for the Xbox 360. You would think that would be more costly, but clearly it's not.

Jasio 9th Dec 2012 22:37

Quote:

Originally Posted by law99 (Post 3236098)
There is no way in hell they'd use the onboard 7660. They did sign some sort of agreement with amd iirc. This will surely only be for GPU...

I really think, if they've learnt anything, it will be to have a more flexible memory architecture for simpler cross platform titles.

There's two ways to think about this... and while both are valid, there are some points to keep in mind.

A) You argue there will be a dedicated GPU. From a technical epeen angle, this makes sense. The PS3 did set the bar quite high when it was released in terms of its sheer processing power (though it was gimped by memory). There was a fairly decent GPU in there too - so why wouldn't Sony follow this design decision again? Well, they can; there is nothing stopping them, and it could help the console's long-term viability, as dedicated GPUs are more powerful than integrated ones.

B) You argue that an integrated GPU is the way to go. Taking a page out of the Wii U's book, it is quite possible that this is a sign of things to come. The PS3 was a power hog, chucked out a lot of heat, and in its first iteration before the die shrink was heavy and took up a lot of space - the two key draws on power being the CPU and the GPU. Modern integrated GPUs, such as those found on the AMD A10s, are not all that bad. They do a decent job for HTPC and light/medium gaming use for someone who doesn't push multiple displays or insane resolutions. There is also the cost factor: the packaged SoC is cheaper, draws less power, puts out less heat, and takes up less space. Those are all very good things from both the manufacturer's cost perspective and the consumer's perspective. You could say that we live in times of "austerity", and Sony probably cannot handle another huge loss like the PS3. An 'all in one' package is probably the route all the big guys will go, despite what we might really dream of / want.

Both sides work - I really do want a stand-alone GPU, but I do not see that happening. The dev units point to that.

We are also ~6-8 months away from the final spec, and the PS4 might end up being packaged with the A10's replacement, Richland (Trinity's successor).

law99 10th Dec 2012 00:10

But older stuff was just less efficient - you can see the Slim as an example of power consumption falling. Also, it wasn't exactly the biggest complaint that needed addressing.

The biggest complaints with the PS3 would have been:
  1. The Cell chip - it put them on the back foot due to the requirement of new dev tools to support third-party partners
  2. Non-unified memory architecture - both the 360 and PS3 had 512MB of RAM. NUMA granted more performance but unintentionally hobbled the PlayStation; the 360 had a slower single bank which afforded easier development

I'd expect to see HD7950 performance, or HD6950 with as low power consumption as feasible. I wouldn't even be too surprised to see a return of the Cell. IIRC Ken Kutaragi (or whatever he was called) envisioned continuing down the Cell road, and I'd imagine the legwork is mainly done by now. Maybe we'll see the 4.2GHz we were promised.

Edit: The 360 in general had a more flexible graphics card too - less fixed-function... Nvidia joined that party when they launched the 8800. Someone with more expertise should explain that; it's all old misfirings in my brain regarding these consoles


All times are GMT +1.

Powered by: vBulletin Version 3
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.