Discussion in 'Article Discussion' started by Gareth Halfacree, 13 Nov 2014.
They would just post a "Why we haven't done a GameAAA-XXX review yet" article.
I think it was more the fact that you commented without offering an answer to the question - just came across as needlessly sarcastic. That old fave where text often gets taken the wrong way since you can't determine tone from it easily (probably why we do use the emotes). Never mind - let's forget it and move on.
So, to my question: are you saying that even if you do have the spare GPU horsepower, there is nothing you can turn up to make it better on a PC vs a console? I agree that would suck, and sales should suffer if it affects the game. Surely, though, there's a frame rate at which it becomes irrelevant how fast the frames are being churned out? I thought it was more of a benchmark, since the eye/brain can't react faster than a certain speed anyway, i.e. no-one hyperspeeds through games at 500fps just because their card can do it, as it would be unrealistic. I was under the impression that if your card could produce daft frame rates, it just meant it could also do lower ones at higher detail. Happy to be corrected, though...
True in normal circumstances, but my point was that if site A gets the copy a few days before release but can't publish their commentary until a few days AFTER release, then review site B is in a better position, having not signed up to the embargo.
Depends entirely on the game, and how long Site A has with it prior to launch. For example, let's say Site A gets the game a week before launch, and the embargo is set for two days post-launch. Site A, then, has up to nine days to play the game before the review goes live. If Site B wants to beat it to the punch, it has to go out and buy a copy on launch day and post the review one day post-launch. Site B, then, has a maximum of 24 hours to play the game before the review goes live. Note that this doesn't include any time to write the review itself, either!
Could you do a thorough review of, say, Gone Home in 24 hours? Sure. I completed it on my first run-through in two hours. Could you do a thorough review of Skyrim in 24 hours? Heck, no - that game has hundreds of hours of content.
'Course, the flip-side of this is that problems with some games - especially multiplayer games - may not make themselves known until after launch. If I'm playing, for argument's sake, a new World of Warcraft expansion a week before it goes live, I'll be running around the servers with a handful of fellow games reviewers and my experience will be a pleasant one with regard to wait times and lag. Post-launch, when eight million or so WoW fans get in on the action, wait times and lag are going to skyrocket - and a review that waits to see that happen will be more representative of the real-world experience than my pre-release look-see.
People don't do FPS.
By that I mean we have a constant stream of visual data entering the eye and being processed by the brain, so comparing how fast images are displayed to how fast we can take them in is comparing two things that are a little at odds with each other.
But our responses then have to be translated into frames when we use the mouse or keyboard - so I can see how 10fps would be bad, since we're trying to run/jump/aim faster than the GPU can translate our actions and we get frustrated. To use the other extreme, though, I don't see how a 200fps card benefits us, as we're not inputting that fast.
So if 30fps is fine for gameplay based on the CPC testing, how does 60fps or higher equal better gameplay? Our input shouldn't be any quicker at 60 than 30, surely?
Definitely understand the rest of your post, Gareth - out of interest though, with a game as big as Skyrim, how long after launch was your review posted? Do sites normally take the time to explore much of the game given no embargo, or do they try to get the exclusives?
I've never reviewed Skyrim, so the answer to your question would be "it still hasn't been published." Point of fact, I've never reviewed any game ever - I'm not a game reviewer.
(Oh, wait, I tell a lie. I wrote a review for a friend's site once, years ago. Hydrophobia, I think the game was. That was a freebie, though - I didn't get paid. The site's long-gone, too. Wonder if I have a copy of the review saved anywhere?)
EDIT: Wait, I did write a paid review of a game once - Portal 2. I'd completely forgotten. T'was for the late, lamented _thinq. Man, I miss that site. In that instance, I'd bought the game with my own funds and wrote the review after I'd completed it on one run through and found at least some of its hidden secrets. As I recall, the review was published a couple of days post-launch - but Portal 2 is, naturally, a far shorter game than Skyrim, so it doesn't really help with your question.
I meant the bit-tech review! Just assuming you had the stats really, but did a quick search and found this comment in the review: "We won't even pretend that we've completed Skyrim in the few days we’ve had with it"
Nice to know about your CV though ;-)
I know it says "bit-tech Staff" under my avatar, but that's a bare-faced lie: as my signature says, I'm a filthy freelancer. I do nowt but three news stories a day for this 'ere publication, and have no real insider knowledge. Well, aside from being able to see what t'others have scheduled in the shared calendar for the next week or so, anyway.
True, we are not inputting that fast, but we are responding to what we see when it happens; slower frame rates add to the latency queue.
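As a rough sketch of why frame rate feeds into that latency queue: the time between pressing a key and seeing the result includes at least a frame or two of render time, so halving the frame rate roughly doubles that component. The fixed overheads below (input polling, display response) are illustrative guesses, not measured figures - only the frame-time term is exact.

```python
# Back-of-the-envelope input-to-photon latency. The fixed costs are
# made-up placeholders; the frame-time component is 1000/fps exactly.
def frame_time_ms(fps):
    """Time to produce one frame, in milliseconds."""
    return 1000.0 / fps

def worst_case_latency_ms(fps, input_poll_ms=8, display_ms=5):
    # An input can land just after a frame starts, so it can wait up
    # to one full frame before being sampled, then one more frame to
    # be rendered and presented.
    return input_poll_ms + 2 * frame_time_ms(fps) + display_ms

for fps in (30, 60, 144):
    print(f"{fps:3d} fps: frame {frame_time_ms(fps):5.1f} ms, "
          f"worst-case latency ~{worst_case_latency_ms(fps):5.1f} ms")
```

Whatever the fixed costs really are, the gap between 30fps and 60fps is two whole frame times - about 33ms - which is the part you feel.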
30fps may be fine for gameplay based on the CPC testing, but I and others would disagree.
Ah, right - I think I gotcha. So basically, as siliconfanatic mentioned earlier, Ubi's play is bad because it's capping the PC frame rate too low to be considered a decent speed, whereas on the consoles it makes more sense because they put out less detailed graphics?
So is it still right to say that above a certain rate (say 60-90fps) the extra fps is better off being traded for more detailed output, since that's around the peak we can optimally perceive?
EDIT: Found this:
Pretty much spot on. 60-90 FPS is the optimal range.
But there's also the cushioning factor. If the frame rate randomly drops by 30 FPS and it's set to 60, that means it drops to 30 and you get a "choppy" picture. If it's at 80 or so, you're still within an acceptable range.
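The cushioning point works out numerically like this - a trivial sketch, taking the 30 FPS drop from the post above as a fixed hit:

```python
# If a heavy scene costs a flat 30 fps, the cap you start from
# decides whether you stay in the comfortable range.
def after_drop(cap_fps, drop_fps=30):
    """Frame rate left after a fixed-size dip, floored at zero."""
    return max(cap_fps - drop_fps, 0)

print(after_drop(60))  # 30 - visibly choppy
print(after_drop(80))  # 50 - still acceptable
```

In reality a heavy scene costs frame *time* rather than a flat fps figure, but the headroom argument is the same either way.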
There's also the bit where most movies play at 24 fps. From what I've read, the only reason we don't sense this is due to a massive amount of motion blur. Just try pausing on a single frame and it's readily apparent.
There's also the bit where they're gimping the resolution. Honestly, it's not so much the resolution that matters as the PPI - pixel density is what really counts.
If you have a 22in display, 4K is going to rock your socks off, whilst 144p is going to make your brain hurt.
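Pixel density is easy to work out from the resolution and the diagonal: PPI is the pixel diagonal divided by the physical diagonal. A quick sketch for the 22in example above (assuming 4K means 3840x2160 and 144p means a 16:9 256x144 image):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display with the given resolution and diagonal."""
    # math.hypot gives the pixel-count length of the screen diagonal.
    return math.hypot(width_px, height_px) / diagonal_in

# The same 22in panel at 4K versus 144p:
print(round(ppi(3840, 2160, 22)))  # ~200 PPI
print(round(ppi(256, 144, 22)))    # ~13 PPI
```

Roughly 200 PPI against 13 PPI on the same glass - hence the socks being rocked in one case and the brain hurting in the other.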
That pretty much cover the whole issue of graphics gimping?
Wait, look at Watch Dogs for a second. They gimped not just the FPS or the res, but the whole damn thing. Look at the demo vs the released gameplay.
Ubi's getting a longer and longer rap sheet of pulling stunts like this. Not going to take long before it comes back and bites them in the arse. Hard.
If you have a spare 10 minutes to waste, the following YouTube video by LevelCapGaming does a good job of explaining things.
The vast majority of monitors sold can't display more than 60 fps, so everything above 60 is a waste (unless you pay a premium on your monitor), as you can't see what's never displayed in the first place.
Another thing with FPS, which doesn't apply to all games, is that some games have things like input tied to the FPS. That means that if the game is running at 60 FPS, the game will register input within 1/60th of a second. If, however, the game is running at 30 FPS, then it takes twice the time to respond.
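A minimal sketch of what "input tied to the FPS" means in practice (the loop structure here is illustrative, not any particular engine): input is only sampled once per frame, so the frame time sets the input sampling interval directly.

```python
# Toy game loop where input is polled once per rendered frame, so
# the frame rate directly sets how stale a key press can get.
def input_window_ms(fps):
    """Worst-case delay before a press is even sampled, in ms."""
    return 1000.0 / fps

def run_frames(fps, frames=3):
    for frame in range(frames):
        # In a real engine: poll_input() -> update() -> render(),
        # all inside one 1000/fps millisecond slot.
        print(f"frame {frame}: input window {input_window_ms(fps):.1f} ms")

run_frames(60)  # input seen within ~16.7 ms
run_frames(30)  # input seen within ~33.3 ms - twice the wait
```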
If you want to be running at 60fps 99 per cent of the time, chances are you need to be capable of running higher than that for the majority of the time, to allow for the inevitable frame drops.
So it's not a waste if you want your minimum frame rate above 60. I prefer smoothness over fidelity.
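The minimum-versus-average point can be sketched with a handful of frame times (the numbers here are made up for illustration): an average above 60fps says nothing about the dips you actually feel.

```python
# Made-up frame times (ms) for a short capture: mostly fast frames
# with a couple of heavy ones thrown in.
frame_times_ms = [12, 13, 12, 14, 13, 33, 12, 13, 30, 12]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000 / max(frame_times_ms)  # the single worst frame

print(f"average: {avg_fps:.0f} fps")  # just above 60
print(f"minimum: {min_fps:.0f} fps")  # ~30 - the stutter you notice
```

This is why serious reviews quote minimum (or 99th-percentile) frame rates alongside the average.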
Yep, 60 minimum, not 60 average - prob should have mentioned that.