
With a 36-million-unit installed base of compatible hardware, a ~$150 lower starting cost (including the PS4 camera), and the ability for people without a headset to play on the main screen in the living room, it looks like Sony might clean up on the first generation of consumer VR.


This will certainly be a big step for VR democratisation. I'm surprised they're aiming for 120 Hz; that's really impressive. Can't wait to see what comes out of it.


That's an ~8 millisecond frame time! I've seen physics simulations that take longer, never mind graphics-intensive stages like post-processing (think depth of field, tonemapping, 'HDR'/bloom).

I'm skeptical about how this will affect the quality of upcoming VR games, since less time to render generally means a less photorealistic scene. Then again, maybe the immersion (presence) factor will make up for the drop in graphical quality?
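For concreteness, the per-frame budget at each refresh rate is simple arithmetic (a quick sketch in Python):

    for hz in (60, 90, 120):
        print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
    # 60 Hz -> 16.67 ms per frame
    # 90 Hz -> 11.11 ms per frame
    # 120 Hz -> 8.33 ms per frame

And in stereo VR, every one of those budgets has to cover rendering the scene for both eyes.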


Photorealism isn't going to happen in the first generation. Game developers are going to have to be creative in art direction and gameplay instead. And for games that do this well, you won't ever notice the drop in quality.

To me, that's one of the most fascinating things about games. Breakout hits make the right trade-offs to push the platform further than you thought it could go. That has held true ever since games were first put on computers.


At this point, photorealism isn't so much a question of hardware limitations as one of scene complexity and interactivity.

If you have a simple enough subject to start with, and bake enough lighting and shadow mapping into the textures, you can make a photorealistic scene on even previous-generation console hardware. Once you start adding more objects, and allowing more things to move, you start needing to calculate more in real time, and you need more processing and rendering power.

A room with a table and a teapot could probably be made photorealistic on a PS3, even at 90 FPS. A densely populated street with moving cars and pedestrians, however, would have to be fairly visually abstracted to hit that target even on a PS4.


Also, things that aren't games (say, interactive stores, presentations, etc.) can easily hit that frame rate because their scenes are simpler.


They compensate for this with reprojection. If the off-board hardware hasn't received a new frame within 8 milliseconds of a move event, it reprojects the existing frame, shifted by the delta of the move event. I assume the thinking is that you're mostly focusing on one spot, so the view ends up "moving" correctly even if the image as a whole isn't quite right (edges cut off or whatever).
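As a toy illustration of the idea (not Sony's actual implementation, which warps the frame on the GPU from the full head pose), reprojection for a small yaw amounts to sliding the last finished frame sideways by the pixel equivalent of the rotation; the exposed edge is where the image "isn't quite right". The function name and FOV value here are my own assumptions:

    import numpy as np

    def reproject_yaw(frame, yaw_delta_rad, horiz_fov_rad=1.8):  # ~103 deg FOV, assumed
        """Toy reprojection: shift the last frame sideways by the head-yaw delta."""
        h, w, c = frame.shape
        shift = int(round(yaw_delta_rad * (w / horiz_fov_rad)))  # radians -> pixels
        out = np.zeros_like(frame)               # the exposed edge stays black
        if shift >= 0:
            out[:, shift:] = frame[:, :w - shift]
        else:
            out[:, :shift] = frame[:, -shift:]
        return out

Because you're mostly looking at the center of the image, the shifted frame reads as correct motion even though no new frame was rendered.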


This is mostly correct. Games actually get a choice about how they want to render. Most games will choose to render at 60 fps; the game reports this to the OS, and the headset then knows to reproject between frames, so the game gets plenty of time to get the next actual frame ready.

Some games will do a native 120 fps, though I'm not sure who will pull that off voluntarily!


I believe reprojection is always active on the PSVR, whether you're rendering at 60, 90 or 120 fps.


Most pixel-shader post-processing doesn't work too well with VR to start with; current implementations will look a little freaky when viewed through two eyes, since they depend on shortcuts that are only convincing from a single camera perspective.


It's okay if the physics run at 30 fps; the key is that head movement is instantly reflected in a new frame.

It'll be interesting to see what quality the PS4 can support on this, but we already know good games are possible with simpler graphics.


So you're saying we can't tie physics to the frame rate?


You should never tie physics to the frame rate.
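The standard pattern is the fixed-timestep loop: physics always advances in constant increments via an accumulator, while rendering runs as fast as the display allows and interpolates between the last two simulation states. A minimal sketch (step_physics and render are placeholder names, not any engine's API):

    import time

    PHYSICS_DT = 1.0 / 60.0          # simulation step is constant, never tied to FPS

    def step_physics(dt):            # placeholder: integrate the simulation here
        pass

    def render(alpha):               # placeholder: draw, blending the last two
        pass                         # physics states by alpha for smoothness

    def game_loop(duration=1.0):
        accumulator = 0.0
        previous = time.perf_counter()
        end = previous + duration
        while time.perf_counter() < end:
            now = time.perf_counter()
            accumulator += now - previous
            previous = now
            while accumulator >= PHYSICS_DT:
                step_physics(PHYSICS_DT)      # same dt every step -> deterministic
                accumulator -= PHYSICS_DT
            render(accumulator / PHYSICS_DT)  # frame rate floats free of physics

    game_loop()

A game dropping frames then slows rendering, not the simulation, and a 120 Hz display simply means more render calls per physics step.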


> I'm skeptical about how this will affect the quality of upcoming VR games, since less time to render generally means a less photorealistic scene. Then again, maybe the immersion (presence) factor will make up for the drop in graphical quality?

For now, by definition, VR graphics (which mandate high FPS, high resolution, and rendering for each eye) are always going to be behind state-of-the-art graphics for non-VR games.

Personally, I find the idea of non-photorealistic VR environments much more interesting as well.


I think the smart move for VR right now is to set realism aside in favor of strongly art-directed, stylized content that plays to the platform's current strengths, and to worry about realism later.

I think films are a great example demonstrating that realism =/= immersion. Movies like The Lego Movie, The Incredibles, etc. can be just as immersive as live action. The secret sauce isn't realism; it's choosing the right style for the content, gameplay, and theme you're going for.


The refresh rate does not necessarily match 1:1 with the frame rate. You can display the same frame multiple times, and this is actually what Sony is doing.


I saw somewhere that this number is an apples-to-oranges comparison with the Rift and Vive. Trying to find a source.


120 Hz means smooth playback at 24, 30, and 60 fps (the three most common formats), because you can simply show each image for 5, 4, or 2 refreshes respectively. 60 Hz would mean potentially stuttery video at a native 24 fps (some movies).
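The reason is just divisibility: each source frame has to be held for a whole number of refreshes, which works out at 120 Hz for all three rates but fails at 60 Hz for 24 fps:

    for fps in (24, 30, 60):
        print(f"{fps} fps: hold each frame for {120 // fps} refreshes at 120 Hz; "
              f"60 % {fps} = {60 % fps}")
    # 24 fps: hold each frame for 5 refreshes at 120 Hz; 60 % 24 = 12  <- stutter at 60 Hz
    # 30 fps: hold each frame for 4 refreshes at 120 Hz; 60 % 30 = 0
    # 60 fps: hold each frame for 2 refreshes at 120 Hz; 60 % 60 = 0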


I believe it's interpolated from 60 Hz rather than the true 90 Hz that the Vive and Rift will be using.


Nope. That's one of the possible options, but not the only mode. The device can present at 90 or 120 Hz, and the rendering can be done at 60, 90, or 120 fps. Not sure of all the possible combos though (like, can you render at 60 and present at 120?)


In their slide decks they showed a 60 -> 120 mode based on frame doubling plus asynchronous timewarp (so head movement is partially taken into account). I don't think they actually allow you to run it below 90 Hz.


You can run at 90, but I think 60 -> 120 will be the common strategy, given the easier development and ultimately better refresh rate.


I think the word you're looking for is "adoption," not "democratization." Nobody is opening up VR to a vote.


Democratization of technology refers to the process by which technology rapidly becomes more accessible to more people.

https://en.wikipedia.org/wiki/Democratization_of_technology


That's a very strange use of the word democratization, a domain error really, because it's about markets, not politics. A better word would be consumerization. Associating democracy with a particular economic pattern is just a swindle to try to make the two inextricable. I suppose that's why Thomas Friedman was so fond of this abuse of language.


No, it's not about markets or any particular economic pattern; it's about the outcome, while being agnostic about the means. For example, it's also been said that the expansion of (free) public schooling has led to a democratization of knowledge.


: to make (something) available to all people : to make it possible for all people to understand (something)


"consumerization" just tells you that something is going into the market, not that it's becoming accessible to everyone. "Democratic" is not the best word but it's better than that.


You are correct, that was the word I was looking for. I think democratization works too ;)


The same way they cleaned up with Blu-ray. The latest PlayStations have become gateway drugs to Sony's technologies.


I actually think being tied to a PS4 is a disadvantage, not an advantage, and a large one at that. The install base of high-end PCs is larger, and upgrading a PC is easier to justify than buying a game console. And for most people, the living room isn't the right place; when you can't see your surroundings, you'd rather be somewhere private.


> The install base of high-end PCs is larger

I'm not sure that's true, although it depends on what you define as "high-end". Steam, as of last February, has 125m active users [0]. According to their hardware survey [1], only 30% of those users' GPUs support DirectX above version 9, with the overwhelming majority of "GPUs" being integrated and not up to the demands of 1080p gaming (a cursory glance suggests we can cut that 30% in half at least).

Given all that, you have ~15% of 125m for a maximum of ~19m "high-end" PCs, which is substantially smaller than the current number of PS4s.
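Spelling that estimate out with the survey numbers:

    steam_users = 125_000_000       # active Steam users, per [0]
    dx_above_9  = 0.30              # share of GPUs above DirectX 9, per [1]
    usable      = 0.50              # cursory cut: half of those handle 1080p gaming
    print(f"{steam_users * dx_above_9 * usable / 1e6:.1f}m high-end PCs")  # ~18.8m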

> and upgrading a PC is easier to justify than buying a game console

A GTX 970, the minimum required for the Oculus, is around $300 on Amazon. And that's not counting the extra CPU and RAM upgrades that are likely necessary (per [1]) to meet the demands of modern PC games.

There's also the fact that PC games tend to require more hardware upgrades as time goes on, while console games tend to use the hardware they have more efficiently. A $350 console can be expected to last 5-10 years without needing an upgrade. A PC needs upgrading every ~3 years to keep up with modern requirements.

I say all this as someone who exclusively games on PC, don't get me wrong. It's great but it's not the best financial decision for the mass market.

[0]: https://www.vg247.com/2015/02/24/steam-has-over-125-million-...

[1]: http://store.steampowered.com/hwsurvey


Where did you get that so many GPUs are integrated?

> A $350 console can be expected to last 5-10 years without needing an upgrade. A PC needs upgrading every ~3 years to keep up with modern requirements.

But will they be able to run games at similar quality and FPS? Many console games run at 30 fps.

http://www.extremetech.com/gaming/206070-the-witcher-3-cant-...


> Where did you get that so many GPUs are integrated?

The stats are broken down further here: http://store.steampowered.com/hwsurvey/videocard/

You can see that Intel integrated modules make up a sizable portion (~45% of DX11) of most sections, with much of the remainder after that being the kind of GPU built into the motherboard on low-end PCs.

And after that there are a lot of dated cards like the 9500 GT.

> But will they be able to run games at similar quality and FPS? Many console games run at 30 fps.

That's a good question. I think it's possible, though. Developers managed quite a lot on the PS3 and Xbox at 1080p and 30 fps. I think frame rate just wasn't a major concern for them at the time. Now that they have both new hardware and strong consumer interest in VR, I think they'll be able to manage it.


A $1000 PC will probably be able to run games at 1080p and ~60 FPS while the console runs at 30 FPS. You have to compare the same games, in which case you'll find that the FPS and resolution on consoles are lower. So the comparison is really between paying more for better resolution and/or FPS versus paying less for lower resolution and/or FPS.

Also, if you don't upgrade your console, you won't get to play newer games, whereas if you upgrade your PC every 3 years or so, you will, so that isn't a like-for-like comparison. You can always play your old games on your PC forever, without needing to upgrade anything.


Consoles are on a 6-year upgrade cycle; if you wait 10 years you're going to be skipping a generation. PlayStation launches: 1994, 2000, 2006, 2013.

That's still probably faster than a PC, as a $1000 gaming PC from 2012 would have 4 or 8 GB of RAM, a 1 TB HDD or a small SSD, and roughly a GTX 570 with a 2500K CPU. That can still play new games, but you're going to want an upgrade soonish.

Cheaper games, no subscription fees (Xbox Live), and the fact that you're going to want a PC anyway mean consoles don't really save you money.


You're right about the upgrade cycle being shorter than I thought.

> That's still probably faster than a PC, as a $1000 gaming PC from 2012 would have 4 or 8 GB of RAM, a 1 TB HDD or a small SSD, and roughly a GTX 570 with a 2500K CPU. That can still play new games, but you're going to want an upgrade soonish.

I don't think this is true. In 2013 [0], for $1000, you'd be looking at a GTX 660 and an i5 4430. Witcher 3, one of the largest PC gaming releases of 2015, required [1] a GTX 660 as a __minimum__, just 2 years after you'd have bought that PC. And this is a card that could handle Skyrim on Ultra settings [2] when you bought it.

To keep a PC up to date (playing new games at the quality settings you used when you bought it), you're looking at $1000 in initial cost, ~$150/year for the GPU, and $50-100/year for a CPU/RAM upgrade every few years.

So for 10 years of ownership, you're looking at $1000 for the initial purchase, three to five $250 GPU upgrades ($750-$1250), and say two major overhauls of CPU/RAM/motherboard/PSU ($500-$1000). Total cost of ownership is ~$2250-3250, depending on how long you stretch it.

For a console you're looking at, say, 3 console purchases (since you keep them longer, at least one is likely to need replacing at some stage), each at ~$400. Total cost of ownership is ~$1200.
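Putting those figures together:

    pc_low  = 1000 + 3 * 250 + 500      # build + 3 GPU upgrades + cheap overhauls = $2250
    pc_high = 1000 + 5 * 250 + 1000     # build + 5 GPU upgrades + pricey overhauls = $3250
    console = 3 * 400                   # ~3 consoles over the same stretch = $1200
    print(pc_low, pc_high, console)     # 2250 3250 1200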

> Cheaper games

Tom Clancy's The Division, the most recent major new release I'm aware of, is currently $60 on every platform [3]. That has been my experience with most major games. If you mean indie games, those are available on consoles too.

> no subscription fees (Xbox Live)

A concern exclusively for Xbox owners. This thread is about a PlayStation announcement.

> and you're going to want a PC anyway

Most people can (and do) get along perfectly fine with a cheap $300 laptop, which lasts them 2-5 years depending on the person.

I get where you're coming from, but I think you're biased by your personal preferences/situation/experiences. Gaming PCs just aren't more cost-effective than consoles unless you happen to need a beast of a PC for other purposes (like software development). They just don't make sense for most of the market.

[0]: https://web.archive.org/web/20130501090947/http://www.logica...

[1]: http://www.pcgamer.com/the-witcher-3-system-requirements-ann...

[2]: http://www.tomshardware.com/reviews/geforce-gtx-660-geforce-...

[3]: http://www.amazon.com/Tom-Clancys-Division-Xbox-One/dp/B00DD...


I'm not sure where you've got your numbers from. I bought my midrange desktop including monitor and peripherals for less than $900 three years ago and yet from a performance perspective it's still 30% faster than the PS4. Somehow that PS4 lasts 10 years but the PC has to be upgraded every year? I merely bought a new mouse and an SSD, but you could probably say the same about buying a controller for the console.

Ironically, I've seen people argue for consoles because they don't have a lot of money, but then proceed to buy the other consoles too, which means the price doesn't matter to them.

You're also forgetting that you can literally get the best consumer PC hardware on the planet for a mere $1700 (i7 6700K + 980 Ti + a nice case/mainboard/PSU + 512 GB SSD + 32 GB RAM).


> I bought my midrange desktop including monitor and peripherals for less than $900 three years ago and yet from a performance perspective it's still 30% faster than the PS4.

In raw computing power, perhaps, but consoles have an advantage over the PC because developers will optimize their games heavily and specifically for a console.

> Somehow that PS4 lasts 10 years but the PC has to be upgraded every year

Not sure where you're getting this from. I said that the PS4 would be replaced 3 times over 10 years (i.e. each console lasts ~3 years) and that the PC's GPU would be upgraded every 2-3 years, with other upgrades like CPU, RAM, etc. coming around every 4 years. I think that's fair, given that a midrange GPU from 2013 can barely meet the minimum requirements of some of the latest games.

> You're also forgetting that you can literally get the best consumer PC hardware on the planet for a mere $1700

$1700 is not a trivial amount of money. Sure you could get a beast of a PC for that price. You could also buy a PS4 and 23 new release games (or say 3 new release games for $60 and 30 older games for $40). I think selling the PC is pretty hard with those kinds of numbers.


If you check Amazon right now, Witcher 3 is $41 for PS4 and $36 on PC. Steam sales also let you buy an endless stream of AAA games for cheap.

Yes, Witcher 3 is demanding on an older PC. But plenty of people don't buy a new console the year it comes out either. PC gamers can always push off an upgrade and wait to play any overly demanding games.

PS: It's also a good idea to stay on the console upgrade cycle. A lot of PC games are cross-platform ports, so late in the cycle they run really well on older PCs.


I'm not exactly sure where you're getting the idea that a GTX 660 will barely run things. I'm on a 460 GT and in no rush to upgrade (granted, this is on a 22-inch 1680x1050 display, so the load is lighter). I upgraded everything else but the GPU recently, and it's only in extreme situations that I wish I had something beefier (think PlanetSide 2, a few hundred players having a slaughterfest).


> I'm not exactly sure where you're getting the idea that a GTX 660 will barely run things

I'm taking this from Witcher 3's minimum system requirements. I'm assuming that anyone investing in a gaming system wants to be able to play new releases as they come out. My point is that a GPU from 2013 is right on the edge of losing support from a game released in 2015, just 2 years after it launched. Meanwhile, that same game only just dropped support for consoles released around 2005.


But the console will have lower quality and lower FPS; we have to compare at the same FPS and resolution.


You're considering the launch date of the console, which is its best price/performance point, but that advantage quickly fades. A GTX 660 is now around $100. Most people will want some kind of PC anyway, and mainstream CPUs are good enough to handle almost any game, so all most people need for gaming is a GPU upgrade. Although this isn't possible with most laptops, external GPUs are already entering the market.

Console and PC gaming are converging. Microsoft is talking about quicker upgrade cycles because hardware is evolving rapidly. The new GPU generations this year will be a huge leap forward, and there's plenty more to come. They're also introducing cross-platform online play. By the way, PlayStation now requires a subscription fee for online play too; it was only the PS3 that allowed free online play.


Will most people want a PC? Phone and tablet sales have been eating into traditional PC sales for nearly a decade now, and the PCs people are still buying are mostly low-powered laptops that won't be able to power a Rift or Vive.


Sales slow because people already have a PC and hold off on upgrades.


Is the install base for high-end PCs (of the level needed to push an Oculus) really larger?

They've sold 36 million PS4s.

An Oculus needs an Nvidia GTX 970 or better graphics card (which goes for $300+ on its own). I highly doubt that 36 million+ people have purchased one of those (or would, especially on top of the $599 that the Rift itself costs).


On the other hand, a console is an easy gift to give regardless of your technical capabilities, and as a kid, it's easier to endlessly encourage your parents to get the right thing. Ask for a computer, and you may end up with a Chromebook!


You can give a desktop PC easily too.


Easily? Not really; there are more choices in the GPU alone than there are in consoles (PlayStation or Xbox?).


I'm saying it's not that hard to give away a desktop PC as a gift.



