
I actually think being tied to a PS4 is a disadvantage, not an advantage, and a large one at that. The install base of high-end PCs is larger, and upgrading a PC is easier to justify than buying a game console. And for most people, the living room isn't the right place; when you can't see your surroundings, you'd rather be somewhere private.


> The install base of high-end PCs is larger

I'm not sure that's true, although it depends on what you define as "high-end". Steam, as of last February, has 125m active users [0]. According to their hardware survey [1], only 30% of those users' GPUs support DirectX above version 9, with the overwhelming majority of "GPUs" being integrated and not up to the demands of 1080p gaming (a cursory glance suggests we can cut that 30% in half at least).

Given all that, you have ~15% of 125m for a maximum of ~19m "high-end" PCs, which is substantially smaller than the current number of PS4s.
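That estimate is easy to sanity-check. Here is the arithmetic as a tiny script, using only the figures quoted above (the halving of the 30% is the comment's own rough cut, not survey data):

```python
# Rough sizing of the "high-end PC" install base from the figures above.
steam_users = 125_000_000        # active Steam users as of Feb 2015 [0]
dx_above_9 = 0.30                # share of surveyed GPUs above DirectX 9 [1]
high_end_share = dx_above_9 / 2  # cursory halving to drop integrated/weak GPUs

high_end_pcs = steam_users * high_end_share
print(f"{high_end_pcs / 1e6:.1f}m")  # ~18.8m, i.e. roughly 19m
```

Against ~36m PS4s sold, even this generous upper bound is about half the console's install base.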

> and upgrading a PC is easier to justify than buying a game console

A GTX 970, the minimum required for the Oculus Rift, is around $300 on Amazon. And that's before the extra CPU and RAM upgrades that are likely necessary (per [1]) to support the demands of modern PC games.

There's also the fact that PC games tend to require more and more hardware upgrades as time goes on, while console games tend to use the hardware they have more efficiently. A $350 console can be expected to last 5-10 years without needing an upgrade. A PC needs upgrading every ~3 years to keep up with modern requirements.

I say all this as someone who exclusively games on PC, don't get me wrong. It's great but it's not the best financial decision for the mass market.

[0]: https://www.vg247.com/2015/02/24/steam-has-over-125-million-...

[1]: http://store.steampowered.com/hwsurvey


Where did you get that so many GPUs are integrated?

> A $350 console can be expected to last 5-10 years without need for an upgrade. A PC needs upgrading every ~3 to keep up to date with modern requirements.

But will they be able to run games at similar quality and FPS? Many console games run at 30 fps.

http://www.extremetech.com/gaming/206070-the-witcher-3-cant-...


> Where did you get that so many GPUs are integrated?

The stats are broken down further here: http://store.steampowered.com/hwsurvey/videocard/

You can see that Intel integrated modules make up a sizable portion (~45% of DX11) of most sections, with much of the remainder after that being the kind of GPU you see built into the motherboard on low end PCs.

And after that there are a lot of dated cards like the 9500 GT.

> But will they be able to run games at similar quality and FPS? Many console games run at 30 fps.

That's a good question. I think it's possible, though. Developers managed quite a lot on the PS3 and Xbox 360 at 1080p and 30fps. I think framerate just wasn't a major concern for them at the time. Now that they have both new hardware and strong consumer interest in VR, I think they'll be able to manage it.


A $1000 PC will probably run games at 1080p at ~60 FPS where the console runs at 30 FPS. You have to compare the same games, in which case you'll find the consoles' FPS and resolution are lower. So the real comparison is paying more for better resolution and/or FPS versus paying less for lower resolution and/or FPS.

Also, if you don't upgrade your console, you won't get to play newer games, but if you upgrade your PC every 3 years or so, you'll be able to play newer games, so that isn't a correct comparison. You can always play your old games on your PC forever, without needing to upgrade anything.


Consoles are on a 6-year upgrade cycle; if you wait 10 years you're going to be skipping a generation. PlayStation releases: 1994, 2000, 2006, 2013.

That's still probably faster than PC, as a $1000 gaming PC from 2012 would have 4 or 8 GB of RAM, a 1TB HDD or a small SSD, a ~GTX 570, and an i5-2500K CPU. That can still play new games, but you're going to want an upgrade soonish.

Cheaper games, no subscription fees (Xbox Live), and the fact that you're going to want a PC anyway mean consoles don't really save you money.


You're right about the upgrade cycle being shorter than I thought.

> That's still probably faster than PC, as a $1000 gaming PC from 2012 would have 4 or 8 GB of RAM, a 1TB HDD or a small SSD, a ~GTX 570, and an i5-2500K CPU. That can still play new games, but you're going to want an upgrade soonish.

I don't think this is true. In 2013 [0], for $1000, you'd be looking at a GTX 660 and an i5 4430. Witcher 3, one of the largest PC gaming releases of 2015, required [1] a GTX 660 as a *minimum*, just 2 years after you'd have bought that PC. And this is a card that could handle Skyrim on Ultra settings [2] when you bought it.

To keep a PC up to date (playing new games at the quality settings you used when you bought it) you're looking at $1000 initial cost, ~$150/yr for the GPU and $50-100/year for a CPU/RAM upgrade every few years.

So for 10 years of ownership, you're looking at $1000 for the initial purchase, 3-5 $250 GPU upgrades ($750-$1250) and say two major overhauls of CPU/RAM/Motherboard/PSU ($500-$1000). Total cost of ownership is ~$2250-3250, depending how long you stretch it.

For a console you're looking at say 3 console purchases (since you keep them longer, at least one is likely to need replacing at some stage) each at ~$400. Total cost of ownership is ~$1200.
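The totals above follow directly from the per-item estimates, which can be laid out as a quick back-of-envelope script (all numbers are this comment's estimates, not measured prices):

```python
# 10-year total cost of ownership, using the estimates above.
pc_initial = 1000
pc_gpu_low, pc_gpu_high = 3 * 250, 5 * 250  # 3-5 GPU upgrades at ~$250 each
pc_overhaul_low, pc_overhaul_high = 500, 1000  # two CPU/RAM/mobo/PSU overhauls

pc_low = pc_initial + pc_gpu_low + pc_overhaul_low
pc_high = pc_initial + pc_gpu_high + pc_overhaul_high
console_total = 3 * 400  # three ~$400 console purchases over the decade

print(f"PC:      ${pc_low}-${pc_high}")  # $2250-$3250
print(f"Console: ${console_total}")      # $1200
```

Even at the low end of the PC range, the console comes out roughly $1000 ahead over the decade.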

> Cheaper games

Tom Clancy's The Division, the most recent major new release I'm aware of, is currently $60 on every platform [3]. This has been my experience with most major games. If you're thinking of indie games, those are available on consoles too.

> no subscription fees (X box live)

A concern exclusively for Xbox owners. This thread is regarding a Playstation announcement.

> and your going to want a PC anyway

Most people can (and do) get along perfectly fine with a cheap $300 laptop, which lasts them 2-5 years depending on the person.

I get where you're coming from but I think you're biased by your personal preferences/situation/experiences. Gaming PCs just aren't more cost effective than consoles unless you happen to need a beast of a PC for other purposes (like software development). They just don't make sense for most of the market.

[0]: https://web.archive.org/web/20130501090947/http://www.logica...

[1]: http://www.pcgamer.com/the-witcher-3-system-requirements-ann...

[2]: http://www.tomshardware.com/reviews/geforce-gtx-660-geforce-...

[3]: http://www.amazon.com/Tom-Clancys-Division-Xbox-One/dp/B00DD...


I'm not sure where you've got your numbers from. I bought my midrange desktop including monitor and peripherals for less than $900 three years ago, and yet from a performance perspective it's still 30% faster than the PS4. Somehow that PS4 lasts 10 years but the PC has to be upgraded every year? I merely bought a new mouse and an SSD, but you could say the same about buying a controller for the console.

Ironically, I've seen people argue for consoles because they don't have a lot of money, but then proceed to buy the other consoles as well, which suggests the price doesn't matter to them.

You are also forgetting the fact that you can literally get the best consumer PC hardware on the planet for a mere $1700 (i7 6700k + 980 Ti + nice case/mainboard/psu + 512GB SSD + 32GB RAM).


> I bought my midrange desktop including monitor and peripherals for less than $900 three years ago and yet from a performance perspective it's still 30% faster than the PS4.

In raw computing power perhaps but consoles have an advantage over the PC due to the fact that developers will optimize their games heavily and specifically for a console.

> Somehow that PS4 lasts 10 years but the PC has to be upgraded every year

Not sure where you're getting this from. I said that the PS4 would be upgraded 3 times over 10 years (i.e. the console lasts 3 years) and that the PC's GPU would be upgraded every 2-3 years, with other upgrades like CPU, RAM etc. coming around every 4 years. I think that's fair given that a mid range GPU from 2013 can barely meet the minimum requirements of some of the latest games.

> You are also forgetting the fact that you can literally get the best consumer PC hardware on the planet for a mere $1700

$1700 is not a trivial amount of money. Sure you could get a beast of a PC for that price. You could also buy a PS4 and 23 new release games (or say 3 new release games for $60 and 30 older games for $40). I think selling the PC is pretty hard with those kinds of numbers.


If you check Amazon right now, Witcher 3 is $41 for PS4 and $36 on PC. Steam sales also allow you to buy an endless stream of AAA games for cheap.

Yes, Witcher 3 is demanding for your older PC. But, plenty of people don't buy a new console the year it comes out either. PC gamers can always push off an upgrade and wait to play any overly demanding game(s).

PS: It's also a good idea to stay on the console upgrade cycle. A lot of PC games are cross-platform ports, so late in the cycle they run really well on older PCs.


I'm not exactly sure where you're getting that a GTX 660 will barely run stuff. I'm on a GTX 460 and in no rush to upgrade (granted, this is on a 22-inch 1680x1050 display, so the load is lighter). I upgraded everything else but the GPU recently, and it's only in extreme situations that I wish I had something beefier (think Planetside 2, a few hundred players having a slaughterfest).


> I'm not exactly sure where you're getting that a GTX 660 will barely run stuff

I'm taking this from Witcher 3's minimum system requirements. I'm assuming that anyone investing in a gaming system wishes to be able to play new releases as they come out. My point is that a GPU from 2013 is right at the minimum bar for a game released in 2015, just 2 years after it launched. Meanwhile, this game only just dropped support for consoles released around 2005.


But the console will have lower quality and lower FPS; we have to compare at the same FPS and resolution.


You're considering the launch date of the console, which is its best price/performance time, but that quickly goes down. A GTX 660 is now around $100. Most people will want some kind of PC anyway, and mainstream CPUs are good enough to handle almost any game. All that most people need for gaming is a GPU update. Although this isn't possible with most laptops, external GPUs are already entering the market.

Console and PC gaming are converging. Microsoft is talking about quicker upgrades because the power of hardware is rapidly evolving. The new GPU generations this year will be a huge leap forward, and there's plenty more to come. They're also introducing cross-platform online play. By the way, PlayStation requires a subscription fee for that too now. It was only the PS3 that allowed free online play.


Will most people want a PC? Phone and tablet sales have been eating into traditional PC sales for nearly a decade now. And the PCs people have still been buying are mostly low powered laptops that won't be able to power a Rift or Vive.


Sales slow because people already have a PC and hold off on upgrades.


Is the install base for high-end PCs (of the level needed to push an Oculus) really larger?

They've sold 36 million PS4s.

An Oculus needs an Nvidia GTX 970 or better graphics card (which goes for $300+ on its own). I highly doubt that 36 million+ people have purchased one of those (or would, especially on top of the $599 that the Rift itself costs).


On the other hand, a console is an easy gift to give, regardless of your technical capabilities, and as a kid it's easier to pester your parents endlessly to get the right thing. Ask for a computer, and you may end up with a Chromebook!


You can give a desktop PC easily too.


Easily? Not really: there are more choices in the GPU alone than there are choices of console (PlayStation or Xbox?).


I'm saying it's not that hard to give away a desktop PC as a gift.



