The bigger issue is that we're in a weird transition time when DDR3 is being phased out, so prices aren't going to get any lower via economies of scale, since production will only decline. DDR4, meanwhile, still has quite a premium on it and adoption will be slow since an X99 motherboard and compatible CPU are similarly expensive. It's going to be a while.
Still, you can get away with a weaker processor and less RAM, since games generally aren't constrained by either. The games that actually push system hardware, the Witchers, Metros, and Crysi of the industry, are few and far between, and there's a huge gap between them and everything else people actually play on their PCs. Honestly, one of the weaker systems we're talking about might be fine for multiplatform games, since the consoles are running tablet-class CPUs and have very little memory to allocate to games.
That said, once developers start really nailing optimization with their second or third game, you're going to need considerably more powerful hardware to match console performance. John Carmack once famously said that a PC needs roughly double the specs of a console to hit similar benchmarks. That's why I think there's still some validity to your original impressions, Vyk: if someone builds a cost-cutting PC, they open themselves up to being left behind or having to make several interdependent upgrades rather quickly. If you really want to surpass the consoles, maxing out games and going beyond 1080p or 60Hz (or both), you have to commit to a $1k+ machine that will hold up on its own for a few years, and then for a few years after that through easy upgrades and overclocking.
At least, that's what I did.