http://www.gamespot.com/news/6165103...ort=1#comments
Sony and it's PS3 FINALLY decided to do something right for a change.
Worst apostrophe-error ever :p.
And well, sucks for Sony. Good for customers.
Attack the weak point for massive damage.
Maybe I'll be able to afford MGS4 now.
I don't see how they could... PS3s are expensive to make, they were giving us a deal as is. It's $800-something to make one PS3 and they were selling them for $500-$600. That's a $200-$300 loss on every console. We got lucky.
Maybe when it hits $400 I'll consider it.
It'd need to be a price rockslide for me to even consider buying a PS3 right now. Or maybe an avalanche. One of those things.
Care to quote the article? Can't read it at work.
Quote:
Originally Posted by Emma Boyes
Because Sony obviously were never ever going to think about dropping the price.
I wouldn't get too excited. It might happen soon, it might not. The article doesn't say.
I think it's too soon for them to drop it. Despite how desperate it would look, they would have the ignorant fanbase of people who bought it for $600 just two months ago to deal with. I don't have a problem with them dropping it out of desperation (Sony should be used to the near-constant stream of bad press by now), but I think there would definitely be some angry early adopters.
Also, I wonder how much money they make off of the software? I mean, if Sony is losing $400 just by selling the PS3, they gotta sell a *lot* of games to one person before they break even on the cost. I'd be interested to know how much any console-maker makes off of retail games--1st or 3rd party.
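(A rough back-of-the-envelope sketch: the $400 loss figure is the one claimed in this thread, and the per-game royalty is a made-up assumption, since platform holders don't publish those numbers.)
Code:
# Rough break-even estimate: how many games would need to be sold to one
# PS3 owner before the hardware loss is recouped.
# The loss per console is the figure claimed in this thread; the royalty
# per game is an illustrative assumption, not a published number.

loss_per_console = 400   # claimed loss on each PS3 sold (USD)
royalty_per_game = 10    # assumed platform fee per $60 retail game (USD)

games_to_break_even = loss_per_console / royalty_per_game
print(f"Games needed per console: {games_to_break_even:.0f}")  # -> 40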
Was that question even necessary? The 360 is an old name by now. But I'm getting this when I get the cash for it ...
http://www.dell.com/content/products...n&s=dhs&~ck=mn
Four Nvidia 8800s and 4GB DDR4 RAM... need I go on ;)
Graphics
* 256MB nVidia GeForce 7900 GS, Single and Dual
* 512MB nVidia GeForce 7900 GTX
* Dual ATI Radeon X1950 XTX 512MB CrossFire
* Single 1GB NVIDIA GeForce 7950 GX2 Dual-GPU Graphics Card
* Dual 1GB NVIDIA GeForce 7950 GX2 Dual-GPU Graphics Cards, Quad SLI
Quad SLI doesn't mean four graphics cards, and I can't see them mention an 8800 anywhere.
Can it run MGS4?
And what will you get out of that, except a flat wallet? Your games would be rendered at a higher FPS than your 75 Hz LCD monitor could display, and would be capable of resolutions your monitor could never handle. Pointless waste of money imo. If you're thinking of the "games of tomorrow", with their increasingly insane requirements, buy a new graphics card when they actually come out, and save some bucks.
Also, can it run MGS4?
If it came out on PC, hell yeah! It runs at 6 teraflops.
Does it come for the PC?
How much is one teraflop?
No, but I can live without a PS3 for a while. I still have tons of fun on my PS2.
From Wikipedia...
FLOPS, GPUs, and game consoles
Very high FLOPS figures are often quoted for inexpensive computer video cards and game consoles.
For example, the Xbox 360 has been announced as having total floating point performance of around one TFLOPS, while the PlayStation 3 has been announced as having a theoretical 2.18 TFLOPS. By comparison, a common AMD A64 or Intel Pentium 4 general-purpose PC would have a FLOPS rating of around ten GFLOPS, if the performance of its CPU alone was considered. The 1 TFLOPS for the Xbox 360 or 2 TFLOPS for the PlayStation 3 ratings that were sometimes mentioned regarding the consoles would even appear to class them as supercomputers.
These FLOPS figures should be treated with caution, as they are often the product of marketing. The game console figures are often based on total system performance (CPU + GPU). In the extreme case, the TFLOPS figure is primarily derived from the function of the single-purpose texture filtering unit of the GPU. This piece of logic is tasked with doing a weighted average of sometimes hundreds of pixels in a texture during a look-up (particularly when performing a quadrilinear anisotropically filtered fetch from a 3D texture). However, single-purpose hardware can never be included in an honest FLOPS figure.
Still, the programmable pixel pipelines of modern GPUs are capable of a theoretical peak performance that is an order of magnitude higher than a CPU's. An NVIDIA 7800 GTX 512 is capable of around 200 GFLOPS, and the current (11/06) NVIDIA 8800 GTX is capable of sustaining 330 GFLOPS. ATI's latest X1900 architecture (2/06) has a claimed performance of 554 GFLOPS[1]. This is possible because 3D graphics operations are a classic example of a highly parallelizable problem which can easily be split between different execution units and pipelines, allowing a high speed gain to be obtained from scaling the number of logic gates while taking advantage of the fact that the cost-efficiency sweet spot of (number of transistors)*frequency currently lies at around 500 MHz. This has to do with the imperfection rate in the manufacturing process, which rises exponentially with frequency. The NVIDIA Quad SLI setup with two dual-GPU GeForce 7950 GX2 cards (4 GPUs in total) claims to have up to 6 TFLOPS of computing power.
While CPUs dedicate a few transistors to run at very high frequency in order to process a single thread of execution very quickly, GPUs pack a great deal more transistors running at a low speed because they are designed to simultaneously process a large number of pixels with no requirement that each pixel be completed quickly. Moreover, GPUs are not designed to perform branch operations (IF statements which determine what will be executed based on the value of a piece of data) well. The circuits for this, in particular the circuits for predicting how a program will branch in order to ready data for it, consume an inordinate number of transistors on a CPU that could otherwise be used for FLOPS. Lastly, CPUs access data more unpredictably. This requires them to include an amount of on-chip memory called a cache for quick random access. This cache represents the majority of CPU transistors.
General purpose computing on GPUs is an emerging field which hopes to utilize the vast advantage in raw FLOPS, as well as memory bandwidth, of modern video cards. As an example, occlusion testing in games is often done by rasterizing a piece of geometry and detecting the number of pixels changed in the z buffer, a highly non-optimal technique considering floating point operations. A few applications can even take advantage of the texture fetch unit in computing averages in (1, 2, or 3 dimensional) sorted data for a further boost in performance.
In January 2006, ATI Technologies launched a graphics sub-system that put in excess of 1 TFLOPS within the reach of most home users (CrossFire X1900 XTXs). To give this achievement perspective, you need to consider that less than 9 years earlier, the US Department of Energy commissioned the world's first TFLOPS supercomputer, ASCI Red, consisting of more than 9,200 Pentium II chips. The original incarnation of this machine used Intel Pentium Pro processors, each clocked at 200 MHz. These were later upgraded to Pentium II OverDrive processors.
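(As a rough aside on how these peak figures are derived, here's a minimal sketch. The unit count and clock below are illustrative assumptions, not the actual specs of any hardware mentioned above. It also answers the earlier "how much is one teraflop" question: 10^12 floating-point operations per second.)
Code:
# Minimal sketch of how a theoretical peak FLOPS figure is usually derived:
# peak = execution units * clock rate * floating-point ops per unit per cycle.
# The numbers below are illustrative assumptions, not real hardware specs.

def peak_flops(units, clock_hz, flops_per_unit_per_cycle):
    """Theoretical peak, assuming every unit issues its maximum every cycle."""
    return units * clock_hz * flops_per_unit_per_cycle

# Hypothetical GPU: 128 units at 500 MHz, 4 FLOPs each per cycle.
example = peak_flops(units=128, clock_hz=500e6, flops_per_unit_per_cycle=4)
print(f"{example / 1e12:.2f} TFLOPS")  # -> 0.26 TFLOPS

# One teraflop = 10**12 floating-point operations per second, so a "6 TFLOPS"
# system is claiming six trillion such operations every second, in theory.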
But you can't live without that PC? :p. I really don't see what you need it for, except playing *some new game* at a resolution of 4000x4000, which no monitors today can do :p.
Would today's games even be able to fully utilise 4 graphics cards?
Um, no. How is that like saying "can PS2 games ever run perfectly on PS3"?
First: more and more PS2 games are running without problems on the PS3.
Second: Old games can run perfectly on that Dell PC even without wasting a gazillion on 4 bleeding-edge graphics cards. Your monitor is still what is going to be the bottleneck :p.
Actually, that page says nothing about what kind of monitor comes with it. If it's a 19", I doubt it's going to be much more than 1280x1024, 1600x1200 if you're lucky. Furthermore, I doubt it's going to have much more than a 75 Hz refresh rate, meaning the framerate is physically limited to 75 by your monitor, regardless of anything inside the actual computer.
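(Rough illustration of that refresh-rate cap, assuming the 75 Hz figure above: with vsync on, anything rendered beyond the refresh rate simply never reaches the screen.)
Code:
# With vsync on, the display shows at most one new frame per refresh,
# so frames rendered beyond the refresh rate are never seen.
# 75 Hz is the refresh rate assumed in the post above; the GPU figure
# is just an example.

refresh_hz = 75
gpu_fps = 300  # whatever the hardware can push

visible_fps = min(gpu_fps, refresh_hz)
wasted_frames_per_second = max(0, gpu_fps - refresh_hz)

print(f"Frames actually displayed per second: {visible_fps}")          # -> 75
print(f"Frames rendered but never shown: {wasted_frames_per_second}")  # -> 225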
I guess that's good. We in the UK won't see a price drop for some time though, smurfing delays...
I heard rumours they had one planned for us sometime in 2011.