I'm doubtful, to be honest. Crytek will need to seriously overhaul how the game is built if it's going to look as good and run better on lower-end systems, and big performance gains in an engine like that don't come easy. Even the HD4870X2 can only max the game smoothly at 1920x1200, never mind 2560x1600, and that's without AA. Meanwhile, other similarly good-looking games like Episode Two and Call of Duty 4 are pushing nearly three-figure frame rates at 2560x1600 with AA on the same hardware. The game's a massive system hog.
That only further supports my point that Crysis is not realistic on integrated graphics. You already need a monster system to run it on all High, let alone all Very High. I can see all Medium at 1024 res with an X1950 Pro or a 7900GS, but there's really no way integrated could do it, especially anywhere near 40 FPS. Integrated just doesn't have the flexibility or power of a discrete card.
It doesn't matter how many cores you have... yes, you may be able to run the graphics processing through your CPU, but it still isn't enough...
Lmao - software rendering is even slower than integrated graphics - the 3DMark06 CPU test should be proof enough of that.
0.7 and 1.3 for the two tests for me. I know someone who got around 3.4-ish on the second test, but that was with significant overvolting on a Q9450, so not a practical real-world result. On a more reasonable voltage, he and someone else got around 3 FPS on the second test... lmao. 3 FPS counting as good, what a joke. It looks appalling too.
Yeah, I hover around 1-2 FPS all the way through the test. But then so did my 4400+ with the X1800XT. CPU speed DOES affect scores in 06. But we're talking a 9600GT getting 9500 on an AMD X2 and 10500 on a Core 2 Duo. Any decent dual core is adequate for any game. The only games that saw an improvement from OC'ing were Crysis (very minor) and Half-Life 2 (during heavy physics). Saying you can run Crysis on Intel integrated graphics because you have a quad is like saying a Geo Metro can outrun a Kawasaki Ninja because it has more wheels.
lol, very true. As for the CPU performance, 3DMark is ridiculous in that regard. The same graphics setup (stock HD3870s in CrossFire) got 13,801 on my 3.15GHz Core 2 Duo and 20,500 on a 4GHz Q6600, yet in real-world testing in Crysis they performed pretty much the same.
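Part of that is just how the final number gets put together. If I remember the Futuremark whitepaper right, the 3DMark06 total is a weighted harmonic mean in which the CPU subscore gets its own slice, so a big jump in the CPU test inflates the overall score even when the graphics subscores barely move. Here's a quick sketch in Python; the 1.7/0.3 weights are from memory and the subscores in the example are made up, so treat the whole thing as an assumption rather than gospel:

    # Rough sketch of how the 3DMark06 total is (as far as I recall) combined.
    # The 1.7 graphics / 0.3 CPU weights are an assumption from memory.
    def overall_score(sm2_score, sm3_score, cpu_score):
        """Weighted harmonic mean of the graphics and CPU subscores."""
        gs = 0.5 * (sm2_score + sm3_score)  # combined graphics subscore
        return 2.5 * 1.0 / ((1.7 / gs + 0.3 / cpu_score) / 2.0)

    # Made-up subscores: identical graphics results, only the CPU subscore changes.
    print(overall_score(7000, 7000, 2800))  # dual core -> roughly 14,300 marks
    print(overall_score(7000, 7000, 6500))  # fast quad -> roughly 17,300 marks

The graphics tests themselves also pick up a few FPS from a faster CPU, which is why the real gap can be even bigger than that little example, but none of it translates into Crysis frame rates.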
Quake 4 at medium settings was the last playable game on the X3100 integrated series... I know from playing it... I laughed my @$$ off when he said he got 30 FPS in Crysis... He'd be lucky to get 10 in a dark corner on low settings, considering 20 was the average in Q4...
Exactly what I mean. Quake 4 isn't even that demanding a game, and I don't consider 20 FPS smooth for any first-person shooter. Crysis aside, the X3100 suffers in most games. Even the 6600GT could pull smooth frame rates in Quake 4 at high settings. Even so, try the 6600GT in Crysis and watch the slideshow at the lowest settings. I'm not saying the X3100 is terrible. As engage16 has shown, it can do Half-Life 2 at fairly decent settings, and it's definitely a big leap from the GMA 950 and Extreme Graphics 2 I suffered with for years. But you can't say it can run Crysis smoothly at any settings and honestly expect us to believe you. Especially at medium settings at 1024 res. We know better.
3DMark isn't the be-all and end-all by any means, but it does mean something, and when a card scores less than 1000 marks with a decent enough processor, you know gaming is going to be pretty tedious. You can get away with less than 1000 for a fair few games, UT2004, Doom 3, even Quake 4 on low, but Crysis? I wouldn't want less than 4000.
To truly appreciate the game, I wouldn't use a system that scores less than 7000. So you're looking at X1900XT/7900GTX territory. The X1800XT could do medium at 1024 res, but the frame drops were sometimes absolutely grueling. The X1900XT was really that much better in some cases.
Overclocked, and using my 3.15GHz E4300, my X1900XT scored 6700, so we're looking more at X1900XTX / X1950 territory.
The best last-gen system I've seen run it was 7900GTX SLI with a 6000+. That could do medium-high settings at 1280 res.
I suppose it technically has been two generations now. But honestly, other than a few of the very newest (and some poorly coded) games, an X1900XTX or 7900GTX can handle high-resolution gaming just fine. The technology really isn't that dated yet, and I still consider the X1900/7900 series pretty high-end. About 90% of the games I own can be maxed on my X1800XT, and others like Assassin's Creed and World in Conflict can be played nearly maxed at 1280 res.
My point being, though, the X1900XT is still a very capable card and is probably still great for those with a 1680x1050 monitor. When I say maxed, I don't necessarily mean resolution; very few cards can max games at your resolution. Yes, tech is moving forward, but that doesn't make older tech completely obsolete. It's still useful.
Well, the X1800XT can also do BioShock, FEAR, CoD4, CoD2, Episode Two, Assassin's Creed, Unreal Tournament 3 and World in Conflict all maxed or near maxed at 1280 res. I would consider this a very common resolution for gamers, and one that can also deliver fantastic visuals with a small amount of AA. These are all fairly current games that each look awesome in their own right. So I think anyone with an X1800/7800 series card or similar is probably gaming just fine if they have a 1280 monitor or thereabouts. Even though some of those games are getting on in years, FEAR and CoD2 for example, they still make any Xbox 360 or PS3 game look like crap in comparison. Especially FEAR on the 360, which honestly looked like shit on a stick to me, even in HD.

We're just lucky that very fast cards can be had so cheap. If you can't afford a 512MB HD3850, you have no business trying to build a gaming PC. http://www.newegg.com/Product/Product.aspx?Item=N82E16814102715 And I think the 256MB version was a huge mistake; it's seriously crippled in Crysis and other games compared to the 512MB one. You honestly don't need much more power for any game unless you use high resolutions.