What can we expect from PhysX?

Graphics cards, 10 August 2008, written by MadsH
Viewed: 908 times.

MadsH
 
Elite user
Added:
10-08-2008 10:56:17
Replies/Posts:
1915/57
Well, this article from Guru3D sounds pretty exciting: http://www.guru3d.com/article/...

So I have to say I'm looking forward to the 12th, when the PhysX pack from Nvidia arrives with new drivers (hopefully WHQL, since the beta is already out). And then MultiGPU mode can be run 🙂



Final thoughts
So then, I have to admit I like what I tested today. Overall, gaming with PhysX adds a much more immersive experience. NVIDIA's implementation as it stands right now is downright good, as they offer combinations. If your graphics card is too slow to handle both graphics + PhysX .. you can always add a cheap GeForce 8 class card and run PhysX on that. If you have a more high-end product, chances are pretty good that you can run both graphics + PhysX really well on the one graphics card with titles like GRAW2, even at the highest resolutions. And power-consumption wise, that's the best way to go, honestly.

Me personally, I just really like the combo of my GeForce GTX 280 plus an additional GeForce 9600 GT dedicated to PhysX processing. Bitchin' high framerates combined with a much more immersive gaming experience. It's good, really good, and certainly doesn't disappoint. But the choices offered are the deal-maker for me, I respect that.

Expand the horizon a little bit more .. if you think about it: with a GeForce 8200 chipset integrated mainboard (available already), you could run PhysX on the mainboard's integrated GPU and just add a dedicated GPU (graphics card) for gaming, and you'd have the best of both worlds combined with very acceptable power-consumption levels.

The enigma however remains that though a userbase of 70 million potential GPUs can open up broader game support, it's still based on CUDA. And though CUDA is an open standard, it's tied to GeForce graphics cards. ATI surely won't use it, and neither will Intel or even S3. So a large chunk of the market (read: ATI users) cannot access this technology. However, I do see an interesting option: use a GeForce card solely for PhysX, and maybe even a Radeon card to render the games. Unless NVIDIA prevents this in its drivers, of course .. from my point of view, there is no obstruction to that construction.

Fact is, for PhysX to succeed, the software industry will need to see broad support spread out over all vendors. In roughly 18 months DirectX 11 will be released. And that's where everything will change for sure, as it'll introduce a new shader type: Compute Shaders. Can you already guess what that is? 🙂 Until that time-frame, game developers who support GeForce PhysX will likely do so in a very limited way. Hey -- additional programming eats up resources, and since developing a good game these days can already cost 40 million USD, it will be a challenge for NVIDIA to get that message across. But Compute Shaders and PhysX computing on the GPU have now reached a level where they can't be ignored. And if you guys dig the technology .. let us know. Write about it in the forums, spread the word. If more and more people get interested, the game industry will inevitably notice and support it. It's a question of demand, guys, nothing else. Looking at this from a software developer's point of view, shit man, I'd take the plunge and thus the opportunity. You are given a set of tools to do unique things in games. Features that the competition does not have. So my plea to all software houses: think about that thesis for a minute, the crowd (the gaming community) would go wild.

What about the CPU then? Computing PhysX on the CPU is the conservative, traditional way. Fact is, with the fast floating-point processor that any GPU really is, the GPU can compute that physics data so much faster than the CPU. Think 10x, maybe even 20x faster. This is why we used a fairly average GeForce 9600 GT today. And as you saw, it had absolutely no issues with the settings we threw at it.
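To illustrate why physics maps so well onto a GPU (a minimal sketch of my own, not code from the article): a typical physics step updates thousands of particles, and each particle's update is independent of all the others. A CPU walks through them one by one; a GPU hands each one to its own thread and does them all at once, which is where the 10-20x figure comes from.

```python
def step_particles(positions, velocities, gravity=-9.81, dt=0.016):
    """Advance every particle by one Euler time step (CPU, serial).

    On a GPU (e.g. via CUDA), each iteration of this loop would run
    as its own thread, all in parallel, because no particle's update
    depends on any other particle's result.
    """
    new_positions = []
    new_velocities = []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy = vy + gravity * dt                 # apply gravity to velocity
        new_velocities.append((vx, vy))
        new_positions.append((x + vx * dt, y + vy * dt))
    return new_positions, new_velocities

# Example: two particles falling for one 16 ms frame
pos, vel = step_particles([(0.0, 10.0), (5.0, 10.0)],
                          [(1.0, 0.0), (0.0, 0.0)])
```

Real PhysX does far more (collisions, fluids, cloth), but the same independence argument is what lets the GPU's hundreds of floating-point units chew through it.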

We mentioned it broadly, and it's the basis for this article: NVIDIA is pretty keen on gaining broad PhysX support, and is going to introduce several free-to-download PhysX packs for games.

The proof is in the pudding. NVIDIA will release its PhysX Pack #1 next Tuesday, August the 12th. It will contain PhysX-related freebies that will get you guys going on the right track. This is what you can expect:

NVIDIA PhysX Pack #1

Full version of Warmonger
Full version of UT III PhysX Mod Pack (Includes three PhysX-specific levels)
Latest Ghost Recon Advanced Warfighter 2 patch (1.05)
First peek at Nurien, an upcoming social networking service, based on UE III
First peek at Metal Knight Zero, an upcoming PhysX-capable title
"The Great Kulu" technology demo
"Fluid" technology demo
Obviously, the release of the GeForce Forceware 177.79 drivers and the new PhysX 8.07.18 drivers is pending for that date as well.

So in closing, hats off to NVIDIA for what they did here. The features are good, your options for choosing whichever PhysX solution you want to compute on are wide-ranging, and the gaming experience is just great. If you own a capable card, heck, it's even free to try out. The introduction of PhysX is added value, no matter how you look at it.

PhysX is exciting technology. It tastes good and I want more, yet we need to see some bigger and newer titles supporting it. Two titles with support that I look forward to are Empire: Total War and Mirror's Edge. But for now, just update GRAW2 to v1.5 and go play with PhysX yourself. You'll notice an intense overall difference, not just in the old Ageia level but throughout the entire game, and your gaming experience will be a good notch more realistic. Heck, if you like the game, grab UT3 and install the PhysX mod. Any game in combination with PhysX options, you'll like it for sure. So my recommendation: if you have a CUDA-ready GeForce 8 class or newer graphics card, just download the pack next week and give it a try. Test the games, have a look at the really incredible PhysX technology demos like Fluid and Great Kulu, and just play around with Nurien as well, these chicks rock hard man.

It's new immersive content that you never saw rendered this way before, and I guaran-frickin-tee that you'll love it.


HuckJam
 
Overclocker
Added:
10-08-2008 11:33:22
Replies/Posts:
36/9
We'll have to see. I certainly hope it won't end up NVIDIA-only and that ATI gets something similar made as well.


MadsH
 
Elite user
Added:
10-08-2008 11:45:13
Replies/Posts:
1915/57
From what I can read in several other places, there's nothing preventing it from running on ATI as well. The question is simply whether they're interested, or whether it's more interesting for them to keep developing Havok.


1EaR
 
Elite user
Added:
10-08-2008 13:07:05
Replies/Posts:
5750/124
#1+2 PhysX is an Nvidia company, so I hardly think they'll let the competitor take a share of their hard work just by sending them a good portion of code. Havok, on the other hand, is Intel's, and AMD/ATI are allowed to use it since it's software physics (as far as I remember). Nvidia may use Havok too.


noxon
 
Elite user
Added:
10-08-2008 13:39:32
Replies/Posts:
3321/21
Nvidia has offered ATI PhysX support.


E2X
 
Elite user
Added:
10-08-2008 13:43:22
Replies/Posts:
4018/217
And that's because Nvidia is afraid of what Intel is coming out with. That's why they're offering AMD/ATI PhysX, so that it becomes what all the game developers end up using, and Intel runs into problems in the future when they arrive with their CPU/GFX solution.


Tenkin
 
Elite user
Added:
10-08-2008 14:28:02
Replies/Posts:
3332/33
But so far ATI/AMD has declined Nvidia's offer.

😀


HoutN
 
Overclocker
Added:
10-08-2008 16:02:28
Replies/Posts:
62/21
That, plus the new DirectX 11 that's coming.. Games are almost starting to come alive now :)
I don't dare play anymore, I quit.