LazyCoder, real blur is pixelshader3.0 only.
Remember that if I compiled some pixel shaders in a specific version, it's because it's the minimum version to make it work correctly (I always try to compile a shader with the minimum version). :)
But I have 2 very new video cards (ATI Radeon HD 3870).
Quote:
Originally Posted by Looki
Oh and also, some of the blur ones don't show up: blur, blur borders, and sharpen. (I don't know about the ones that are not in the blur folder though.)
I am making this sticky for a while, (as with the other post).
These packs should be up front and easy to find/test.
Maybe, after they are finalized and we get closer to a final HWA, we can have a link on the sidebar for a shader page, or something like that.
[size:14pt]The file to download (last available version) ;)[/size]
Hi Sphax,
First of all, thanks for making these for us. However I'm having a bit of a problem.
I downloaded the pack and installed it. However, I can't seem to find it inside MMF. Is there any documentation or tutorials on how to use this? Is it listed with the other extensions?
Could you, or someone who knows about it, please help me?
All shaders go into this folder:
\<Your Multimedia Fusion Folder>\Effects\
His installer should take care of that though.
The shaders don't show up in the extensions list, but you can add them to existing Active Objects (if you run the HWA-enabled .exe instead of the normal MMF2 version).
You add a pixel shader to an object by selecting it in the "Effect" property of your object (where you would otherwise find semi-transparent/additive/XOR effects and so on).
Hmmm... from what I can tell it installed properly. The Effects folder is the first place I checked for docs. I still don't know what an HWA is. Does that mean that I have to have the Developer version of MMF2?
Because I only have MMF2 standard.
http://www.clickteam.com/epicenter/ubbthreads.php?ubb=showflat&Number=74202
HWA is short for "HardWare Accelerated".
Yves released a beta patch to allow MMF2 to create games where the graphics are accelerated by your graphics card so you can make much more complex effects and more objects on the screen at one time with almost no slowdown.
To use these shaders you must install the HWA beta and run the HWA-enabled exe file. You can easily have the non-HWA and the HWA version of MMF2 installed at the same time. Just note that you should back up all your files before you start working on them in the HWA-enabled version, since it changes the files so that the non-HWA version of MMF2 cannot open them properly anymore if you use any of the new special features in the beta.
Also, remember that it is a beta, it should only be used for testing :)
Aaaah, so that's what it means. I guess pixel shaders would be very slow if they just relied on the CPU.
Too bad it's only for testing though, I was planning to use it in a game. Maybe the effects I had wanted to use can be put into the sequel.
Thanks for the help, I really appreciate it. :)
By saying "only for testing" I meant that you shouldn't use it for serious commercial stuff or stuff that needs to be stable and so on. Once it is out of beta, it should be ready for that. So there is nothing wrong with going ahead and working on your game in the HWA version :)
As long as you keep backups :)
Quote:
Originally Posted by Andos
The Bloom2D has a typo in one of the parameters (bloomstrenght). I don't know if you or Looki made this, but I just wanted to say...
That's an effect I made and which is a part of the pack...
I'll try to change it without breaking compatibility.
my 7600gt supports ps3.0 but real blur refuses to show anything :(
woops forgot to add, friggen awesome package sphax :D
I have a 7600Go and it works... :)
Maybe you don't use the parameters correctly or don't have the latest drivers?
Real blur never shows for me either, and I have the same graphic card as Mook06.
Not working for me either... 8800 GTS.
I think we've got enough evidence now :P
Yeah. :)
I'll re-check this pixel shader. ;)
drivers are up to date, but im not sure if im setting up the parameters correctly :sleep:
I have a really DOH! question that has been bothering me a while about the new shader system.
There are effects, and there are parameters. They are rather easy to set up in the properties, as you simply select an effect from the available list of shaders and then set the parameters there.
Now, when you want to do real-time manipulation of the parameters as actions in your events, you use parameter names that do not coincide with the ones in the Properties. Some of those parameters are at least mnemonic, though they might affix a letter; others are just two letters (one lowercase, one capital), as in "iX".
Is there a reason for that convention? I know it involves more typing to use something like "X center" over "iX", but readability and comprehension are an issue. Maybe readability and comprehension are at odds with the amount of typing?
I am also wondering if, in the future, the shader system will allow users to choose a named parameter for an effect from a list, rather than have to refer to the notes on what can be set during runtime. Like: Set Effect, "Mirror" from the available list, and then find the corresponding effect parameters, (also a get data from object) to use in one expression, with maybe 0 or -1 as meaning no change from the current setting.
I love shaders and they are great, but it seems to be moving away from the original way that MMF2 and most objects work ... a little counter-intuitive, but then, maybe I am just not looking at it correctly or with an open mind.
Is this making sense?
The "ease of use" part must be done by Clickteam, not by pixel shader programmers. ;)
Novabrain, that's just for organization. :)
For example iX would be an integer, fCoeff a float.
Anyway, you're right about the last thing, but I think that's up to Yves, isn't it?
He'd need to code that, if I understood you correctly.
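For illustration, here is roughly what such a parameter block looks like inside an .fx effect file. The parameter and sampler names below are made up for the example, not taken from Sphax's pack:

```hlsl
// Hypothetical parameter declarations for an MMF2 HWA effect (.fx).
// The prefix encodes the HLSL type the runtime expects:
// "i" for int, "f" for float.
int   iX;      // integer X offset, set from events as "iX"
int   iY;      // integer Y offset, set from events as "iY"
float fCoeff;  // floating-point coefficient, set as "fCoeff"

sampler2D img : register(s0);  // the object's texture

float4 ps_main(float2 uv : TEXCOORD0) : COLOR0
{
    // A trivial body just to show a parameter in use:
    // scale the sampled colour by the float coefficient.
    return tex2D(img, uv) * fCoeff;
}
```

So "iX" is not arbitrary: the leading letter tells you which data type to pass when you set the parameter at runtime.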
Sphaaaaaaaaax. ;_;
Lol Looki. :)
Quote:
Originally Posted by Looki
What parameters should I try with Real Blur? Better yet, why not build an exe with one working on your 7600Go and we can check if it works for us :whistle:
No one here said it works, so no one can build a working one.
Quote:
Originally Posted by Mook06
threw together a test app
Works for me, so I'm not sure what's wrong for you guys (unless you're using a "large" coefficient).
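One possible reason the coefficient matters: blur shaders sample the texture many times in a loop, and how many samples a given PS version (and driver) will accept is limited. A minimal box-blur sketch, purely illustrative and not the Real Blur source, with made-up parameter names:

```hlsl
// Illustrative horizontal box blur (not Sphax's shader).
float fRadius;      // blur spread in pixels (the "coefficient")
float2 fPixelSize;  // (1 / texture width, 1 / texture height)
sampler2D img : register(s0);

float4 ps_main(float2 uv : TEXCOORD0) : COLOR0
{
    float4 sum = 0;
    // Fixed 9-tap loop: compiles even on low shader models. Real blur
    // shaders often take many more samples, or derive the tap count
    // from the coefficient, which pushes them into PS 3.0 territory
    // where driver behaviour (grey squares, blank output) can differ.
    for (int x = -4; x <= 4; x++)
        sum += tex2D(img, uv + float2(x, 0) * fPixelSize * fRadius);
    return sum / 9.0;
}
```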
That doesn't work for me. Depending on the parameters, I get a grey/white picture. As said above, GeForce 8800 GTS 640MB.
8800 GTS 320MB on WinXP x64: blank. Gray square at some parameters: pic
same thing here :(
I'll go check what my older 6800 does
Works on my laptop (ThinkPad T61 with Mobile Intel 965 Express) and XP.
Well uhm, my computer supports the latest DirectX and pixel shader version. So I guess only old PCs can use this pixel shader?
Now that's jumping to conclusions :)
Haha, yeah well, I just bought it like a week ago and it has a 9800 GTX2. Which was kinda like the newest by that time.
edit: by old PCs I mean computers with a lower pixel shader version.
same grey square on my 6800
Well, that's strange, it doesn't work for me anymore either... I'll check the code ASAP. ;)
Works on a Radeon X1950 Pro. DirectX 9.0c(4.09), WinXP. Pretty sure it's a PS 3.0 card.
no worky on fresh install of everything on new 8800gt
Hey,
I'm entering this competition called "Gamma3D" and I have been using the "Additive Blend" shader and the RGB coefficient to work with the red, blue, and green channels. I have a problem though: the fact that I am using additive blending means that it will add colors regardless of what I want it to do. Is there any way to make it so it only adds the colors that I choose, and doesn't add the colors of things I don't want? For example, let's say I have a ball with 2 actives, one red and one blue, and I set them to additive blend, and then I have a character or something that isn't additive blend. Is there any way to make the red and blue balls not react to the character's color?
I appreciate your help, thanks. (Also, what I am talking about may be called target rendering.)
Krim
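For context on the question above: standard additive blending is applied per pixel against whatever is already drawn behind the object, so by itself it cannot pick and choose which background colours to add. Conceptually it boils down to this (a sketch of the blend math, not any specific shader in the pack):

```hlsl
// Additive blending: the object's colour is summed with the
// destination pixel, clamped to [0, 1] per channel.
float4 additive(float4 src, float4 dst)
{
    return saturate(src + dst);
}
// Since "dst" is simply whatever was drawn underneath, the red and
// blue balls also pick up the character's colours. Restricting the
// effect to chosen objects would indeed need target rendering:
// draw only those objects into a separate texture first, then
// blend that texture over the scene.
```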