8/8/08

New 4k coming, Nvidia Zero

Real life has kept me from programming for a long time. Fortunately things are sorting themselves out, so I have a new 4k coming up (in conjunction with s!p this time). This one is heavy on the shaders, and Pohar has challenged me to get 20 shader pairs into a single 4k. So be it, Pohar - 20 shader pairs it is ;-).

However, blogs with no useful stuff in them suck so here goes.

Mentor recently pointed out on Pouet that he used the "noise" function for his brilliant Himalaya 1k. Only noise does not run on graphics cards at hardware speed - he used it in software! This got me thinking that it would be possible to pre-render cool textures using shaders in SOFTWARE during the pre-calc of a 4k/64k. The idea was simple: write shaders as normal but use the powerful functions that drop back to software (noise, dFdx etc.), have them render to a back buffer slowly during the "loading" section of the product, capture the result into a texture, and then use those textures in real-time shaders later.

A simple example:

/* pass-through vertex shader */
const char *vsh = "void main(){gl_Position=ftransform();}";

/* fragment shader built on GLSL's built-in noise2() */
const char *fsh =
    "void main(){"
    "  gl_FragColor = vec4("
    "    sin(30.0*length(noise2(gl_FragCoord.xy*0.01)))"
    "  );"
    "}";


Results in:


Great - finally a way to do interesting textures at 4k in very few bytes (assuming you are using shaders anyway).
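For what it's worth, the capture step is only a handful of GL calls. Below is a minimal sketch of how it could be wired up - not the actual intro code - assuming a GL 2.0 context already exists, the window is at least size x size pixels, an extension loader (GLEW here) supplies the shader entry points, and all error checking is skipped. The helper name precalc_texture is made up for illustration.

/* Sketch: compile the shader pair, render it once over the back buffer,
 * then capture the frame into a texture for the real-time shaders later.
 * Assumes an existing GL 2.0 context and GLEW for the entry points.      */
#include <GL/glew.h>

GLuint precalc_texture(const char *vsh, const char *fsh, int size)
{
    /* compile and link the two shaders */
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(vs, 1, &vsh, 0);
    glShaderSource(fs, 1, &fsh, 0);
    glCompileShader(vs);
    glCompileShader(fs);

    GLuint prog = glCreateProgram();
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);

    /* draw one quad with the (software-speed) shader into the back buffer */
    glUseProgram(prog);
    glViewport(0, 0, size, size);
    glRecti(-1, -1, 1, 1);

    /* grab the back buffer into a texture for use in real time later */
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 0, 0, size, size, 0);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glUseProgram(0);
    return tex;
}

In a real 4k the wrapper and parameters would of course be folded away to save bytes, but that is the principle: slow shader once at load time, cheap texture lookup at run time.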

Only. No. This is OpenGL, not DirectX texture rendering. Nothing works the way it should. In this case the correct results (as per the GLSL specification) are achieved on ATi. On Nvidia the rules break and noise - the key to these textures - returns a big fat zero. This renders (you see what I did there :-) the whole idea useless.

ATi 1 - Nvidia 0.

So, noise on Nvidia cards returns 0 in GLSL. Sigh.