Nolan Walker's notes about his experiences adding shader support to Wild
Magic, both for DirectX9 and OpenGL.

------------------------------------------------------------------------------

Just as a warning, this will probably be as much me complaining about
DirectX, explaining bugs that took me a while to track down, or digressing
on implementation concerns as it is actual explanation.  Maybe that's what
you want.  This will probably be pretty long too, but I figure part of my
work was to figure out what was important and distill that knowledge for
you.  A lot of implementation details and notes are in the
CgConverter/ReadMe.txt file.  I figure that would be useful to WildMagic
users as well as you, and so that's where it is.  That might work well as a
"How to use shaders with WildMagic" document.

To start off, I know you know about shaders in general, but here is a brief
overview.  There are two kinds: vertex shaders and pixel/fragment shaders.
I think pixel shader comes from DirectX and fragment shader is an OpenGL
term.  I use pixel shader in the code (and usually when I write here),
because that term makes more sense to me in terms of what the program is
actually working on, but that's just my personal opinion.  A shader
typically has some number of inputs and some number of outputs.  Inputs
come in two flavors: varying and uniform.  These terms are pretty
Cg-specific (uniform is a keyword), but I think they logically can be used
anywhere.  A shader runs multiple times over a large set of input data
(vertices or pixels).  The varying parameters are those that change per
invocation, such as the input vertex positions or normals.  The uniform
constants are those that do not change over a set of data, such as
projection matrices or numerical constants.

Vertex shaders always have a vertex input and a vertex output.  Pixel
shaders always have a color output, but need no inputs.  The pixel shader
receives everything that the vertex shader outputs except for the vertex
output.  Vertex shaders are really the more flexible part of the entire
pipeline.  Even the lowest versions have about 100 uniform constant
registers (the most I ever used was maybe 70, for a Perlin noise table in
the vertex noise shader) and a really high instruction limit (for what most
people are doing).  Pixel shaders are much more restrictive.  The lowest
versions have about 8 uniform constant registers.  Earlier versions
(ps.1.x) have severe restrictions on when you can do certain operations
(different phases of the shader) as well as very low instruction limits.
More on this later when I talk about different shader profiles.

There are several common "tricks" that many people use with pixel shaders.
If there is vertex-specific information that needs to be passed to the
pixel shader (such as view directions, normals, or even interesting
transformation matrices), it tends to get passed in either as color data or
as texture coordinates.  It doesn't matter if your original program didn't
use any texture coordinates; you can pass as many as 8 sets of texture
coordinates from the vertex shader to the pixel shader, and you don't have
to send any texture coordinates through the vertex shader to do this,
either.  Another common trick is to use a texture as a lookup table and
then use the result as an input to look up into another texture (ps.1.4 and
above).  The charcoal renderer uses this technique to look up into a table
of random values to approximate getting a random value.  Also, as a general
rule, it is better to do calculations as early as possible.  If a vertex
shader can calculate a value, the pixel shader shouldn't do it.  If the
application can calculate a value, then the vertex shader shouldn't do it.
Especially if you need to normalize something, the vertex shader should do
that, because the pixel shader cannot normalize vectors except through an
approximation.

However, there are advantages to pushing things to the pixel shader, because
the parameters that you pass there are interpolated per pixel.  If you pass
view direction, normals, and lighting information to the pixel shader then
those vectors will be interpolated over the triangle and you can have
per-pixel lighting.

For the most part, DirectX and OpenGL shaders are very similar in their
capabilities.  A shader program is loaded into memory and compiled if
necessary.  Shader versions are typically called "profiles" (at least in
Cg-land), and the renderer will fail at this point if it cannot support
that particular profile.  Then, the renderer is told to use the shader.
You set some uniform constants.  Then, for both APIs, you do the same thing
that you would do to draw normal geometry (using streams for DX and using
array pointers for GL) and all the geometry gets passed automagically to
the shader.  That's really all there is to it.
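To make that concrete, here is a minimal DX9-style sketch of a single draw
call; the device, declaration, shader, and vertex buffer are assumed to
have been created elsewhere, and none of these names come from WildMagic:

    #include <d3d9.h>

    // Hypothetical helper: draw one piece of geometry through a vertex
    // shader, with everything created ahead of time.
    void DrawWithShader (IDirect3DDevice9* pkDevice,
        IDirect3DVertexDeclaration9* pkDecl,
        IDirect3DVertexShader9* pkVShader,
        IDirect3DVertexBuffer9* pkVB,
        const float afModelToClip[16],  // uniform constant for c0-c3
        UINT uiStride, UINT uiTriangleCount)
    {
        pkDevice->SetVertexDeclaration(pkDecl);   // incoming data format
        pkDevice->SetVertexShader(pkVShader);     // "use the shader"
        pkDevice->SetVertexShaderConstantF(0,afModelToClip,4);
        pkDevice->SetStreamSource(0,pkVB,0,uiStride);  // bind the stream
        pkDevice->DrawPrimitive(D3DPT_TRIANGLELIST,0,uiTriangleCount);
    }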

To use shaders in OpenGL, all you must do is create the shader and enable
it, just like you would generate, create, and enable a texture.  The setup
that makes the application send data to the shader program is the same as
what you would do for vertex arrays.  The application gives it pointers to
the vertex (and optionally any other) information in the same way that you
normally would in OpenGL.  All input data goes into a specific register
depending on its type.  Vertex data always goes into the same register.
Normal data always goes in the same register.  This lets the shader program
be more independent of the application that uses it.
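As a rough sketch (the program text, vertex arrays, and index array are
placeholders here, and the ARB entry points have to be fetched through the
usual extension mechanism), the OpenGL side looks something like this:

    #include <string.h>
    #include <GL/gl.h>
    #include <GL/glext.h>

    // Hypothetical setup: create and enable an ARB vertex program, then
    // point OpenGL at the data just as you would for plain vertex arrays.
    void DrawWithProgram (const char* acProgramText, const float* afVertices,
        const float* afNormals, const unsigned int* auiIndices,
        int iIndexCount)
    {
        GLuint uiProgram;
        glGenProgramsARB(1,&uiProgram);
        glBindProgramARB(GL_VERTEX_PROGRAM_ARB,uiProgram);
        glProgramStringARB(GL_VERTEX_PROGRAM_ARB,
            GL_PROGRAM_FORMAT_ASCII_ARB,(GLsizei)strlen(acProgramText),
            acProgramText);
        glEnable(GL_VERTEX_PROGRAM_ARB);        // enable, like a texture

        glEnableClientState(GL_VERTEX_ARRAY);   // vertex data: fixed register
        glVertexPointer(3,GL_FLOAT,0,afVertices);
        glEnableClientState(GL_NORMAL_ARRAY);   // normal data: fixed register
        glNormalPointer(GL_FLOAT,0,afNormals);
        glDrawElements(GL_TRIANGLES,iIndexCount,GL_UNSIGNED_INT,auiIndices);
    }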

DirectX requires that the shader know the exact format of the incoming data.
This includes not only the order of the varying parameters (vertex data,
normals, colors, texture coordinates) but also which ones there are and what
input register in the shader program they should use.  In the case of
WildMagic, this means that the incoming data is formatted exactly how the
shader needs it.  This is required at shader compile/creation time in DX8.1.
It is most likely the case that you can change this at a later time in
DX9.0, but it will still be necessary.
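For example, a DX9-style vertex declaration for geometry with a position, a
normal, and one set of 2D texture coordinates might look like this (a
hedged sketch, not WildMagic code):

    #include <d3d9.h>

    // One stream; offsets are in bytes (position 0-11, normal 12-23,
    // texture coordinates 24-31).
    IDirect3DVertexDeclaration9* CreateDeclaration (
        IDirect3DDevice9* pkDevice)
    {
        D3DVERTEXELEMENT9 akElements[] =
        {
            {0,  0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT,
                D3DDECLUSAGE_POSITION, 0},
            {0, 12, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT,
                D3DDECLUSAGE_NORMAL, 0},
            {0, 24, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT,
                D3DDECLUSAGE_TEXCOORD, 0},
            D3DDECL_END()
        };
        IDirect3DVertexDeclaration9* pkDecl = NULL;
        pkDevice->CreateVertexDeclaration(akElements,&pkDecl);
        return pkDecl;
    }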

Digression: One bug I had to fix was that all the text kept getting pixel
shaded as well, so apparently I wasn't turning off shaders when I should
have.  I checked, and DirectX disables shaders strangely.  OpenGL, as you
might expect, uses enable and disable calls to enable and disable shaders.
DirectX, on the other hand, has no real disable call.  There are
SetVertexShader and SetPixelShader calls.  To unset pixel shaders you have
to call SetPixelShader(NULL).  To unset the vertex shader, you have to set
the vertex shader to some sort of stream format that you expect other data
to be in, like D3DFVF_XYZ.  The SetVertexShader call is overloaded very
strangely, I think.
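In code, the asymmetry looks roughly like this (DX8-style calls, as
described above; the helpers are hypothetical):

    #include <d3d8.h>
    #include <GL/gl.h>
    #include <GL/glext.h>

    void DisableShadersGL ()
    {
        glDisable(GL_VERTEX_PROGRAM_ARB);      // symmetric enable/disable
        glDisable(GL_FRAGMENT_PROGRAM_ARB);
    }

    void DisableShadersDX (IDirect3DDevice8* pkDevice)
    {
        pkDevice->SetPixelShader(NULL);        // "no pixel shader"
        pkDevice->SetVertexShader(D3DFVF_XYZ); // an FVF code, not a handle
    }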

Most shader programs use uniform constants in some way.  From the
perspective of the driver, there are several different ways these can be
handled.  DirectX has one global space of constants that it will pass.  If
you set register 3 with data for shader X and then shader Y runs that also
needs information in register 3, it will still be there.  OpenGL divides up
its constants into "local" and "environment" spaces.  Local parameters, as
you might expect, are local to the programs themselves (when set, they
apply to the currently bound shader; you can't specify which shader they
are for).  Environment parameters can be shared among programs.  The
assembly language programs themselves specify whether any given uniform
constant is a local or an environment parameter.
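A sketch of the difference, assuming an already-created program object:

    #include <GL/gl.h>
    #include <GL/glext.h>

    void SetParameters (GLuint uiProgramA)
    {
        GLfloat afValue[4] = { 1.0f, 0.5f, 0.25f, 1.0f };

        // Local: applies only to the currently bound program.
        glBindProgramARB(GL_VERTEX_PROGRAM_ARB,uiProgramA);
        glProgramLocalParameter4fvARB(GL_VERTEX_PROGRAM_ARB,3,afValue);

        // Environment: shared by every program that reads program.env[3].
        glProgramEnvParameter4fvARB(GL_VERTEX_PROGRAM_ARB,3,afValue);
    }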

For WildMagic, the users really need a concept equivalent to OpenGL's
local parameters, regardless of API.  If you are using DirectX and there is
a scene graph with multiple objects that all use constant register 3 (but
need a different value), there is no way for the user to set all of those
values within a single render call.  The user could try to munge it by
splitting up the scene graph and rendering things separately, but then
they've eliminated the usefulness of the scene graph itself.  Therefore,
WildMagic needs to emulate the idea of "local parameters".

One of the biggest limitations of DirectX, in my opinion, is that the
driver will not pass renderer state automatically to shader programs.
OpenGL will do it for vertex shaders, but not for pixel shaders.  Renderer
state is used in virtually all vertex shaders (to transform from model to
clip space).  Thus, the user themselves has to keep track of the current
transformation matrix and pass it to the shader as a constant.  In the
assembly shader language that OpenGL uses right now, you can refer to
state.matrix.mvp.inverse as a variable and the driver itself will update
the constant with the correct renderer state.  If you want state in pixel
shaders, though (regardless of API), you have to do it yourself.

For WildMagic, it's not like the users themselves could update this state
manually, because it needs to be updated for every different piece of
geometry using a shader, all of which happens during a single render call.
Thus, the renderer itself needs to know what state to update.
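Something like the following has to happen inside the renderer for every
shaded object (a hypothetical sketch; all names are placeholders, not
actual WildMagic symbols):

    #include <d3d9.h>

    // For each object drawn in a render call, recompute its model-to-clip
    // matrix and upload it by hand, since the driver will not track it the
    // way OpenGL does for vertex programs.
    void SetTransformConstant (IDirect3DDevice9* pkDevice,
        const float afViewProj[16], const float afWorld[16])
    {
        float afMVP[16];  // afMVP = afViewProj*afWorld (row-major)
        for (int iRow = 0; iRow < 4; iRow++)
        for (int iCol = 0; iCol < 4; iCol++)
        {
            float fSum = 0.0f;
            for (int i = 0; i < 4; i++)
                fSum += afViewProj[4*iRow+i]*afWorld[4*i+iCol];
            afMVP[4*iRow+iCol] = fSum;
        }
        pkDevice->SetVertexShaderConstantF(0,afMVP,4);  // c0-c3 assumed
    }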

The other issue with DirectX (at least with 1.x vertex programs) is that
numerical constants cannot be set automatically.  What you have to do is
pass them in as uniform constants.  OpenGL can define numerical constants
from within the program itself.  Somewhat understandably, pixel shaders in
both APIs allow you to define numerical constants, but this is probably
because most pixel shaders have VERY few uniform constant registers for you
to pass numerical constants in.

These three needs (local parameters, updating of state constants, and
numerical constants) are the reason the ShaderConstants class was created.
This class holds all the information about all the constants that a shader
needs.  It holds both the values and an enumeration of what each constant
is (user-set, numerical constant, state constant).
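A hypothetical sketch of the idea (the real WildMagic class differs in its
details):

    #include <cstring>

    // Each constant knows its registers, its current value, and who is
    // responsible for filling it in: the user, the shader file itself
    // (numerical), or the renderer (state).
    class ShaderConstant
    {
    public:
        enum Type { SC_USER_DEFINED, SC_NUMERICAL, SC_STATE };

        ShaderConstant (Type eType, int iRegister, int iRegisterCount)
            : m_eType(eType), m_iRegister(iRegister), m_iCount(iRegisterCount)
        { std::memset(m_afData,0,sizeof(m_afData)); }

        Type GetType () const { return m_eType; }
        int GetRegister () const { return m_iRegister; }

        void SetData (const float* afData)
        { std::memcpy(m_afData,afData,4*m_iCount*sizeof(float)); }

        const float* GetData () const { return m_afData; }

    private:
        Type m_eType;        // who sets this constant
        int m_iRegister;     // first constant register occupied
        int m_iCount;        // number of 4-float registers (at most 4 here)
        float m_afData[16];  // current value
    };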

Implementation digression: The constants class gets doubly used.  Every
shader object itself has a shader constants class.  This is what I name in
the code "template constants".  Any piece of geometry needs to know what
sort of constants the shader needs.  The vertex and pixel shader files that
the shaders write out to (in the CgConverter, for example) contain this
template constants class.  However, constants also need to be local to each
piece of geometry, so whenever you attach a shader to a piece of geometry,
a copy of the constants class from the template instantiation in the shader
gets placed in the geometry object as well, for it to change locally.  This
setting happens under the covers and (obviously) invalidates any constants
that were there before.  I mention this in the CgConverter Readme, but
whenever you (re-)attach a shader to a piece of geometry, you must make
sure to set the constants, as the sketch below illustrates.
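A hypothetical usage sketch (the types and method names are illustrative,
not WildMagic's actual API):

    // Re-attaching a shader replaces the geometry's constants with a
    // fresh copy of the template, so set the local values *after*
    // attaching.
    void AttachAndConfigure (Geometry* pkGeometry, VertexShader* pkShader,
        const float* afLocalColor)
    {
        pkGeometry->SetVertexShader(pkShader);  // copies template constants
        ShaderConstants* pkConsts =
            pkGeometry->GetVertexShaderConstants();
        pkConsts->SetConstant("DiffuseColor",afLocalColor);
    }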

A side note: It is interesting to see (aside from the very minor point of
having state constants in pixel shaders, which is uncommon) that the shader
constants class lets the user do in DirectX what would have been done for
them automatically in OpenGL.  If you remember the large amount of inline
arrays and setup that was in the preliminary application we looked at
during a meeting, this was what all that work was for.

There is also a really large need for a high level language that compiles
into multiple API-specific languages.  Besides all the good things that a
high level language gets you, like being able to compile your programs into
future profiles, having automatic optimization, or being able to compile
into different APIs, it's also just a whole lot easier to write (and
debug).  Cg has a nice library of really useful functions which makes it
incredibly easy to do common things like calculate lighting, get sine and
cosine values, or even just normalize a vector.  Not that I couldn't have
written things in assembly language, but with the amount of debugging and
fiddling with programs that I had to do to get them to work, it would have
made my work tenfold harder to do it all in assembly.  You can also write
functions.  They get treated as inline functions rather than true function
calls (no stack on the card, obviously), but that makes writing shaders a
lot easier (and cleaner!).

I picked Cg to write all the shaders in mostly because it's really the
only cross-API high level language that exists at this point.  OpenGL 2.0,
when it comes out (eons from now), might also be a choice, but it is likely
that its high level language will only work with OpenGL.  Cg, despite being
tied to NVIDIA, is released free for everyone to use, compiles to different
APIs, and works on different cards.  I suspect that for this next
generation of shader technology (GeForce FX, Radeon 9x00) it will continue
to be the only high level language that is meant to be able to compile into
multiple APIs.
