# Shaders
In nope.gl, shaders are written in GLSL, but inputs and outputs are
abstracted by the tree. Similarly, the `#version` directive and precision
qualifiers are controlled outside the shader strings. This means that
user-provided shaders usually only contain functions.

Additionally, various helpers are provided to help write portable shaders, so that node trees can be as portable as possible.
## Vertex shader

### Builtin inputs
When constructing a graphics pipeline (`Draw` node), the geometry information
needs to be transmitted to the vertex stage. To achieve that, nope.gl
provides the following inputs to the vertex stage:
| Type   | Name                 | Description |
|--------|----------------------|-------------|
| `vec3` | `ngl_position`       | geometry vertices, always available |
| `vec2` | `ngl_uvcoord`        | geometry uv coordinates, if provided by the geometry |
| `vec3` | `ngl_normal`         | geometry normals, if provided by the geometry |
| `int`  | `ngl_vertex_index`   | index of the current vertex |
| `int`  | `ngl_instance_index` | instance number of the current primitive in an instanced draw call |
Note: these are commonly referred to as `attribute` or `in` in the GL lexicon.
### User inputs

To make more vertex attributes accessible, it is possible to add them to
`Draw.attributes` or `Draw.instance_attributes`. These attributes will be
set as inputs to the vertex shader. Since these parameters are dictionaries, the
user can access these inputs in the shader using the same names as the dictionary keys.
For example, given the following construct:
```python
center_buffer = ngl.BufferVec3(...)
color_buffer = ngl.BufferVec4(...)
draw = ngl.Draw(geometry, program)
draw.update_attributes(center=center_buffer, color=color_buffer)
```
The vertex shader will get two extra attributes:
- `center` of type `vec3`
- `color` of type `vec4`
### Builtin variables

Variables are another type of input made accessible to the shader.
nope.gl provides the following builtins:
| Type   | Name                    | Stage    | Description |
|--------|-------------------------|----------|-------------|
| `mat4` | `ngl_modelview_matrix`  | vertex   | modelview matrix |
| `mat4` | `ngl_projection_matrix` | vertex   | projection matrix |
| `mat3` | `ngl_normal_matrix`     | vertex   | normal matrix |
| `vec2` | `ngl_resolution`        | fragment | viewport size |
Note: these are commonly referred to as uniforms in the GL lexicon.
The matrix symbols are always available in the vertex shader; their values are
controlled by a parent `Camera` node.

A typical vertex shader will do the following computation to obtain the
appropriate vertex position: `ngl_projection_matrix * ngl_modelview_matrix * vec4(ngl_position, 1.0)`.
Similarly, the normal vector is generally obtained using `ngl_normal_matrix * ngl_normal`.
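For illustration, a minimal vertex shader doing only this position transform could look like the following sketch (remember that user shaders contain only functions; `ngl_out_pos` is the position output described below):

```python
# Sketch of a minimal vertex shader string: the #version directive and the
# input/output declarations are handled by the engine, so only the functions
# are written here.
vertex_shader = """
void main()
{
    ngl_out_pos = ngl_projection_matrix * ngl_modelview_matrix * vec4(ngl_position, 1.0);
}
"""
```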
### User variables

To make more variables accessible to the vertex stage, it is possible to add
them to `Draw.vert_resources`. Since this parameter is a dictionary, the user can
access these inputs in the shader using the same names as the dictionary keys.
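For example, a minimal sketch (reusing the `draw` node from the attribute example above, with an illustrative key name `scale`):

```python
# "scale" becomes accessible under that name in the vertex shader
scale = ngl.UniformFloat(2.0)
draw.update_vert_resources(scale=scale)
```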
### Output position

To output the vertex position from the vertex shader, `ngl_out_pos` should be
used.

Note: this is commonly referred to as `gl_Position` in the GL lexicon.
### Outputs (to fragment)

To transmit information from the vertex stage to the fragment stage, the
outputs (vertex side) and inputs (fragment side) need to be declared as
`IO*` nodes in `Program.vert_out_vars`.

Note: these are commonly referred to as `in`/`out` (or `varying`) in the GL lexicon.
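For example, a minimal sketch forwarding the uv coordinates to the fragment stage, assuming an illustrative variable name `var_uvcoord` and the `IOVec2` node from the `IO*` family:

```python
# var_uvcoord is an illustrative name; it will be writable in the vertex
# shader and readable in the fragment shader under that same name
program = ngl.Program(vertex=..., fragment=...)
program.update_vert_out_vars(var_uvcoord=ngl.IOVec2())
```

The vertex shader would then write it (e.g. `var_uvcoord = ngl_uvcoord;`) and the fragment shader would read it under the same name.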
## Fragment shader

### Inputs
No builtin inputs are provided by the engine. User variables (see next section)
and vertex outputs declared using Program.vert_out_vars are the only
accessible inputs.
### User variables

To make more variables accessible to the fragment stage, it is possible to add
them to `Draw.frag_resources`. Since this parameter is a dictionary, the user can
access these inputs in the shader using the same names as the dictionary keys.
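For example, a minimal sketch exposing a uniform color to the fragment stage under the illustrative key `tint` (reusing the `draw` node from the earlier examples):

```python
# "tint" becomes accessible under that name in the fragment shader
tint = ngl.UniformVec4(value=(1.0, 0.5, 0.0, 1.0))
draw.update_frag_resources(tint=tint)
```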
### Output color

To output the color from the fragment shader, `ngl_out_color` should be used.

Note: this is commonly referred to as `gl_FragColor` in the GL lexicon.
In some cases (typically to render to a cube map), the user may want to output
more than one color. The Program.nb_frag_output parameter exists for this
purpose.
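For example, a sketch (illustrative values) declaring 6 fragment outputs, e.g. one per cube map face:

```python
# nb_frag_output controls how many color outputs the fragment shader exposes
program = ngl.Program(vertex=..., fragment=..., nb_frag_output=6)
```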
## Textures
Just like variables, textures are considered resources for the different
stages of the pipeline. This means they need to be added to
Draw.vert_resources, Draw.frag_resources or Compute.resources to be
respectively made accessible to the vertex stage, fragment stage or compute
stage.
Textures need special care in nope.gl, mainly because of the hardware
acceleration abstraction provided by the engine for multimedia files.
A Texture2D can have different types of source set to its data_src
parameter: it can be arbitrary user CPU data (typically Buffer* nodes) as
well as video frames (using the Media node).
To sample a Texture2D from within the shader, multiple options are available:
- `ngl_texvideo(name, coords)`: this picking method is safe for any type of `data_src`, but it is recommended to only use it with a `Media`. The reason for this is that it adds a small overhead by checking the image layout in the shader: in the case of a media, that format can change dynamically due to various video decoding fall-back mechanisms. On the other hand, it is the only way to benefit from video decoding acceleration (external samplers on Android, VAAPI on Linux, etc).
- `texture(name, coords)`: this picking method should be used if and only if the `data_src` is not a `Media`. While it may work sometimes with a `Media`, it definitely won't if the video gets accelerated. The only safe way to use this picking method with a `Media` is to have `Texture2D.direct_rendering` disabled, but that may cause a costly intermediate conversion. If the `data_src` points to a user CPU node, then it is the recommended method.
- `imageLoad(name, icoords)`: if pixel-accurate picking is needed, this method can be used. `icoords` needs to be integer coordinates. To use this method, it is required to indicate to the program that the texture needs to be exposed as an image. To achieve this, a `ResourceProps` node with `as_image=True` must be set in the program `properties`, using the name of the texture as key. Note that a texture can be accessed as a sampler from one program but as an image from another.
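As an illustration, a minimal sketch of the recommended `Media` path using `ngl_texvideo()`, assuming a `var_uvcoord` IO variable forwarded from the vertex stage as described earlier (resource and file names are illustrative):

```python
media = ngl.Media("/path/to/video.mp4")
texture = ngl.Texture2D(data_src=media)

fragment_shader = """
void main()
{
    /* "tex0" matches the key used in update_frag_resources() below */
    ngl_out_color = ngl_texvideo(tex0, var_uvcoord);
}
"""

program = ngl.Program(vertex=..., fragment=fragment_shader)
draw = ngl.Draw(geometry, program)
draw.update_frag_resources(tex0=texture)
```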
Other variables related to textures are exposed to different stages of the shader:
| Availability | Type | Name | Description |
|---|---|---|---|
| vertex, fragment | `mat4` | `<key>_coord_matrix` | uv transformation matrix of the texture associated with the draw node using key `<key>` |
| vertex, fragment | `vec2` | `<key>_dimensions` | dimensions in pixels of the texture associated with the draw node using key `<key>` |
| same stage as the texture | `float` | `<key>_ts` | timestamp generated by the texture data source, 0.0f for images and buffers, frame timestamp for audios and videos |
## Blocks
When using blocks as pipeline resources, their fields can be accessed within
the shaders using <block>.<field-label> where <block> is the key in the
resource nodes dictionary and <field-label> is the label of the field node.
For example, given the following construct:
```python
block = ngl.Block()
block.add_fields([
    ngl.UniformFloat(3.0, label='x'),
    ngl.BufferVec4(256, label='data'),
])
draw = ngl.Draw(geometry, program)
draw.update_frag_resources(blk=block)
```
A block with the instance name `blk` will be declared in the fragment shader, so
the first and second fields can be accessed using `blk.x` and `blk.data`
respectively.
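For illustration, a hedged fragment shader sketch reading those fields:

```python
fragment_shader = """
void main()
{
    /* "blk" is the resource key, "x" and "data" are the field labels */
    ngl_out_color = blk.data[0] * blk.x;
}
"""
```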