While Ren'Py is primarily used with the two-dimensional rectangular images common in visual novels, under the hood it has a model-based renderer intended to take advantage of features found in modern GPUs. This allows for a number of visual effects that would not otherwise be possible.
As a warning, this is one of the most advanced features available in Ren'Py. In many cases, it's not necessary to understand how model-based rendering works behind the scenes - features like matrixcolor and Live2D support can be used without understanding how model-based rendering works, and more such features will be added in the future. This documentation is intended for very advanced creators, and for developers looking to add to Ren'Py itself.
As of Ren'Py 7.4 (late 2020), Model-Based rendering needs to be enabled to be used. This is done by setting config.gl2 to True, using:
define config.gl2 = True
config.gl2 = False

If true, Ren'Py will default to using a model-based renderer.
As it's expected that model-based rendering will become the only renderer in the near future, the rest of this documentation is written as if model-based rendering is enabled all the time.
Model-Based Rendering is one of the most advanced features in Ren'Py, and this documentation may be hard to understand without first consulting the OpenGL, OpenGL ES, GLSL, and GLSL ES manuals. What's more, since portions of these models are passed directly to your GPU drivers, which may accept erroneous inputs, it's important to test on multiple kinds of hardware.
The fundamental thing that Ren'Py draws to the screen is a Model. A model consists of the following things:

A Mesh that describes the geometry that is drawn.
Zero or more Textures used to color that geometry.
A list of shader part names, used to select the shader program that draws the model.
Uniform values passed to that shader program.
GL properties that set the OpenGL state used while rendering.
As Ren'Py usually draws more than one thing to the screen, it creates a tree of Render objects. These Render objects may have Models or other Renders as children. (A Render can also be turned into a Model, as described below.) A Render contains:

A list of children, which may be Models or other Renders.
A Matrix that describes how the children are transformed in three-dimensional space.
Lists of shader part names, uniforms, and GL properties that are applied to the children.

Ren'Py draws the screen by performing a depth-first walk through the tree of Renders, until a Model is encountered. During this walk, Ren'Py updates a matrix transforming the location of the Model, a clipping polygon, and lists of shader parts, uniforms, and GL properties. When a Model is encountered as part of this walk, the appropriate shader program is activated on the GPU, all information is transferred, and a drawing operation occurs.
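As a sketch, the walk described above can be modeled in plain Python. The Model and Render classes here are simplified, hypothetical stand-ins for Ren'Py's internal objects, not its actual API:

```python
# Simplified, hypothetical stand-ins for Ren'Py's internal objects.
class Model:
    def __init__(self, name, shader_parts):
        self.name = name
        self.shader_parts = list(shader_parts)

class Render:
    def __init__(self, children, shader_parts=()):
        self.children = list(children)
        self.shader_parts = list(shader_parts)

def walk(node, inherited=()):
    """Depth-first walk: accumulate shader parts from each Render,
    and yield (model, accumulated parts) when a Model is reached."""
    if isinstance(node, Model):
        yield node, list(inherited) + node.shader_parts
        return
    parts = list(inherited) + node.shader_parts
    for child in node.children:
        yield from walk(child, parts)

# A root Render contributing "renpy.geometry", an intermediate Render
# adding "renpy.alpha", and a leaf Model with "renpy.texture".
tree = Render(
    [Render([Model("eileen", ["renpy.texture"])], ["renpy.alpha"])],
    ["renpy.geometry"],
)

for model, parts in walk(tree):
    print(model.name, parts)
```

In the real renderer the walk also accumulates a transform Matrix, a clipping polygon, uniforms, and GL properties, but the shader-part accumulation follows this same shape.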
Ren'Py creates Models automatically as part of its normal operation. The main reason to understand where models are created is that models correspond to drawing operations, and hence are the units that shaders are applied to.
A Solid() displayable creates a model consisting of a rectangle, with its color placed in the u_renpy_solid_color uniform.

The Dissolve(), ImageDissolve(), AlphaDissolve(), Pixellate(), AlphaMask(), and Flatten() displayables render their children to textures, and create a Model containing those textures.
A Transform() or ATL transform creates a model if its mesh property is true, or if blur is being used. In this case, the children of the Transform are rendered to textures, with the mesh of the first texture being used for the mesh associated with the model.
Not every transform creates a Model. Some transforms will simply add shaders and uniforms to a Render (such as transforms that use blur or alpha). Other transforms simply affect geometry.
A Render is turned into a Model when its mesh attribute is True. In this case, the children of the Render are rendered to textures, with the mesh of the first texture being used for the mesh associated with the model.

It's expected that Ren'Py will add more ways of creating models in the future.
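As a sketch, the distinction between transforms that create Models and those that don't can be illustrated with two hypothetical transforms (the names are illustrative):

```renpy
# mesh True forces this transform's child to be rendered to a
# texture, creating a Model.
transform as_model:
    mesh True

# alpha only adds a shader part and a uniform to the child's
# Render; no Model is created by this transform itself.
transform translucent:
    alpha 0.5
```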
Ren'Py generates a shader program by first assembling a list of shader part names. This list consists of "renpy.geometry", the list of shader parts taken from Renders, and the list of shader parts found in the Model being drawn.
The shader parts are then deduplicated. If a shader part begins with "-", it is removed from the list, as is the rest of that part without the leading "-". (So "-renpy.geometry" will cause itself and "renpy.geometry" to be removed.)
Ren'Py then takes the list of shader parts, and retrieves lists of variables, functions, vertex shader parts, and fragment shader parts. These are, in turn, used to generate the source code for shaders, with the parts of the vertex and fragment shaders being included in low-number to high-number priority order.
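The deduplication and priority-ordering steps above can be sketched in plain Python. The part registry and GLSL snippets here are hypothetical, not Ren'Py's internal data:

```python
# Hypothetical registry mapping part names to {priority: fragment text};
# Ren'Py's real registry also holds variables and vertex parts.
FRAGMENT_PARTS = {
    "renpy.texture": {
        200: "gl_FragColor = texture2D(tex0, v_tex_coord.xy, u_lod_bias);",
    },
    "example.gradient": {
        300: "gl_FragColor *= mix(u_gradient_left, u_gradient_right, v_gradient_done);",
    },
}

def deduplicate(parts):
    """Drop duplicates, and remove any part cancelled by a "-" entry."""
    cancelled = {p[1:] for p in parts if p.startswith("-")}
    result = []
    for p in parts:
        if p.startswith("-") or p in cancelled or p in result:
            continue
        result.append(p)
    return result

def assemble_fragment(parts):
    """Concatenate fragment snippets in low-to-high priority order."""
    snippets = []
    for part in deduplicate(parts):
        for priority, text in FRAGMENT_PARTS.get(part, {}).items():
            snippets.append((priority, text))
    snippets.sort(key=lambda pair: pair[0])
    return "\n".join(text for _, text in snippets)

parts = ["renpy.geometry", "-renpy.geometry",
         "renpy.texture", "renpy.texture", "example.gradient"]
print(deduplicate(parts))
print(assemble_fragment(parts))
```

Note that deduplicate preserves the first occurrence of each surviving part, while assembly order is determined solely by the numeric priorities.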
Ren'Py keeps a cache of all combinations of shader parts that have ever been used in game/cache/shaders.txt, and loads them at startup. If major changes in shader use occur, this file should be edited or deleted so it can be re-created with valid data.
New shader parts can be created by calling the renpy.register_shader function and supplying portions of GLSL shaders.
Generally, shader parts should be of the form "namespace.part", such as "mygame.recolor" or "mylibrary.warp". Names beginning with "renpy." or "live2d." are reserved for Ren'Py, as are names beginning with _.
renpy.register_shader(name, **kwargs)

This registers a shader part. This takes name, and then keyword arguments.
The variables used by the shader part. These should be listed one per line, a storage (uniform, attribute, or varying) followed by a type, name, and semicolon. For example:
variables='''
uniform sampler2D tex0;
attribute vec2 a_tex_coord;
varying vec2 v_tex_coord;
'''
Other keyword arguments should start with vertex_ or fragment_, and end with an integer priority. So "fragment_200" or "vertex_300". These give text that's placed in the appropriate shader at the given priority, with lower priority numbers inserted before higher priority numbers.
Ren'Py supports only the following variable types: float, vec2, vec3, vec4, sampler2D, and mat4 (which may be supplied as a Matrix).

Uniform variables should begin with u_, attributes with a_, and varying variables with v_. Names starting with u_renpy_, a_renpy_, and v_renpy_ are reserved, as are the standard variables given below.
As a general sketch for priority levels, priority 100 sets up geometry, priority 200 determines the initial fragment color (gl_FragColor), and higher-numbered priorities can apply effects to alter that color.
Here's an example of a custom shader part that applies a gradient across each model it is used to render:
init python:

    renpy.register_shader("example.gradient", variables="""
        uniform vec4 u_gradient_left;
        uniform vec4 u_gradient_right;
        uniform vec2 u_model_size;
        varying float v_gradient_done;
    """, vertex_300="""
        v_gradient_done = a_position.x / u_model_size.x;
    """, fragment_300="""
        gl_FragColor *= mix(u_gradient_left, u_gradient_right, v_gradient_done);
    """)
The custom shader can then be applied using a transform:
transform gradient:
    shader "example.gradient"
    u_gradient_left (1.0, 0.0, 0.0, 1.0)
    u_gradient_right (0.0, 0.0, 1.0, 1.0)
show eileen happy at gradient
Model-Based rendering adds the following properties to ATL and Transform():

mesh
    Type: None or True or tuple
    Default: None

    If not None, this Transform will be rendered as a model.
shader
    Type: None or str or list of str
    Default: None

    If not None, a shader part name or list of shader part names that will be applied to this Render (if a Model is created) or the Models reached through this Render.
In addition, uniforms whose names start with u_ but not with u_renpy_ are made available as Transform properties. GL properties are made available as Transform properties starting with gl_. For example, the color_mask property is made available as gl_color_mask.
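As a sketch, here is a hypothetical transform that uses a GL property through its gl_-prefixed name (the transform name is illustrative; mesh True ensures a Model is created, so the property applies to this transform's drawing operation):

```renpy
transform no_blue:
    mesh True
    gl_color_mask (True, True, False, True)
```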
The following uniforms are made available to all Models.
vec2 u_model_size
    The width and height of the model.
float u_lod_bias
    The level-of-detail bias applied to texture lookups.
mat4 u_transform
    The transform used to map vertex positions to the OpenGL viewport.
float u_time
    The time of the frame, in seconds. The baseline of this time is undefined, so it is only useful for periodic effects.
vec4 u_random
    Four random numbers between 0.0 and 1.0.
sampler2D tex0, sampler2D tex1, sampler2D tex2
    The textures associated with the model, if present.
vec2 res0, vec2 res1, vec2 res2
    The sizes, in pixels, of the corresponding textures.
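As a sketch, the u_time uniform can drive a periodic effect. This example.pulse shader part is hypothetical, not one of Ren'Py's built-in parts:

```renpy
init python:

    renpy.register_shader("example.pulse", variables="""
        uniform float u_time;
    """, fragment_300="""
        // Modulate the final color with a sine wave over time.
        gl_FragColor *= 0.75 + 0.25 * sin(6.2832 * u_time);
    """)

transform pulsing:
    shader "example.pulse"
```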
The following attributes are available to all models:
vec4 a_position
If textures are available, so is the following attribute:
vec2 a_tex_coord
GL properties change the global state of OpenGL, or the Model-Based renderer.

color_mask
    Takes a 4-tuple of booleans controlling which of the red, green, blue, and alpha channels of the framebuffer are written to (as with glColorMask).
Ren'Py uses the following default shader parts.

renpy.geometry

Variables:
uniform mat4 u_transform;
attribute vec4 a_position;
Vertex shader:
gl_Position = u_transform * a_position;
renpy.blur

Variables:
uniform sampler2D tex0;
attribute vec2 a_tex_coord;
varying vec2 v_tex_coord;
uniform float u_renpy_blur_log2;
Vertex shader:
v_tex_coord = a_tex_coord;
Fragment shader:
gl_FragColor = vec4(0.);
float renpy_blur_norm = 0.;
for (float i = 0.; i < u_renpy_blur_log2 + 5.; i += 1.) {
float renpy_blur_weight = exp(-0.5 * pow(u_renpy_blur_log2 - i, 2.));
gl_FragColor += renpy_blur_weight * texture2D(tex0, v_tex_coord.xy, i);
renpy_blur_norm += renpy_blur_weight;
}
gl_FragColor /= renpy_blur_norm;
renpy.dissolve

Variables:
uniform float u_lod_bias;
uniform sampler2D tex0;
uniform sampler2D tex1;
uniform float u_renpy_dissolve;
attribute vec2 a_tex_coord;
varying vec2 v_tex_coord;
Vertex shader:
v_tex_coord = a_tex_coord;
Fragment shader:
vec4 color0 = texture2D(tex0, v_tex_coord.st, u_lod_bias);
vec4 color1 = texture2D(tex1, v_tex_coord.st, u_lod_bias);
gl_FragColor = mix(color0, color1, u_renpy_dissolve);
renpy.imagedissolve

Variables:
uniform float u_lod_bias;
uniform sampler2D tex0;
uniform sampler2D tex1;
uniform sampler2D tex2;
uniform float u_renpy_dissolve_offset;
uniform float u_renpy_dissolve_multiplier;
attribute vec2 a_tex_coord;
varying vec2 v_tex_coord;
Vertex shader:
v_tex_coord = a_tex_coord;
Fragment shader:
vec4 color0 = texture2D(tex0, v_tex_coord.st, u_lod_bias);
vec4 color1 = texture2D(tex1, v_tex_coord.st, u_lod_bias);
vec4 color2 = texture2D(tex2, v_tex_coord.st, u_lod_bias);
float a = clamp((color0.a + u_renpy_dissolve_offset) * u_renpy_dissolve_multiplier, 0.0, 1.0);
gl_FragColor = mix(color1, color2, a);
renpy.solid

Variables:
uniform vec4 u_renpy_solid_color;
Fragment shader:
gl_FragColor = u_renpy_solid_color;
renpy.texture

Variables:
uniform float u_lod_bias;
uniform sampler2D tex0;
attribute vec2 a_tex_coord;
varying vec2 v_tex_coord;
Vertex shader:
v_tex_coord = a_tex_coord;
Fragment shader:
gl_FragColor = texture2D(tex0, v_tex_coord.xy, u_lod_bias);
renpy.matrixcolor

Variables:
uniform mat4 u_renpy_matrixcolor;
Fragment shader:
gl_FragColor = u_renpy_matrixcolor * gl_FragColor;
renpy.alpha

Variables:
uniform float u_renpy_alpha;
uniform float u_renpy_over;
Fragment shader:
gl_FragColor = gl_FragColor * vec4(u_renpy_alpha, u_renpy_alpha, u_renpy_alpha, u_renpy_alpha * u_renpy_over);