I talked to crazy_stewie more about how to work around the "ViewportTexture does not have mipmaps" issue with a shader. My previous solution (20240621054605), which emulates mipmaps, is quite slow because it needs two passes to determine the colors to interpolate between based on the LOD.
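For context, "the LOD" is the mipmap level the GPU would normally pick for a pixel, and it is itself derived from how quickly the UVs change across the screen. A rough sketch of the textbook formula (a hypothetical helper for illustration, not my earlier shader):

float approximate_lod(sampler2D tex, vec2 uv) {
    vec2 tex_size = vec2(textureSize(tex, 0));
    // Texels covered by this pixel along each screen axis.
    vec2 dx = dFdx(uv) * tex_size;
    vec2 dy = dFdy(uv) * tex_size;
    // The mip level is roughly the log2 of the larger coverage.
    return log2(max(length(dx), length(dy)));
}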
We can do this in a single pass instead. Rather than relying on the LOD, we can determine how many texels we want to sample based on how quickly the UV value changes across the surface. I tried this before (20240621214324), but it probably failed because the shader I wrote took far too many samples for a single pixel, which made the driver think the GPU had hung. This can be addressed by clamping the number of samples we make.
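In other words, the per-axis sample counts come straight from the screen-space UV derivatives, and the clamp bounds the worst case at 32 x 32 = 1024 texture reads per pixel. A rough sketch of just that step (a hypothetical helper, not part of the shader below, which does the same thing in context):

vec2 clamped_sample_counts(sampler2D tex, vec2 uv) {
    vec2 tex_size = vec2(textureSize(tex, 0));
    // How many texels the pixel covers along each axis of its projection.
    vec2 extents = vec2(length(dFdx(uv)*tex_size), length(dFdy(uv)*tex_size));
    // Without the clamp, a heavily minified surface can ask for thousands of
    // samples per pixel, which is probably what tripped the hang detection
    // in my earlier attempt.
    return clamp(round(extents), vec2(1, 1), vec2(32, 32));
}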
crazy_stewie shared a shader with me that takes this approach successfully. I've added comments to it below to note my understanding of how it works. We previously settled on the convention that a pixel is a pixel on the screen, a texel is a pixel on a texture, and the two terms exist only to make it clear which kind of pixel we are talking about, so I'll be using that definition here.
shader_type spatial;

uniform sampler2D albedo_texture: filter_linear;

vec3 sample(sampler2D sampler, vec2 uv) {
    // Project the pixel onto the texture to find out how many texels we want to
    // sample. We're approximating the shape of the projection here as a
    // parallelogram; in perspective this projection would actually result in a
    // different shape, but this is close enough.
    vec2 uvdx = dFdx(uv);
    vec2 uvdy = dFdy(uv);
    vec2 tex_size = vec2(textureSize(sampler, 0));
    vec2 extents = vec2(length(uvdx*tex_size), length(uvdy*tex_size));
    // In each direction along the parallelogram, we want to sample at minimum
    // once and at most 32 times.
    extents = clamp(round(extents), vec2(1, 1), vec2(32, 32));
    uvdx /= extents.x;
    uvdy /= extents.y;
    // We want to start sampling from the top-left of the parallelogram, but
    // we need to offset this by half of the distance so that the sample points
    // are centered on the original UV coordinate. In other words, we want to
    // sample the fences and not the fence posts.
    uv -= (extents.x*0.5 - 0.5)*uvdx + (extents.y*0.5 - 0.5)*uvdy;
    vec3 result = vec3(0);
    for (float i = 0.0; i < extents.x; i++) {
        for (float j = 0.0; j < extents.y; j++) {
            result += texture(sampler, uv + i*uvdx + j*uvdy).rgb;
        }
    }
    return result / (extents.x*extents.y);
}

void fragment() {
    ALBEDO = sample(albedo_texture, UV);
}
This results in a less blurry image, which is a nice improvement in quality.