I’m having trouble getting the alpha channel to output correctly when I use setContext(image) and shaders to render to a texture. This is a pity because it wastes a quarter of the available output bandwidth: I’m limited to RGB rather than RGBA. I’m guessing this is because premultiplied alpha is applied to images used with setContext(). Is there any way around this? It would be really helpful to be able to use the alpha channel in my images, as I’m trying to store two vec2s for each point in the texture as part of a multistage calculation. Any help that can be offered is much appreciated!
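
To make the use case concrete, here’s a simplified sketch of the kind of fragment shader I mean (the names `pos` and `vel` and the update step are just placeholders for my actual multistage calculation):

```glsl
// Fragment shader writing two vec2s per texel into an RGBA render target.
varying highp vec2 vTexCoord;
uniform sampler2D texture;  // output of the previous stage, via setContext()

void main()
{
    highp vec4 prev = texture2D(texture, vTexCoord);

    // If premultiplication is the culprit, prev.rgb has been scaled by
    // prev.a, so in theory dividing by alpha would undo it when alpha
    // is nonzero (at the cost of precision as alpha approaches zero):
    // prev.rgb /= max(prev.a, 0.0001);

    highp vec2 pos = prev.rg;
    highp vec2 vel = prev.ba;  // the .a component is what comes back wrong

    // ... update pos and vel here ...

    gl_FragColor = vec4(pos, vel);  // alpha carries data, not coverage
}
```

If dividing by alpha on read (and accepting the precision loss near zero alpha) is the only workaround, I can live with that, but some way to disable premultiplication on setContext() images would be much cleaner.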