Hi there, I've noticed that the GPU usage is not constantly at 100% on the slave machine when rendering at higher resolutions. It seems to be related to the amount of data being transmitted over the network. There is a pretty clear inverse correlation: when the GPU usage drops, the network traffic goes up, and vice versa. This can be easily tested with the standalone benchmark file at 5k pixels wide.
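In case it helps anyone reproduce this, here is a minimal sketch I used to watch GPU utilisation next to network throughput once a second. It assumes an NVIDIA card with nvidia-smi on the PATH and the psutil Python package installed; those are my assumptions, not anything from the benchmark itself:

```python
# Minimal sketch: log GPU utilisation alongside network throughput once per second.
# Assumes nvidia-smi is on the PATH and psutil is installed (pip install psutil).
import subprocess
import time

import psutil


def gpu_utilisation_percent():
    """Query the first GPU's utilisation via nvidia-smi (0-100)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.splitlines()[0].strip())


def main(interval_s=1.0):
    prev = psutil.net_io_counters()
    while True:
        time.sleep(interval_s)
        cur = psutil.net_io_counters()
        sent_mb = (cur.bytes_sent - prev.bytes_sent) / 1e6 / interval_s
        recv_mb = (cur.bytes_recv - prev.bytes_recv) / 1e6 / interval_s
        print(f"GPU {gpu_utilisation_percent():3d}%  "
              f"net out {sent_mb:7.1f} MB/s  net in {recv_mb:7.1f} MB/s")
        prev = cur


if __name__ == "__main__":
    main()
```

Running this on the slave while the 5k benchmark renders shows the GPU dips lining up with the network spikes.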
Is this because the network speed (1 Gbit) is not enough to cope with the demand?
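My rough back-of-envelope estimate (assuming a 5000 x 2812 frame and a 32-bit float RGBA framebuffer; both are guesses on my part, not figures from the benchmark) is that each full buffer transfer would tie up a gigabit link for a couple of seconds, which would match the dips:

```python
# Rough estimate: can gigabit keep up with a 5k framebuffer?
# Assumed values: 5000 x 2812 (16:9) frame, 32-bit float RGBA = 16 bytes per pixel.
width, height = 5000, 2812
bytes_per_pixel = 4 * 4                       # RGBA, float32 per channel (assumption)
frame_mb = width * height * bytes_per_pixel / 1e6

gigabit_mb_s = 125                            # 1 Gbit/s ~= 125 MB/s before overhead

print(f"Full frame buffer: ~{frame_mb:.0f} MB")
print(f"Time to move one buffer over gigabit: ~{frame_mb / gigabit_mb_s:.1f} s")
# ~225 MB per buffer -> roughly 1.8 s per full transfer on a saturated gigabit link.
```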