I have a render slave with a pair of GTX 1080s in it on my local network. The machines are connected over gigabit Ethernet, and I've verified a transfer speed of about 120 megabytes a second between them. For several weeks now, when I try to use the slave to help with a render, it doesn't seem to send any data back to the master renderer.
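For reference, this is roughly how I sanity-checked the throughput number — a quick socket test sketch (shown here against loopback; in practice you'd point it at the slave's LAN IP, and the exact figure will vary with your network):

```python
import socket
import threading
import time

CHUNK = 1 << 20   # send 1 MiB per call
TOTAL = 200       # send 200 MiB in total

# Receive-and-discard server, standing in for the slave machine.
# Bound to loopback with an OS-assigned port for this self-contained sketch.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]

def sink():
    conn, _ = srv.accept()
    while conn.recv(CHUNK):   # drain everything the client sends
        pass
    conn.close()

threading.Thread(target=sink, daemon=True).start()

# Client side: time how long it takes to push TOTAL MiB through the socket.
cli = socket.socket()
cli.connect(("127.0.0.1", port))
payload = b"\x00" * CHUNK
start = time.perf_counter()
for _ in range(TOTAL):
    cli.sendall(payload)
cli.close()
elapsed = time.perf_counter() - start

rate = TOTAL / elapsed   # MiB per second
print(f"~{rate:.0f} MB/s")
```

Over the actual gigabit link this consistently reported around 120 MB/s, which is about what a saturated gigabit connection should give.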
Everything shows up fine on my master machine: the data gets sent to the slave, the status shows the slave is running, and the GPUs on the slave even ramp up to 100% utilization, but I get no benefit on the master. In fact, my renders are slightly slower with the slave enabled than with it turned off. Also, whenever I try to render an animation, the master renderer hangs between 80% and 99% completion on the first frame and never proceeds to the next frame.
I'm running 3.06.2 on both systems. I've tried turning off all of my firewalls and my antivirus to no avail.
Anyone know what's going on with the slave? I'll attach a screenshot showing it via remote desktop. As you can see in the screenshot, all 5 GPUs are being utilized, but the slave's command prompt isn't showing any data being processed.