I have a set of OSL nodes I've been working on for this: they distort UVW space in 3D based on other textures. Would you be able to work with OSL nodes and test this?
These are OSL-based projection nodes for now, so you have to create an OSL projection and point it to the script.
Based on user feedback, the design is different from the one used for distorted mesh UVs.
Since these offer more control over the transformations, I split them into three separate nodes for rotation, scaling, and translation.
Each node supports three types of operation: uniform transform, map-based transform, and map-based jitter. Jitter randomizes within a range, with the jitter amount driven by a map (expected to be in [0..1] in this case).
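To make the jitter mode concrete, here's a minimal sketch of the idea in OSL for a single axis. All parameter names here are hypothetical, not the actual node's interface, and the projection input/output convention is assumed (a `point` passed through): the map value in [0..1] scales how far a stable pseudo-random offset can reach.

```osl
// Sketch only: map-driven jitter on one axis (hypothetical names).
shader JitterSketch(
    point Projection = point(u, v, 0), // assumed upstream projection input
    float JitterMap = 0.0,             // grayscale map, expected in [0..1]
    float Range = 1.0,                 // maximum jitter amplitude
    output point uvw = point(0, 0, 0)
)
{
    // Cell noise gives a stable per-region pseudo-random value in [0..1];
    // remap it to [-1..1] so jitter goes both ways.
    float r = 2.0 * noise("cell", Projection) - 1.0;
    uvw = Projection + point(r * JitterMap * Range, 0, 0);
}
```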
They also all have a projection input, so you can daisy-chain them.
Unlike distort UV, each axis gets its own map, and the maps are expected to be grayscale (typically either a single value or a noise). This way we don't have to build a color map with other nodes, or use channel split/merge just to translate on one axis, for example.
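The per-axis-map design can be sketched like this for the translation case; again, the parameter names and the projection pass-through convention are my assumptions, not the actual node code. Each axis reads its own scalar map, so a single noise plugged into one input moves the projection along just that axis.

```osl
// Sketch only: per-axis, map-driven translation (hypothetical names).
shader TranslateSketch(
    point Projection = point(u, v, 0), // assumed upstream projection input (daisy-chainable)
    float MapX = 0.0,                  // grayscale map driving X translation
    float MapY = 0.0,                  // grayscale map driving Y translation
    float MapZ = 0.0,                  // grayscale map driving Z translation
    float Strength = 1.0,              // overall translation scale
    output point uvw = point(0, 0, 0)
)
{
    // Offset the incoming projection by the three scalar maps.
    uvw = Projection + Strength * point(MapX, MapY, MapZ);
}
```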
For scale there are also controls that work on all 3 axes at once.
It would be great if you or anyone else could test this and see whether it behaves as you expect, or suggest ways to improve it.
Here are the three nodes:
Here is a test ORBX with various examples.
Example of simple distortion:
Example of the node interface: