Most LUTs in RV are interpolated in hardware for performance; the shaders generally use either the default trilinear sampling provided by GL, or a GLSL shader that performs trilinear interpolation in 32-bit float.
Generally this provides a close approximation of the various software interpolation methods, but there can be differences, and some are more visible than others. If you see artifacts, our first suggestion is to increase the size of the LUT (most modern cards can handle 64-cubed LUTs, and 128-cubed is OK for some). In some cases, though, artifacts remain visible even with large LUTs.
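As a rough sketch of what the trilinear path computes, here is a CPU-side NumPy illustration (not RV's actual shader code; the `trilinear_sample` name and the N x N x N x 3 LUT layout are assumptions for the example): the eight lattice points surrounding the sample are blended with weights taken from the fractional position.

```python
import numpy as np

def trilinear_sample(lut, rgb):
    """Trilinear sample of a cubic LUT (shape N x N x N x 3), rgb in [0, 1].

    Approximates what GL's linear filtering does: blend the 8 surrounding
    lattice points with weights from the fractional position.
    """
    n = lut.shape[0]
    # Scale to lattice coordinates spanning the N grid samples.
    p = np.clip(np.asarray(rgb, dtype=np.float64), 0.0, 1.0) * (n - 1)
    i = np.minimum(p.astype(int), n - 2)   # lower-corner index of the cell
    f = p - i                              # fractional position in the cell
    fr, fg, fb = f
    ir, ig, ib = i
    out = np.zeros(3)
    # Accumulate the 8 corner contributions.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((fr if dr else 1 - fr) *
                     (fg if dg else 1 - fg) *
                     (fb if db else 1 - fb))
                out += w * lut[ir + dr, ig + dg, ib + db]
    return out
```

For an identity LUT this reproduces the input exactly; differences from software interpolators show up only with non-linear LUT content, and shrink as the LUT grows.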
Using tetrahedral interpolation rather than trilinear would certainly improve fidelity, though it would also carry a performance hit.
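For comparison, a sketch of the standard tetrahedral scheme (again a CPU-side NumPy illustration, not a proposed RV implementation; `tetrahedral_sample` is a hypothetical name): the unit cube around the sample is split into six tetrahedra by the ordering of the fractional coordinates, and only the four vertices of the enclosing tetrahedron are blended, rather than all eight cube corners.

```python
import numpy as np

def tetrahedral_sample(lut, rgb):
    """Tetrahedral sample of a cubic LUT (shape N x N x N x 3), rgb in [0, 1].

    Picks one of 6 tetrahedra by sorting the fractional coordinates, then
    blends its 4 vertices (vs. 8 corners for trilinear).
    """
    n = lut.shape[0]
    p = np.clip(np.asarray(rgb, dtype=np.float64), 0.0, 1.0) * (n - 1)
    i = np.minimum(p.astype(int), n - 2)   # lower-corner index of the cell
    fr, fg, fb = p - i                     # fractional position in the cell
    c = lambda dr, dg, db: lut[i[0] + dr, i[1] + dg, i[2] + db]
    if fr > fg:
        if fg > fb:    # r >= g >= b
            return (1 - fr) * c(0, 0, 0) + (fr - fg) * c(1, 0, 0) \
                 + (fg - fb) * c(1, 1, 0) + fb * c(1, 1, 1)
        elif fr > fb:  # r >= b >= g
            return (1 - fr) * c(0, 0, 0) + (fr - fb) * c(1, 0, 0) \
                 + (fb - fg) * c(1, 0, 1) + fg * c(1, 1, 1)
        else:          # b >= r >= g
            return (1 - fb) * c(0, 0, 0) + (fb - fr) * c(0, 0, 1) \
                 + (fr - fg) * c(1, 0, 1) + fg * c(1, 1, 1)
    else:
        if fb > fg:    # b >= g >= r
            return (1 - fb) * c(0, 0, 0) + (fb - fg) * c(0, 0, 1) \
                 + (fg - fr) * c(0, 1, 1) + fr * c(1, 1, 1)
        elif fb > fr:  # g >= b >= r
            return (1 - fg) * c(0, 0, 0) + (fg - fb) * c(0, 1, 0) \
                 + (fb - fr) * c(0, 1, 1) + fr * c(1, 1, 1)
        else:          # g >= r >= b
            return (1 - fg) * c(0, 0, 0) + (fg - fr) * c(0, 1, 0) \
                 + (fr - fb) * c(1, 1, 0) + fb * c(1, 1, 1)
```

The branching on the sort order is what makes this costlier in a shader than trilinear, but it follows the gray diagonal of each cell, which is why it tends to produce smoother results on near-neutral colors.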
Should we add at least an option for tetrahedral interpolation for 3D LUTs? Vote below!