This would definitely require some research. There are currently several ideas that can be investigated, but it will depend a lot on your technology stack and target application (game, real-time, offline rendering, etc.).
In Unreal Engine (or any game engine) you could render specific particles with a separate Scene Capture 2D (hidden somewhere in the scene) that outputs to a render target, then sample that render target in the shader of your mesh.
In the PopcornFX "tool box" you might want to look at the Projection evolver to stick particles to the mesh surface.
But mostly look at the Script functions available, for example:
- SamplerName.projectPCoords(float3 position) returns the pCoords of the closest point on the surface to `position`
- SamplerName.sampleTexcoord(int3 pCoords) resolves the UV from a pCoords
! The "projections" feature does not work with dynamic meshes: once you have set your mesh in the .pkfx, the projection functions will keep using that one, even with attribute samplers !
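Combining the two script functions above, a particle script could look roughly like this (a rough sketch only; `Mesh` is a hypothetical sampler name, and the exact syntax may differ between PopcornFX versions):

```
// Project the particle's position onto the closest point of the mesh surface...
int3   pCoords = Mesh.projectPCoords(Position);
// ...then resolve the mesh UVs at that surface point.
float2 uv      = Mesh.sampleTexcoord(pCoords);
```

You could then use `uv` to sample a texture of the mesh, or pass it down to the particle's material.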
You can also combine these approaches using Spatial Layers.
And/or even play with the PopcornFX UE4 plugin's Event Listeners, which can be used to send particle events and data directly to UE Blueprints.
In the PopcornFX Editor you can also import Alembic scenes, and export frames (yet to be documented; the record button at the top of the viewport) to then be used in other software, such as compositors, for example.