
Hi! 
I have several 2d audio backdrops in my effect, and I have assigned a unique SoundChannelGroup property to each one:

I want to analyse the spectrum of each audio backdrop separately and sample them in scripts for different layers. This is how I set it up:

 

In the end, it seems that the ChannelGroup parameter is completely ignored by the CParticleSamplerSpectrum node, since the visible result depends on all the currently enabled audio backdrops (the Master channel group), regardless of the explicit channel group selection (see the picture above).

Is that a bug or am I doing something wrong?

asked by leavittx (220 points)

1 Answer

 
Best answer

Hi,

Indeed, there is no way in the PopcornFX Editor to specify channels other than the built-in ones; please find the list below:

  • Master
    • Music
    • UI
    • World_Master (Will fallback to Master)
      • World_Ambient
      • World_Collision
    • FX_Master
      • FX_Effect
      • FX_Collision
  • Speaker
So that hierarchy is in fact respected: if you add a 2D audio backdrop that plays inside the FX_Effect channel group, you'll be able to recover it by sampling FX_Master.
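The containment rule above can be sketched as a small lookup over the built-in hierarchy. This is purely illustrative (the group names come from the list above, but the code is not the editor's actual implementation):

```python
# Hypothetical sketch of the built-in channel-group hierarchy.
# Sampling a group also picks up audio playing in any of its descendants.
HIERARCHY = {
    "Master": ["Music", "UI", "World_Master", "FX_Master"],
    "World_Master": ["World_Ambient", "World_Collision"],
    "FX_Master": ["FX_Effect", "FX_Collision"],
    "Speaker": [],
}

def descendants(group):
    """All groups whose audio a sampler on `group` would also see."""
    result = [group]
    for child in HIERARCHY.get(group, []):
        result.extend(descendants(child))
    return result
```

For example, `descendants("FX_Master")` includes FX_Effect, which is why a backdrop playing in FX_Effect is recovered by sampling FX_Master.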
 
However, please note that this behavior is PopcornFX-Editor specific, and this logic is defined on a per-engine basis. The PopcornFX Editor has a direct dependency on FMOD, so each sound played within a specific channel group really plays inside an FMOD channel; that is not necessarily the case with other engines such as UE4 or Unity.
 
If you specify a channel group name that the editor doesn't support, it will fall back to Master. That is why you see no difference when changing to different channel names: Master basically encapsulates every other channel group.
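The fallback behavior described above amounts to a simple name-resolution step. A minimal sketch, assuming only the built-in names from the list earlier (this is illustrative, not the editor's actual code):

```python
# Hypothetical sketch of the editor's channel-group name resolution.
BUILTIN_GROUPS = {
    "Master", "Music", "UI", "World_Master", "World_Ambient",
    "World_Collision", "FX_Master", "FX_Effect", "FX_Collision", "Speaker",
}

def resolve_channel_group(name):
    # World_Master itself falls back to Master in the editor.
    if name == "World_Master":
        return "Master"
    if name in BUILTIN_GROUPS:
        return name
    # Unknown (custom) names fall back to Master, which encapsulates
    # every other group -- hence no visible difference between them.
    return "Master"
```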
 
However, depending on the target engine (Unity/UE4/Custom), you can leave those custom channels active and they will be handled correctly in those engines.
 
Depending on the target engine, you should either consider these channels to be "flat", or ask the engine programmers if it is a custom engine. Concerning the Unity and UE4 plugins, we leave the actual implementation of the audio data feed to the users, so that's up to you.
 
It is a good point, and we lack documentation about this; we might add the possibility to define new custom channel group names in the project settings in future versions of the editor.
 
Thank you for reporting this,
Sorry for the inconvenience.
answered by HugoPKFX (15.8k points)
Hi Hugo!
Your answer was really helpful: I was able to use these 6 audio channels (Music, UI, World_Ambient, World_Collision, FX_Effect, FX_Collision) with unique spectrum analysers attached to them for in-editor effect development, which was almost enough for my purposes (I wanted 7, so I need to switch files for one channel)!

I don't care about the engines that much at the moment, since I control the effect using parameters in the engine.

However, I would totally love to be able to send parameter values to the editor using OSC messages from an external application/device (TouchOSC is a good example of such an application). Should I add a feature request for that somewhere? Implementing such a feature should only take a couple of hours, really :-)

Having the ability to create custom channel groups in the editor would also be a very nice feature, especially when prototyping audio-visual interactions.

Thanks
Hi leavittx,

Glad to hear that you managed to sort things out on your side; this specific part is indeed something we need to improve regarding documentation and tutorials.

Regarding the engine part, what I meant is that the actual implementation of the audio sampling is handled in the target engine, not inside the PopcornFX SDK.

What might be confusing here is that within the PopcornFX Editor it seems the PopcornFX SDK/Runtime is the one in charge of providing the actual audio data, but that is not the case: the editor just provides a way to play audio data as backdrops, to preview what it could look like inside the engine, and the PopcornFX SDK "fetches" that audio data inside the audio samplers.

What happens is that the PopcornFX SDK is the one in charge of sampling the audio data, and that data is provided by the target engine. The encapsulation of audio channels is specific to the PopcornFX Editor.
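That division of responsibilities can be sketched roughly as follows. None of these names are the real PopcornFX API; the sketch only shows who owns which side (the engine produces the audio data, the runtime samples it):

```python
# Hypothetical sketch: the engine owns the audio data, the runtime samples it.

class Engine:
    """Target-engine side (Unity/UE4/custom): produces spectrum data
    per channel group, e.g. via FMOD or Unity's GetSpectrumData."""
    def get_spectrum(self, channel_group, bin_count):
        # Placeholder: a real engine would return live FFT bins here.
        return [0.0] * bin_count

class SpectrumSampler:
    """Runtime side: fetches whatever spectrum the engine provides."""
    def __init__(self, engine, channel_group):
        self.engine = engine
        self.channel_group = channel_group

    def sample(self, cursor, bin_count=256):
        # cursor in [0, 1] maps to a frequency bin, as in a spectrum sampler.
        spectrum = self.engine.get_spectrum(self.channel_group, bin_count)
        return spectrum[int(cursor * (bin_count - 1))]
```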

When you reach the point of integrating your FX inside Unity or another engine, feel free to create another question in the answerhub.

Now, for the OSC part (which I didn't know about): after doing some research it definitely looks like a good idea, but maybe not directly inside the PopcornFX Editor/SDK. This should be handled inside the target engine.

The best approach seems to be finding an OSC plugin for the target engine (for example UniOSC in Unity) and, depending on how that plugin works, hooking callbacks inside Unity. You could then directly set PopcornFX attributes inside those callbacks, giving you as much control as you wish. Does that make sense?
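As an engine-agnostic sketch of that wiring (a Unity plugin would actually be C#; the `set_fx_attribute` helper and the `/fx/...` address scheme here are entirely hypothetical stand-ins):

```python
# Hypothetical sketch: route incoming OSC messages to FX attributes.
# The OSC address names the attribute; the payload carries a float value.

ATTRIBUTES = {}  # stand-in for the engine's per-effect attribute store

def set_fx_attribute(name, value):
    """Stand-in for the real plugin call that sets a PopcornFX attribute."""
    ATTRIBUTES[name] = value

def on_osc_message(address, *args):
    # e.g. address "/fx/Intensity", args (0.75,)
    if address.startswith("/fx/") and args:
        set_fx_attribute(address[len("/fx/"):], float(args[0]))

# An OSC plugin (e.g. UniOSC in Unity) would invoke this callback
# once per received message:
on_osc_message("/fx/Intensity", 0.75)
```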

I think that would be the best way to do this, instead of waiting for PopcornFX to handle such a feature; it probably wouldn't make much sense for us to handle it.

Thank you!
...