Why is playing audio done differently from playing particles?

This is less of a problem and more of me trying to understand something. In “How To Trigger Particles”, the challenge asks us to make code that plays particle effects.

My solution was, in CollisionHandler.cs, to add the following lines:

In the Class:

[SerializeField] ParticleSystem crashParticles;
[SerializeField] ParticleSystem successParticles;
ParticleSystem particlesystem;

In Start():

particlesystem = GetComponent<ParticleSystem>();

In StartCrashSequence:

particlesystem.Play(crashParticles);

In StartSuccessSequence:

particlesystem.Play(successParticles);

The actual answer, though, was to use

successParticles.Play();
crashParticles.Play();
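For reference, here is a sketch of how those calls fit into the handler. This is only an illustration, not the course's exact code: the field names follow the snippets above, and the rest of each method (scene reloading, disabling controls, etc.) is omitted.

```csharp
using UnityEngine;

public class CollisionHandler : MonoBehaviour
{
    // Each ParticleSystem is a self-contained effect, assigned in the Inspector.
    [SerializeField] ParticleSystem crashParticles;
    [SerializeField] ParticleSystem successParticles;

    void StartCrashSequence()
    {
        // Tell the effect to play itself; no separate "player" component is needed.
        crashParticles.Play();
        // ...rest of the crash sequence omitted...
    }

    void StartSuccessSequence()
    {
        successParticles.Play();
        // ...rest of the success sequence omitted...
    }
}
```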

The following lines were completely unnecessary:

ParticleSystem particlesystem;
particlesystem = GetComponent<ParticleSystem>();

This feels weird to me because it doesn’t work the way the audio code does. Why does it work this way instead?

It’s because particles and audio are two very different things.

The audio system is just a player that plays audio clips. The clips are just the data to be played, and they can be played outside of Unity as well, because they’re just MP3, WAV, or OGG files.
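For contrast, the audio side really does follow that player-plus-data pattern: an AudioSource component on the GameObject is the player, and it plays whichever AudioClip you hand it. A minimal sketch (the field and method names here are illustrative, not taken from the course code):

```csharp
using UnityEngine;

public class AudioExample : MonoBehaviour
{
    // The clips are pure data, assigned in the Inspector...
    [SerializeField] AudioClip crashSound;
    [SerializeField] AudioClip successSound;

    // ...and the AudioSource is the player that renders them.
    AudioSource audioSource;

    void Start()
    {
        audioSource = GetComponent<AudioSource>();
    }

    void PlayCrash()
    {
        // One player, many possible clips.
        audioSource.PlayOneShot(crashSound);
    }

    void PlaySuccess()
    {
        audioSource.PlayOneShot(successSound);
    }
}
```

That is why the audio code needs a `GetComponent` call and the particle code doesn’t: with audio you fetch the player and pass it data, while each particle system is its own player.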

The particle system is the whole system. Each particle system contains all its own goodies and runs by itself, not through some other player. It cannot run outside of Unity (unless someone creates, or has created, a player that can run it).

It’s like a CD versus my refrigerator: a CD can play on any CD player, but there’s no such thing as a “refrigerator player”, so the refrigerator has to carry everything it needs to run inside itself.

Thanks. There was a lot of stuff in the video that was comparing it to the audio system, so I was thinking of it as being like the audio system.

That’s fine. We know that Unity is complex and a bit confusing. Don’t hesitate to ask if you have any questions. 🙂

