Event Processing Limitation

All discussions & material related to Command's Lua interface

Moderators: angster, RoryAndersonCDT, michaelm75au, MOD_Command

jkgarner
Posts: 175
Joined: Thu Apr 30, 2020 12:42 pm

Event Processing Limitation

Post by jkgarner »

Here is a question that tickles my brain:

How many events can Command handle at a time before it starts to bog down?

OK, that is way too simplistic. A person might create a recurring event that attempts to read in the entire Oxford Dictionary every 5 seconds; that single event would clearly bog the system down. Or a person might create a simple one-off timed event that pushes a message to the user; the system could probably handle many of those simple message events without any impact on the game. Granted, the available hardware may also affect what the system can do, and I expect the size of the scenario, i.e., the number of units being modeled, would impact it as well, since memory on these machines is finite.

As I learn more about events, and I start creating ever more complex scenarios, my mind starts to wonder where the limit is.

So has anybody run up against the limit? If so, how large a scenario did you create? How many concurrent events were you trying to process, and what kind of hardware were you running on?
TitaniumTrout
Posts: 472
Joined: Mon Oct 20, 2014 9:06 am
Location: Michigan

RE: Event Processing Limitation

Post by TitaniumTrout »

I was working on a CENTFRONT scenario where NATO units would be impacted by a jammer depending on how close the unit was to the jammer. It had to query all the units in the air, determine the distance, and then use the inverse square law to determine if the unit was jammed. It did bog down the scenario eventually and I was working with a ton of units. I'm certain I could have optimized the way I was detecting the jamming. But even with a massive quantity of units and some hefty recurrent math it worked. Ultimately nocomms was a pain in a scenario this large and became an unfun adventure.

jkgarner
Posts: 175
Joined: Thu Apr 30, 2020 12:42 pm

RE: Event Processing Limitation

Post by jkgarner »

Sounds like a cool scenario, except that it bogged down.

Could you quantify what a 'ton of units' is? Hundreds? Thousands?
How many of them were moving and triggering the calculation?
How frequently did the calculating event trigger?
Was it whenever any one of the movers moved, or was it a poll every 5 seconds checking all movers?

BDukes
Posts: 2578
Joined: Wed Dec 27, 2017 12:59 pm

RE: Event Processing Limitation

Post by BDukes »

ORIGINAL: jkgarner

Sounds like a cool scenario, except that it bogged down.

Could you quantify what a 'ton of units' is? Hundreds? Thousands?
How many of them were moving and triggering the calculation?
How frequently did the calculating event trigger?
Was it whenever any one of the movers moved, or was it a poll every 5 seconds checking all movers?


I'm not sure Lua itself is really the culprit, although I'm sure silent crashes and things like that could be issues. I think those are more edge cases, though.

There has never been a hard-and-fast guide on what constitutes "too big", because people and their systems differ. In general, the community has developed some rules of thumb to help. Obviously, you also get a lot from running a beta process with your scenarios.

Off the top of my head:
There are some game options you can tweak to some extent. Go to the Game dropdown, select Game Options, and open the Game Speed tab. See the manual for details, although the in-game descriptions are pretty good.
Scope your scenario well. CENTCOM-scale is OK, but maybe focus on one area and populate that. You can restrict player movement with nav zones.
Unit count (and detections) matter, so when you don't need units, don't add them.
If a side doesn't need to detect anything, make it blind. Limit sides to what you need.
Consider adding things via Lua and cleaning them up via Lua to manage unit counts.
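That last tip can be sketched with the ScenEdit add/delete calls. This is a minimal illustration, not anything from the thread: the side name, unit names, and database ID below are all made up, so swap in values from your own scenario.

```lua
-- Hypothetical sketch: spawn a raid on demand and delete it when it is no
-- longer needed, so the units only count against performance while in play.
-- SIDE and DBID are assumptions for illustration.

SIDE = "Blue"   -- assumed side name
DBID = 3500     -- assumed aircraft database ID

function spawnRaid(count, lat, lon)
  local names = {}
  for i = 1, count do
    local u = ScenEdit_AddUnit({
      type = 'Aircraft', side = SIDE, name = 'Raid #' .. i,
      dbid = DBID, latitude = lat, longitude = lon, altitude = 10000,
    })
    names[#names + 1] = u.name
  end
  return names
end

function cleanupRaid(names)
  for _, n in ipairs(names) do
    ScenEdit_DeleteUnit({ side = SIDE, name = n })
  end
end
```

You might call spawnRaid from a triggered event action and cleanupRaid from a later timed event, keeping the peak unit count low for most of the scenario.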

Hope this helps. I'm sure others will add stuff.

Mike

Don't call it a comeback...
TitaniumTrout
Posts: 472
Joined: Mon Oct 20, 2014 9:06 am
Location: Michigan

RE: Event Processing Limitation

Post by TitaniumTrout »

ORIGINAL: jkgarner
Could you quantify what a 'ton of units' is? Hundreds? Thousands?
How many of them were moving and triggering the calculation?
How frequently did the calculating event trigger?
Was it whenever any one of the movers moved, or was it a poll every 5 seconds checking all movers?

The trigger cycled every 5 minutes and checked the range of all units to the jammers. I'd guess at just shy of a thousand units. This is where optimization would have helped, as I didn't need to check units that were very far away. Ultimately the out-of-comms situation wasn't terribly fun with this many units, so I dropped the project.
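The "skip units that are very far away" optimization mentioned above can be a cheap bounding-box pre-filter, so the expensive exact-range math only runs for plausible candidates. A minimal sketch in plain Lua, assuming the unit and jammer are tables exposing latitude/longitude fields (as Command unit wrappers do); the range value is hypothetical:

```lua
-- Cheap pre-filter: reject units outside a lat/lon box around the jammer
-- before doing exact great-circle range math. Approximation only: 1 degree
-- of latitude ~ 60 nm; longitude degrees shrink with cos(latitude).
function degBox(lat, lon, rangeNm)
  local dLat = rangeNm / 60
  local dLon = rangeNm / (60 * math.cos(math.rad(lat)))
  return lat - dLat, lat + dLat, lon - dLon, lon + dLon
end

function maybeInRange(unit, jammer, rangeNm)
  local latMin, latMax, lonMin, lonMax =
      degBox(jammer.latitude, jammer.longitude, rangeNm)
  return unit.latitude >= latMin and unit.latitude <= latMax
     and unit.longitude >= lonMin and unit.longitude <= lonMax
end
```

With ~1000 units, culling most of them with two comparisons each before doing any trigonometry is usually a big win for a recurring event like this.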

I did something similar with a GPS-jamming script for GPS-guided weapons and, as it was more narrowly scoped, never saw any degradation of speed even with a high level of SDB spam.

There is a ridiculously huge Desert Storm scenario with something like 20,000 units. I've not played it myself but I think one of the devs called it a "good CPU benchmark".
KnightHawk75
Posts: 1850
Joined: Thu Nov 15, 2018 7:24 pm

RE: Event Processing Limitation

Post by KnightHawk75 »

My 2 cents: it's less the number of events per se, and more what you are doing in each of them, and how whatever you are doing is implemented. Time your code to get a decent approximation of the expense.
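Timing an event action can be as simple as wrapping it with os.clock(). A small plain-Lua sketch (the label and sample function are just placeholders; os.clock resolution varies by platform):

```lua
-- Minimal timing wrapper: run a function, print elapsed milliseconds,
-- and pass its results through unchanged.
local unpack = table.unpack or unpack  -- Lua 5.1/5.2+ compatibility

function timeIt(label, fn, ...)
  local t0 = os.clock()
  local results = { fn(...) }
  local ms = (os.clock() - t0) * 1000
  print(string.format("%s took %.2f ms", label, ms))
  return unpack(results)
end
```

Wrapping the body of a recurring event action in timeIt for a few minutes of game time gives a decent feel for whether you are anywhere near the ~50 ms budget discussed below.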

The generic answer I'd give: keep any (ideally all combined) per-second events under ~50 ms on your end *if you can*; otherwise it can get noticeable, particularly if you have several that pile up on the same second boundary. Obviously it'll depend on the hardware involved; 50 ms for me might be 25 or 75 for someone else. Also, where you have very expensive operations/events, try not to put them on the same boundary as other events, so that you spread the load where you can. I.e., just because there are only 1/5/15/30-second recurring "triggers" doesn't mean you can't build your own cadence off of them, for, say, every 2 seconds or every 6 seconds.
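Building a custom cadence off the 1-second recurring trigger is just a counter plus a modulo check. A sketch under assumed names (PERIOD, PHASE, and onOneSecondTick are made up; the counter persists between runs as a global, as Command event-action state typically does):

```lua
-- Derive a 6-second cadence from a 1-second recurring trigger, offset by
-- PHASE so this event's expensive work doesn't land on the same second
-- boundary as other events running on multiples of 5, 15, 30, etc.
PERIOD = 6   -- do real work every 6 ticks
PHASE  = 2   -- offset relative to other events
tickCount = tickCount or 0

function onOneSecondTick(doWork)
  tickCount = tickCount + 1
  if tickCount % PERIOD == PHASE then
    doWork()  -- expensive work runs here, once per PERIOD seconds
  end
end
```

Giving each expensive event a different PHASE is the "spread the load" idea: two heavy events with PERIOD 6 but phases 2 and 4 never share a boundary.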

Generally, when I hit that ~50 ms threshold I start asking: "Do I really need to do X so often? How can I do less per cycle? Can I split this up further so it's not all on the same second boundary as a bunch of other expensive calls/events? Is there a way to do X on-demand via a trigger instead of every X seconds? Can I trade memory for performance here by storing more data instead of regenerating it? Am I printing anything I don't have to? (print calls are very expensive)." Sometimes you can find a way around it, sometimes you just can't.

For example, say I have something that doesn't need to happen every second, but does need to happen about as often as possible: processing a table of 5000 entries, doing a bunch of work on a 50-entry sub-table for each of the original 5000. Doing it all at once takes, say, 250 ms, and that's on top of whatever else runs that second. So I might consider splitting it up into a coroutine such that every second it processes only up to 500 entries and takes only ~25 ms per second. The downside is that any given entry is then only assured an update every ~10 seconds instead of every second. Sometimes that works for what is being done, sometimes it won't.
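The chunked-processing idea above can be sketched with coroutine.wrap: one resume per second handles only a batch of entries, then yields. The numbers mirror the example (5000-ish entries, 500 per batch); makeProcessor and the handler are made-up names:

```lua
-- A resumable sweep over a big table. Each call to the returned function
-- processes up to BATCH entries, then yields; repeated calls eventually
-- complete a full pass and start over.
BATCH = 500

function makeProcessor(entries, handle)
  return coroutine.wrap(function()
    while true do
      for i, entry in ipairs(entries) do
        handle(entry)                               -- per-entry work
        if i % BATCH == 0 then coroutine.yield() end -- budget spent; stop
      end
      coroutine.yield()  -- full pass complete; next resume restarts sweep
    end
  end)
end
```

You would create the processor once, stash it in a global, and call it from the 1-second recurring event; each entry then gets revisited roughly every #entries/BATCH seconds.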

Obviously, all the other advice about scenario design for performance (outside of events) applies as well.
