Is there still a batch size limit in triggers?

You're confusing queries with DML operations. DML operations are always batched into chunks of at most 200 records (100 on very old API versions, for backwards compatibility reasons). The 200/500/2000 row limit applies to the size of a single query result returned without using queryMore.
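As a quick way to see this chunking in action (a minimal sketch; the trigger name and the 450-record figure are illustrative, not from the original post), a single insert of 450 records fires the trigger three times, with chunks of 200, 200, and 50:

trigger AccountChunkDebug on Account (before insert) {
    // Hypothetical trigger: logs how many records arrive in each chunk (max 200).
    System.debug('Chunk size: ' + Trigger.new.size());
}

and anonymous Apex to exercise it:

// One DML statement over 450 records; expect debug output of 200, 200, then 50.
List<Account> accounts = new List<Account>();
for (Integer i = 0; i < 450; i++) {
    accounts.add(new Account(Name = 'Chunk test ' + i));
}
insert accounts;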

See Triggers, where the documentation discusses this "batch size" with regard to older API versions:

In API version 20.0 and earlier, if a Bulk API request causes a trigger to fire, each chunk of 200 records for the trigger to process is split into chunks of 100 records. In Salesforce API version 21.0 and later, no further splits of API chunks occur. If a Bulk API request causes a trigger to fire multiple times for chunks of 200 records, governor limits are reset between these trigger invocations for the same HTTP request.

I can't seem to find the documentation that states that triggers execute in chunks of 200 records, but you'll notice that it appears in pretty much all the official (and unofficial) literature out there.


It is worth noting that for Platform Event subscriber triggers, at least as of Spring '19, the maximum trigger batch (chunk) size (Trigger.new) is 2,000.

An extensive thread on this topic with the Platform Events PM can be found on the Platform Events Success Community, courtesy of our own @DanielBallinger.

Just to clarify, I've got the following trigger to go with the anonymous Apex from the original question. I'm getting debugs for Trigger.new.size() up to 300.

trigger TestEventTrigger on TestEvent__e (after insert) {
    System.debug('Size: ' + Trigger.new.size());
}

So the optimistic tuning can allow the processor to run on more than 200 events in a single transaction?

I could see sizes greater than 200 catching people out.

... Is there an upper bound on the Trigger.new.size() that could occur? I'll need to test that Platform Event triggers are sufficiently bulkified.
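For reference, anonymous Apex along these lines (a hypothetical reconstruction; the original question's exact code isn't reproduced here) would publish enough events to produce chunks larger than 200:

// Hypothetical reconstruction of the publishing side, assuming TestEvent__e
// has no required custom fields.
List<TestEvent__e> events = new List<TestEvent__e>();
for (Integer i = 0; i < 300; i++) {
    events.add(new TestEvent__e());
}
// EventBus.publish queues the events for delivery; the subscriber trigger
// may then receive them in chunks larger than 200, e.g. all 300 at once.
List<Database.SaveResult> results = EventBus.publish(events);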

and the response from the PM:

Jay Hurst (Salesforce)

Up to 2000
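Since a single invocation can hand the subscriber up to 2,000 events, the trigger logic needs to handle the whole chunk at once. A minimal bulkified sketch (assuming a hypothetical RecordId__c text field on TestEvent__e that carries an Account Id; not part of the original thread):

trigger TestEventBulkSubscriber on TestEvent__e (after insert) {
    // Collect values across the entire chunk (up to 2,000 events)...
    Set<Id> accountIds = new Set<Id>();
    for (TestEvent__e evt : Trigger.new) {
        if (evt.RecordId__c != null) {
            accountIds.add((Id) evt.RecordId__c); // RecordId__c is a hypothetical field
        }
    }
    // ...so that SOQL and DML run once per invocation rather than once per event.
    List<Account> accounts = [SELECT Id FROM Account WHERE Id IN :accountIds];
    System.debug(accounts.size() + ' accounts matched for ' + Trigger.new.size() + ' events');
}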