Ember.computed.filter with property invalidation in ember > 2.0?

I’ve been using this pattern for filtering arrays in Ember pre-2.0:

eventsFiltered: Ember.computed.filter('events', function(event /*, index, array */) {
   // ... logic (return true to keep the event)
}).property('events', 'filterShowA', 'filterShowB'),

This has worked well: initially all the ‘events’ are filtered, but if an event is added the filter callback runs only once, for the new item. If either of the ‘filterShow…’ properties changes, the entire array is re-filtered.

But in Ember 2.0 the extra dependent keys on the filter no longer work. I can force invalidation of the entire array by replacing .property('events', ...) with .property('events.[]', ...), but that causes the entire array to be re-filtered every time a new item is added. If I remove the .property() call entirely, the incremental behavior is correct again, but then changes to the filter properties no longer invalidate the result.
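To make that concrete, here are the two 2.x variants I tried (filter body elided as above; the comments describe the behavior I’m seeing):

// Variant 1: full invalidation. Works, but re-filters every item on each push.
eventsFiltered: Ember.computed.filter('events', function(event /*, index, array */) {
   // ... logic
}).property('events.[]', 'filterShowA', 'filterShowB'),

// Variant 2: no .property() at all. Additions are incremental again, but
// changes to filterShowA / filterShowB no longer invalidate the result.
eventsFiltered: Ember.computed.filter('events', function(event /*, index, array */) {
   // ... logic
}),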

Looks like this is all part of the effort to remove ReduceComputed, because the core team believes Glimmer will absorb the performance hit.

In reality it doesn’t, because you still have to generate the filtered array for Glimmer to render. For instance, my app takes 2.5s to run the filter over 388 items. With the old macros, additions were nearly instantaneous; with the new ones there is a 2.5s screen lockup every time the array is mutated, which happens often in my app.

https://github.com/emberjs/ember.js/pull/11513

https://github.com/emberjs/ember.js/issues/12453

Does anyone have suggestions on how to “refactor” away from them? Bottom line: my app can’t afford to re-run the filter / sort every time the array is updated, because that’s a waste of resources.
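
For reference, this is roughly the kind of manual bookkeeping I’d have to hand-roll to keep additions incremental. It’s only a sketch (inside a component or controller); passesFilter, refilterOnToggles and the observer names are placeholders, not code from my actual app:

// Keep eventsFiltered as a plain array, filter only newly added items via an
// array observer, and fall back to a full re-filter when the filter toggles
// change or items are removed.
init() {
  this._super(...arguments);
  this.set('eventsFiltered', Ember.A(this.get('events').filter(e => this.passesFilter(e))));
  this.get('events').addArrayObserver(this, {
    willChange: 'eventsWillChange',
    didChange: 'eventsDidChange'
  });
},

willDestroy() {
  this.get('events').removeArrayObserver(this, {
    willChange: 'eventsWillChange',
    didChange: 'eventsDidChange'
  });
  this._super(...arguments);
},

eventsWillChange() {},

eventsDidChange(events, start, removeCount, addCount) {
  if (removeCount > 0) {
    // removals: easiest to just rebuild the whole filtered array
    this.set('eventsFiltered', Ember.A(events.filter(e => this.passesFilter(e))));
  } else {
    // additions: only the newly added slice goes through the filter
    const added = events.slice(start, start + addCount).filter(e => this.passesFilter(e));
    this.get('eventsFiltered').pushObjects(added);
  }
},

// full re-filter when the toggles change, same role as the old dependent keys
refilterOnToggles: Ember.observer('filterShowA', 'filterShowB', function() {
  this.set('eventsFiltered', Ember.A(this.get('events').filter(e => this.passesFilter(e))));
}),

passesFilter(/* event */) {
  // ... logic
  return true;
}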

EDIT: it looks like I can bring them back using https://github.com/patricklx/ember-cli-reduce-computed, but it feels like a crutch.