A friend of mine was put in charge of completely rewriting their company’s customer interface. They work in the health sector, and their product/client is used by more than ten large national health insurers. Their MySQL databases are over 400 GB in size and contain more than 1,000 different tables.
It is a vast amount of data.
The current client is an age-old Java behemoth. It has become unmaintainable, and they want to switch to a web client for various reasons: better control, easier updates, easier installation (namely, none), etc.
He asked me whether Ember.js would be a good choice for this. Naturally, I said yes.
However, I see a huge problem: the number of models the client would have to define. Even after accounting for similar tables, partitions, and hidden or access-restricted tables, they would probably still end up with about 500 to 800 different models. That is insane.
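Just to make concrete what I mean by "model": each of those tables would map to an ember-data model class along these lines (a minimal sketch; the model name and attributes are made up):

```js
// app/models/patient.js — hypothetical model, invented names
import DS from 'ember-data';

export default DS.Model.extend({
  firstName: DS.attr('string'),
  lastName:  DS.attr('string'),
  birthDate: DS.attr('date'),

  // each of the ~500-800 tables would need a class like this,
  // plus its relationships to other models
  insurer:   DS.belongsTo('insurer'),
  claims:    DS.hasMany('claim')
});
```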
This also poses a UI/UX problem: how would you present 800 different models in a single UI? I couldn’t think of a proper structure. That’s why I recommended splitting the app up into as many separate parts as possible (see the sketch below). This would become easier if engines landed, but that’s another story.
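To illustrate the kind of splitting I had in mind, here is roughly what the host app’s router could look like once engines are available (a hypothetical sketch based on the ember-engines addon; the engine names are invented):

```js
// app/router.js of the host app — engine names are made up
Router.map(function() {
  // each domain area becomes its own engine with its own
  // models, routes and templates, maintained independently
  this.mount('claims-management');
  this.mount('patient-records');
  this.mount('billing');
});
```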
Anyhow, they’d still be left with roughly 100 to 300 models per app. Would that number of models have an impact on performance? Are there any benchmarks for this scenario?
I could imagine that ember-data or the browser could buckle under this sheer number of classes, as every single model class is its own JS object.
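For lack of real benchmarks, one could at least get a rough feel for the class-definition overhead with a crude micro-benchmark like this (just a sketch; it measures only defining the classes, not the cost of loading records into the store):

```js
// crude sketch: time how long defining N model classes takes,
// then inspect memory usage in the browser's dev tools
import DS from 'ember-data';

const N = 800;
const classes = [];

console.time('define-models');
for (let i = 0; i < N; i++) {
  classes.push(DS.Model.extend({
    name:      DS.attr('string'),
    createdAt: DS.attr('date')
  }));
}
console.timeEnd('define-models');
```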
Do you have any experience with this many models in a single app?
Thanks a lot.