Promise sequence? - saving a whole bunch of records with ember data


I have a question which I put up on stack overflow which I’m quite interested in…

I save a whole graph of objects using ember data and I’d like a way to not overload my server. As it stands, if you forEach .save() on a whole bunch of objects, it seems to fire them all off at once… and what I’d ideally like is for them to be sequenced on completion of the preceding ones.

Does anyone have any thoughts on how to achieve such a thing, or if I’m doing it wrong?


You might want to look at the queue in async.js. You’d have to check that the promises were compatible, but the queue would probably be a good fit.


Thanks, but if all I wanted was to iterate over a bunch of functions I could just use reduce, as in the example here:
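For reference, the usual shape of that pattern is to reduce an array of promise-returning functions into one chained promise. This is a sketch using native Promises (Ember.RSVP promises expose the same then() interface); runInSequence is a hypothetical name:

```javascript
// Reduce an array of promise-returning functions into a sequential chain,
// collecting results in order.
function runInSequence(fns) {
  return fns.reduce(function (chain, fn) {
    return chain.then(function (results) {
      // fn() is only called once the previous step has resolved
      return fn().then(function (value) {
        return results.concat(value);
      });
    });
  }, Promise.resolve([])); // seed: a resolved promise of an empty result list
}

// Usage: each function fires only after the one before it settles.
var steps = [1, 2, 3].map(function (n) {
  return function () { return Promise.resolve(n * 10); };
});
runInSequence(steps).then(function (results) {
  console.log(results); // [10, 20, 30]
});
```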

This kind of provides the answer I’m looking for, but I’m now working on code that looks like the following, though it’s not currently working (I’m just debugging it):

var block1 = Em.Object.create({save: function() {return new Date().toString()}});
var block2 = Em.Object.create({save: function() {return new Date().toString()}});
var block3 = Em.Object.create({save: function() {return new Date().toString()}});

var values = [block1, block2, block3];

// want to sequentially iterate over each, use reduce, build an array of results similarly to map...

return values.reduce(function(memo, current) {
  var last = memo.pop();
  return memo.concat(last.then(function(object) {
      return [object,];
  }));
}, [Ember.RSVP.resolve()]);

(further update)

Okay so I managed to get it working the way I wanted, I think… here’s the code:

var block1 = Em.Object.create({save: function() {return new Date().toString()}});
var block2 = Em.Object.create({save: function() {return new Date().toString()}});
var block3 = Em.Object.create({save: function() {return new Date().toString()}});

var values = [block1, block2, block3];

// want to sequentially iterate over each, use reduce, build an array of results similarly to map...

return values.reduce(function(memo, current) {
  return memo.then(function(object) {
      if(object == null) {
          return [];
      } else {
          return object.concat(;
      }
  });
}, Ember.RSVP.resolve(null));

No, I don’t think that’s right. Sadly I don’t think I’m clever enough to think about this :frowning:


So there’s a promises port of async.js. It has a function called mapSeries that does what you want.

If you want to implement it yourself, you need to treat the memo as the list and the last item as the promise (your version is trying to treat the memo as both the list and the promise).


Yeah, I tried it both ways and neither worked for me… I was rolling it myself because I initially thought “oh, this is just a reduce to map, I can easily do that”… but of course promises add an extra layer of complexity that often breaks my brain because of the async.

Subsequently I realised this solution is actually an incorrectly architected response to the problem I’ve got, but that’s probably another story.

Thanks very much for the library reference. I’ll go look at it. I’m hesitant to add an entire library for a single function, but I’ll definitely check it out, and maybe just use it to instruct me on how to do it :wink:


The main problem I seem to be having is that I need to coerce an array of VALUES into a set of results from having a particular function called on them (save, in my case)… whereas most of these libraries/approaches expect that I’ll have an array of functions which wrap the call inside them. That is, [fn, fn, fn] → evaluate each function → [aResults, bResults, cResults], as opposed to [a, b, c] → call save() on each value → [aResults, bResults, cResults].

But I’ll continue. I’m sure I can work it out, I just need a half-hour of deep thought time. Thanks for your responses.


So this is as far as I got, but I’m getting a list of resolved promises back, not values… and I have no idea how to unwrap the promises. :smile:

return values.reduce(function(memo, current) {
    var last = memo[memo.length - 1];
    return memo.concat(last.then(function(results) {
        return;
    }));
}, [Ember.RSVP.resolve(null)]);

This is my JSfiddle…

It’d be really sweet to get some help from someone magical here, if they’re around :slight_smile: I’m expecting to see three string objects, but I’m getting four promises, each of which contains one of the strings, except for the first one, which contains the original null value.


I think Ember.RSVP.all should do the job for unwrapping the promises.


Oh, how obvious… thanks. Though I have always wondered how to unwrap promises in general; I can’t use value, data, value(), data(), or fulfilledValue… it’s a bit odd.


I suppose the point is that the code shouldn’t have to know if the promise has been resolved or not. If the promise had a value() method it might return null or the value depending on whether or not the promise had resolved.


yeah, I get that… it’s just not particularly nice to program with.

This is where I got to, which feels ugly as all sin to me:

var x = values.reduce(function(memo, current) {
    var last;
    if(memo.length < 1) {
        last = Ember.RSVP.resolve(null);
    } else {
        last = memo[memo.length - 1];
    }
    return memo.concat(last.then(function(results) {
        return;
    }));
}, []);
return Ember.RSVP.all(x);

Thanks very much for your help.

Promises kind of feel like they’re a really crappy solution to a terrible problem. This feels like it should be solved at the language level, not the framework level, and I guess that’s the point of getting promises into ES6… however, it feels like there should be a better way to deal with sequencing or parallelization of code… the programmer should be able to specify whether they’re thinking in sequence or in parallel. Traditionally a new statement or expression means “when the last one has finished”. JavaScript on the web has kind of violated that… and in fact all async code/frameworks/etc. have violated that. Switching to a functional style would solve most of these problems, but it’s just a really shitty place to find oneself in, generally… using promises, or callbacks, or any of it. It breaks the mental model of decades of programming and doesn’t replace it with a better one.


Ha, I agree about promises feeling like a band-aid. I’ve reached for that async library a lot in the past (before ember) and found it to be an elegant set of APIs for serial vs parallel etc.

With regards to your implementation, have you thought about passing in the promises rather than the objects? Something like:

var Record = Ember.Object.extend({
  save: function(){
    return Ember.RSVP.Promise.resolve(true);
  }
});

var objectA = Record.create();
var objectB = Record.create();

function mapSeries(promises){
    if(!promises || promises.length == 0) return Ember.RSVP.all([]);

    var pending = promises.reduce(function(memo, current){
        return memo.concat(memo.get("lastObject").then(current));
    }, [promises.shiftObject()()]);

    return Ember.RSVP.all(pending);
}


You don’t mean the promises here, you mean the functions that create the promises… right? In this case, save… seeing as I’m only ever calling save (i.e. I’m never changing the function that I’m calling), I’d have to map all of the objects to their save functions first, which isn’t advantageous in any way, so I haven’t thought of it, no.

In this case, if you’re sending through a set of functions, it isn’t a map series, because map calls a function on a sequence… it’s a call in series… that is, you’re calling the passed-in functions in series. Things get quite complicated quite quickly and easily, don’t they? :wink:

I’d have to map the values by a function that obtained their save functions… i.e. something like this: setOfValues.mapBy("get", "save"); or some such… ( { return; }) :wink:


What I want is a mapSeries function… and I’ve got there, thanks. I’m just irritated and annoyed at the fact that the mental model required to build such a thing was less than ideal. Sort of like having to deal with recursion, but add in a couple of powers of complexity. Recursion is already difficult enough, if you ask me.


Yup, you’re right, I meant the functions that create the promises, and you’re right about the name; I guess something like runInSeries would make more sense. Learnt a couple of things golfing that one, so it was an interesting exercise. Might have a crack at turning it into a proper mapSeries…


Ok, this is a bit better

var Record = Ember.Object.extend({
  save: function(){
    return new Promise(function(resolve, reject){
        setTimeout(function(){
            resolve(true);
        }, 1000);
    });
  }
});

var objectA = Record.create();
var objectB = Record.create();

function mapSeries(objects, callback){
    if(!objects || objects.length == 0) return Ember.RSVP.all([]);

    var pending = objects.reduce(function(memo, current){
        var next = function(){
            return new Promise(function(resolve, reject){
                callback(current).then(
                    function(value){ resolve(value) },
                    function(reason){ reject(reason) }
                );
            });
        };
        return memo.concat(memo.get("lastObject").then(next));
    }, [callback(objects.shiftObject())]);

    return Ember.RSVP.all(pending);
}

function saveInSeries(records){
    return mapSeries(records, function(record){ return })
}

saveInSeries([objectA, objectB]).then(function(resolved){
    console.log(resolved);
});

It’s not easy to stay simple, is it?


My original, and higher-level, issue here was that my server has only a certain amount of resources (let’s say about 20 to 30 concurrent connections).

My save function is actually a lot more complicated than the simple example I’ve been discussing here. My save function saves depth-first, and saves an entire graph (a tree) of objects, including incidental associated objects.

What I actually want is a way to tell ember-data the maximum concurrency of the server, and to have it respect that and build and evaluate a queue… interestingly, this is a similar problem domain to the one that the ember run loop deals with at the UI level: how to schedule, reschedule and run various sets of instructions that have an impact on a stateful environment.
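As a stopgap at the application level, the throttling described here can be sketched with a fixed number of "lanes" that each pull the next record when their current save finishes. The maxConcurrency parameter is the assumption; ideally, as noted, the server would communicate it. Native Promises and a hypothetical saveWithLimit name:

```javascript
// Run saves with at most `maxConcurrency` in flight at once. Each lane
// saves one record, then claims the next unclaimed index until none remain.
function saveWithLimit(records, maxConcurrency) {
  var index = 0;
  var results = [];
  function worker() {
    if (index >= records.length) return Promise.resolve();
    var i = index++; // claim the next record for this lane
    return records[i].save().then(function (value) {
      results[i] = value; // keep results in input order
      return worker();
    });
  }
  var lanes = [];
  for (var n = 0; n < Math.min(maxConcurrency, records.length); n++) {
    lanes.push(worker());
  }
  return Promise.all(lanes).then(function () { return results; });
}

// Usage with stand-in records (hypothetical save() returning a promise):
var records = [1, 2, 3, 4].map(function (n) {
  return { save: function () { return Promise.resolve(n * 2); } };
});
saveWithLimit(records, 2).then(function (out) {
  console.log(out); // [2, 4, 6, 8]
});
```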

And even then, at a certain point, even with some awareness of concurrency… this is just one single client we’re talking about… It’s quite possible that 200 users might all be trying to save an entire tree of records at the same time… in which case I’d like some of them to wait and retry, up to a certain amount of time (say 10 seconds), after which it should inform the user and ask them what they’d like to do, or possibly just retry later (in 30 seconds, a few times, or something).

This stuff, though, is really the province of the framework, not the application… it’s a feature of the data persistence layer, or at least should be. I shouldn’t be writing custom save code that manages concurrency, especially when the server and client framework really should be communicating about what’s possible.

(Case in point… sometimes the server might have the capacity to handle 200 concurrent connections; other times it might have only the capacity for 4 concurrent connections per client… this should depend on the load on the server, the resources available to the server, and a whole bunch of other things, but really it should be dynamic, not static, and definitely not programmed at the application level).

Interesting. Food for thought. In the meantime, I’m going to go mitigate my issues in the application layer, because that’s where I’m seeing my symptoms, but I’m quite aware that I’m only putting a band-aid over a problem that will rear its head again quite soon.


Hehe, yeah, it’s a tough one; ember is probably big enough as it is without including all the useful little promise utilities you might want. I think it would make sense for ember-data to have config for throttling the number of connections, though. Going back to the async library, you’ll see that the queue does actually allow you to specify how many tasks are allowed to be performed in parallel, so you could use that for throttling.


Do you have control over the server?

If so, consider not submitting objects one by one. There’s a bunch of stuff that can go wrong: a client losing its connection mid-save is just one, and you don’t have any atomicity guarantee.

What I’ve done in our application was to override Ember Data’s serializer for my root object and nest all the other associated objects in the same primitive-object tree. Then I wrote a custom deserializer on the server side which parses the object tree, creating domain models and saving them to the DB.

It took a huge effort to do this and made me dislike Ember Data, but I still think it’s better than dealing with all the corner cases I found when the application met the real world and all kinds of flaky connections.



Yeah I have complete control over the server. That sounds like very good advice, and I also completely agree with you.

Unfortunately, though, I don’t have enough time to do what you did, I don’t think… partially because I feel that my app might actually be rather more complicated, and partially because I’m saving a fairly large tree of objects that has no guaranteed pre-known structure (sometimes it’s like 5 objects, other times it’s around 55 or so, but it could be more). (It’s a recursive parent/child structure.)

I only have one client who uses my app who has this very large set of data, and he understands he has to be careful with it (which sucks, but what else am I going to do short-term? I’m just happy it works at all at this point because of the problems I’ve had with ember-data).

The client side not being atomic is pretty bad, but the server isn’t really transactionally atomic either, so it’s kind of all bad. I’m using Rails, and… well, it’s just not very good, in part because of the way it (ab)uses relational databases, and in part because it’s just not very good. I should be using something like Clojure & Datomic, really… something that pulls change out and says “hey! don’t do that, it’s a bad idea”… and I plan on rewriting the backend using something like that at some time in the hopefully not-so-distant future, but at the moment I just have to get it working enough for people to use it.

I kind of half-wish Ember was built in ClojureScript, actually.

Who knows, I might just have to do exactly what you did in the not-so-distant future anyway, once I become sick of wrapping band-aids around things. (As it is, saving those really big trees takes like 5 to 10 seconds and the view flickers a lot while it’s happening… one of the bug fixes I really need to do, actually, is make it so that a “please don’t do anything while this is working” thingie appears while it’s saving…)



Yes, that’s been my main motivation to serialize. However, I didn’t manage to fully get around the slowness and flickering, as I eventually realized I needed to re-fetch all objects from the server after they were saved to ensure associations worked as expected. That in itself was tricky to get right, but at least data consistency wasn’t being compromised.