These are chat archives for Automattic/mongoose

Apr 2015
Apr 22 2015 03:20 UTC
:point_up: April 22, 2015, 07:11
Hi @francesconero . I think this is a Node.js buffering problem. I solved a similar problem using async.eachLimit.
I had a similar problem while transferring data from MySQL to MongoDB. As an optimization, I moved the object-existence check out of the model.
Francesco Nero
Apr 22 2015 05:41 UTC
@chetverikov thanks! The funny thing is that I'm also importing data from a SQL db :). Unfortunately I already tried the limited versions of the methods and they don't seem to help. Even when saving them in series I still get slowdowns after a certain number of docs is saved. I feel like I'm missing something obvious (and I really hope it's neither a Mongoose problem nor, worse, a Node problem)
Apr 22 2015 07:18 UTC
@francesconero I wrote a library for migrations (it can perform asynchronous requests). Example from my code:
// transmit
  transmit: function(options, done, assets) {
    var count = 0
      , limit = 20
      , skip = 0;

    async.whilst(function() {
      return count < assets.count;
    }, function(done_whilst) {
      async.auto({
        places: mysql.getPlacesWithFields.bind(this, {limit: limit, skip: skip}),
        result: ['places', function(next, data) {
          methods.savePlaces(options, assets, data, next);
        }]
      }, function(err, result) {
        if (err) return done_whilst(err);

        skip = skip + result.places.length;
        count = skip;

        // nothing came back: force the loop condition to fail
        if (!result.places.length && count < assets.count) {
          count = 999999999;
        }

        result = undefined; // drop the reference so it can be collected
        done_whilst();
      });
    }, done);

    return Event;
  },
// from the preparePlace method, dst, next); // mapper is a mapperjs instance
And the async mapper:
  [['fields.files_photo', 'fields.files_banquet_photo', 'fields.files_summer_photo'], function(data, dst, src, done) {
    var desc = {
          'fields.files_photo': {
            type: 'interior',
            title: 'Фотографии заведения' // "Venue photos"
          },
          'fields.files_banquet_photo': {
            type: 'banquet',
            title: 'Фотографии банкетов' // "Banquet photos"
          }
        }
      , prefix = ''
      , postfix = '/uploads/'
      , results = [];

    if (!data)
      return done(null, false);

    // fold the summer photos into the main photo set
    if (data['fields.files_summer_photo'] != null) {
      if (!data['fields.files_photo']) {
        data['fields.files_photo'] = data['fields.files_summer_photo'];
      } else {
        data['fields.files_photo'] = data['fields.files_photo'].concat(data['fields.files_summer_photo']);
      }
      delete data['fields.files_summer_photo'];
    }

    async.each(_.keys(data), function(key, next) {
      var source = data[key]
        , photos = [];

      if (!source || !source.length)
        return next();

      _.each(source, function(photo) {
        photos.push({
          url: prefix + 'original' + postfix + photo,
          prefix: prefix,
          postfix: postfix + photo
        });
      });

      var album = new (mongoose.model('Album'))({
        title: desc[key].title,
        photos: photos,
        editor: dst.editor
      });

      album.save(function(err, album) {
        if (err) return next(err);

        results.push({
          album: album,
          type: desc[key].type
        });
        next();
      });
    }, function(err) {
      if (err) return done(err);

      var albums = [];

      // keep existing photoreport albums, then append the new ones
      _.each(dst.albums, function(row) {
        if (row.type === 'photoreport') {
          albums.push(row);
        }
      });

      _.each(results, function(row) {
        albums.push(row);
      });

      done(null, {albums: albums});
    });
  }]
@francesconero :point_up: April 22, 2015, 13:41

Perhaps you missed a `return`, which causes the following code to run even after the callback has been called.

Francesco Nero
Apr 22 2015 08:18 UTC

@chetverikov awesome library, where were you 2 weeks ago? :P

And forgive me, but I don't understand your last comment about the missing `return`, could you expand a little?

Apr 22 2015 08:35 UTC

Maybe you are leaking memory because execution continues after `done` is called?


  // •••

  if (/* bla bla bla */) {
    done(); // Execution should stop here, but since the `return` is omitted,
            // the code below still runs. You won't notice it,
            // but the memory will leak...
  }

  // •••
Damn... My English is bad.. +)
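In plain Node terms, the bug @chetverikov is describing can be sketched like this (hypothetical function names, not from the library above):

```javascript
// Sketch of a callback called without `return`: execution falls
// through, so the callback fires twice and anything it closes over
// stays referenced longer than expected.
function buggy(value, done) {
  if (value == null) {
    done(new Error('missing value')); // no `return` here...
  }
  done(null, value); // ...so this line runs too: done is called twice
}

function fixed(value, done) {
  if (value == null) return done(new Error('missing value')); // stops here
  done(null, value);
}

var buggyCalls = 0;
buggy(null, function () { buggyCalls++; });
console.log(buggyCalls); // 2 — the callback fired twice

var fixedCalls = 0;
fixed(null, function () { fixedCalls++; });
console.log(fixedCalls); // 1
```

With control-flow helpers like async.whilst, a double-called callback can restart an iteration, which is exactly the "re-execution" being hinted at.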
Francesco Nero
Apr 22 2015 10:47 UTC
@chetverikov thanks, I think I understand now. It shouldn't be my case though. The problem seems to be that the GC eats up most of my CPU the longer the process runs.
Francesco Nero
Apr 22 2015 15:42 UTC
Does anybody know how to upsert a document so that a field is incremented when updating, but set to a default value (different from the increment amount) when inserting?
Valeri Karpov
Apr 22 2015 15:50 UTC

@francesconero unfortunately not, the closest thing that's available is the `$setOnInsert` modifier, which sets a field iff the document is being upserted. As you noted though, just using `$inc` will get you a default value that's the same as the increment:

> db.test.update({ x: 1 }, { $inc: { a: 1 } }, { upsert: true });
WriteResult({
    "nMatched" : 0,
    "nUpserted" : 1,
    "nModified" : 0,
    "_id" : ObjectId("5537c2cbd4eda404230ca38e")
})
> db.test.findOne()
{ "_id" : ObjectId("5537c2cbd4eda404230ca38e"), "x" : 1, "a" : 1 }

What are you looking to achieve?

also, re your first question, @francesconero, don't do a query to look up whether or not a document exists before inserting - that has a race condition. It's easier and cleaner to just use a unique index
Francesco Nero
Apr 22 2015 15:55 UTC
@vkarpov15 Yeah, I thought so, unfortunately. I'm exploring the bulk methods since I need to save a lot of documents at once, which are coming from a SQL db. I would like to set the `__v` field of the documents in order to preserve at least some information for Mongoose (which I use on the imported MongoDB database)
I'm issuing a find before saving to calculate a diff of what changed in the document. What would be a better way of doing that?
What kind of race condition could happen?
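The race @vkarpov15 means can be sketched with a plain in-memory store (hypothetical, not Mongoose API): two writers both run the existence check before either inserts, so both checks pass.

```javascript
// In-memory model of check-then-insert. When two requests interleave,
// both see "not there yet" and both insert the "unique" document.
var store = [];

function exists(id) {
  return store.some(function (doc) { return doc.id === id; });
}

// Request A and request B both run the find first...
var aSawDuplicate = exists('doc-1'); // false
var bSawDuplicate = exists('doc-1'); // false

// ...then both insert, because neither saw the other's write.
if (!aSawDuplicate) store.push({ id: 'doc-1' });
if (!bSawDuplicate) store.push({ id: 'doc-1' });

console.log(store.length); // 2 — a duplicate slipped in
```

A unique index moves the check into the insert itself, so the second write fails with a duplicate-key error instead of silently creating a duplicate.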
Felix Milea-Ciobanu
Apr 22 2015 17:50 UTC
is it possible to create a virtual property that loads data from a non-mongoose mongodb collection?
basically trying to load some user data into a virtual property from a collection managed by a python application
Felix Milea-Ciobanu
Apr 22 2015 18:00 UTC
right now I'm loading the mongodb driver module, but it feels like bad practice to use 2 mongodb libraries for one model