These are chat archives for Automattic/mongoose

23rd Nov 2017
Paul "Joey" Clark
@joeytwiddle
Nov 23 2017 07:29 UTC

@stevepsharpe I don’t know if this is your problem, but I sometimes find that when I change an index in Mongoose, it doesn’t actually change in MongoDB. I sometimes have to drop that index so Mongoose will rebuild it with the new settings on the next startup.

mongo
> db.collection.getIndexes()   // check the current indexes
> db.collection.dropIndex({…}) // drop one index
> db.collection.dropIndexes()  // or drop all of them

In your case, I’m wondering if unique: true is still there on the username field.
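
For reference, a minimal sketch of the kind of Mongoose schema being discussed here; only the unique: true option on username comes from the conversation, the model name and the rest are assumed for illustration.

const mongoose = require('mongoose');

// Hypothetical user schema; the unique option makes Mongoose build a unique
// index on username when the application starts up.
const userSchema = new mongoose.Schema({
  username: { type: String, required: true, unique: true }
});

const User = mongoose.model('User', userSchema);

// Note: changing the options here does not alter an index that already exists
// in MongoDB, which is why the dropIndex step above may be needed first.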

Steve P. Sharpe
@stevepsharpe
Nov 23 2017 08:22 UTC

Hey @joeytwiddle thanks for the reply. I dropped the table each time I changed the index, so I don’t think that is the issue.

I’ve looked on GitHub at various examples of people doing the same thing I’m trying to do, although I’m not sure it works there either. They are either using a plugin, such as mongoose-delete, or manually adding a deletedAt field.

The issue I’m seeing is that once an item is marked as deleted and I try to create a new item with the deleted item’s username, I still get a unique error on the username because of unique: true on the username field; it obviously doesn’t care about the deletedAt field. This is why I thought of using the compound index on both username and deletedAt: each time an item is deleted it gets a different date in deletedAt, making the pair unique, and only a new item would have an empty deletedAt. However, the issue now is that when it does find a genuine unique violation on a username, it also returns a unique error on the deletedAt field at the same time, which seems messy.

I guess I’m trying to find the best way to implement soft delete with a unique field; it must be possible. Maybe the best way is not to use a unique index and instead check for duplicates before saving, or something? I’m new to Mongo. I’m from a Ruby background, where you’d just use the paranoia gem, job done.
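
A sketch of the compound-index approach described in the message above; field and model names other than username and deletedAt are assumptions.

const mongoose = require('mongoose');

const userSchema = new mongoose.Schema({
  username: { type: String, required: true },
  deletedAt: { type: Date } // soft-delete marker, unset until the item is deleted
});

// Enforce uniqueness on the (username, deletedAt) pair instead of on username
// alone, so a soft-deleted item (with its own deletedAt timestamp) no longer
// blocks a new item from reusing the same username.
userSchema.index({ username: 1, deletedAt: 1 }, { unique: true });

const User = mongoose.model('User', userSchema);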

Paul "Joey" Clark
@joeytwiddle
Nov 23 2017 08:25 UTC

Dropping the collection should do it. :thumbsup:

It sounds like you’re worried about the nature of the error message, even though the error is technically doing what you want.

The way I understand it, I actually agree with Mongo. It shouldn’t complain about a duplicate username (because you essentially allow duplicate usernames). It should complain about a duplicate (username, deletedAt) pair, and that’s what it’s doing.

So as far as I can tell, your problem is solved elegantly, except in your own mind. ;)

Steve P. Sharpe
@stevepsharpe
Nov 23 2017 08:34 UTC
@joeytwiddle My mind always gets in the way of progress! :smile: I think I just solved the issue: schema.index({ username: 1, deletedAt: 1 }, { unique: true, sparse: true }); adding sparse: true ignores the null value on deletedAt, so I’m now getting the unique error only for username, not both. :tada:
Paul "Joey" Clark
@joeytwiddle
Nov 23 2017 08:37 UTC
Ah great. Yeah sparse: true is sometimes needed with unique: true. I haven’t used it on compound indices before though.
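
Putting the fix from this exchange together, a minimal end-to-end sketch; the model name and the soft-delete helper are illustrative, not from the chat.

const mongoose = require('mongoose');

const userSchema = new mongoose.Schema({
  username: { type: String, required: true },
  deletedAt: { type: Date } // unset until the document is soft-deleted
});

// The compound unique index arrived at above, with sparse: true added; as
// reported in the chat, the duplicate error for a live username then mentions
// only username rather than both fields.
userSchema.index({ username: 1, deletedAt: 1 }, { unique: true, sparse: true });

const User = mongoose.model('User', userSchema);

// Illustrative soft delete: set deletedAt instead of removing the document.
function softDeleteUser(id) {
  return User.findByIdAndUpdate(id, { deletedAt: new Date() });
}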