Leonardo Nascimento
@lbnascimento
@bar10dr_gitlab try running exactly the same code in Windows and inside Docker (ruling out the easy options first)
Nicolai Linde
@bar10dr_gitlab
... So I run it in different DLLs; I use Docker's share function to map all instances to the same drive where the db is. My GraphQL image was started without a ~ before it, so it mapped to a different drive with an old db copy.
Problem between chair and monitor confirmed.
Phil Carbone
@philcarbone
Hi - I am looking to use this to write a bunch of data (quickly) to an in-memory store but after that, I'd like to save it to disk. I am writing millions of records in seconds, so the in memory option is faster during all of that churn, but I'd like to then move it to disk without writing all the records again. Is this possible? I did something similar with Lucene.Net and it worked very well.
Leonardo Nascimento
@lbnascimento
@philcarbone You can do this by instantiating LiteDatabase with a MemoryStream and, after usage, dumping the entire MemoryStream to a file
Leonardo Nascimento
@lbnascimento
@philcarbone Just make sure to run db.Checkpoint() before dumping the MemoryStream
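A minimal sketch of that approach, assuming LiteDB 5's Stream constructor and Checkpoint(); the file path and collection name are placeholders:
using System.IO;
using LiteDB;

using (var mem = new MemoryStream())
{
    using (var db = new LiteDatabase(mem))
    {
        var col = db.GetCollection("records");
        // ... insert the millions of records here ...

        db.Checkpoint(); // flush the WAL into the datafile before dumping
    }

    // MemoryStream.ToArray() still works after disposal, so dump the
    // in-memory datafile to disk once the database is closed
    File.WriteAllBytes(@"C:\data\mydb.db", mem.ToArray());
}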
dtaylor-530
@dtaylor-530
@lbnascimento Here's the StackOverflow question: https://stackoverflow.com/questions/57139624/how-to-check-if-litedb-database-file-has-password-or-not-in-c
Leonardo Nascimento
@lbnascimento
@dtaylor-530 I'm gonna answer there too to keep as a reference, but you can test it using only the first byte from the datafile
If the first byte is 0, then the file is not encrypted
If it is 1, it is encrypted
We may use values other than 1 in future versions for different encryption modes, so it is preferable to test for 0 than for 1
dtaylor-530
@dtaylor-530
@lbnascimento Thanks - I tested the answer and received 0 as the first byte for an unencrypted file, but for encrypted files I got two different values: 73 and 176. Here's the code I was using:
bool IsDbPasswordProtected(string path)
{
    using (FileStream fs = File.OpenRead(path))
    {
        byte[] byteArray = new byte[1];
        int numBytesToRead = byteArray.Length;
        while (fs.Read(byteArray, 0, byteArray.Length) > 0)
        {
        }

        return byteArray.Single() != 0;
    }
}
dtaylor-530
@dtaylor-530
*N.B. the values 73 and 176 were for two different files.
Leonardo Nascimento
@lbnascimento
@dtaylor-530 Could you send me these datafiles?
Leonardo Nascimento
@lbnascimento
@dtaylor-530 I believe your code is not correct; try something like this:
bool IsDbPasswordProtected(string path) 
{
    using (FileStream fs = File.OpenRead(path))
    {
        return fs.ReadByte() > 0; 
    }    
}
dtaylor-530
@dtaylor-530
@lbnascimento Thanks a lot - I added your answer as a comment on GitHub. Incidentally, the questions were related to the work on this pull request: https://github.com/julianpaulozzi/LiteDbExplorer/pull/50/.
micah686
@micah686

I have a question about using Upsert().
So let's say I have this method:

using (var db = new LiteDatabase(App.DatabasePath))
{
    var col = db.GetCollection<EntryInfo>("entry_coll");
    var entry = new EntryInfo()
    {
        EntryId = 99,
        Title = "SampleTitle",
        Original = "FirstString"
    };
    col.Upsert(entry);
}

This adds the entry to the database, but then later on I call this section of code:

using (var db = new LiteDatabase(App.DatabasePath))
{
    var col = db.GetCollection<EntryInfo>("entry_coll");
    var entry = new EntryInfo()
    {
        EntryId = 99,
        Title = "SampleTitle",
        Original = "UPDATEDString"
    };
    col.Upsert(entry);
}

This adds another row instead of updating the existing one. How do I tell it to update what's changed between the two entries?

Leonardo Nascimento
@lbnascimento
@micah686 You mean you want to update one field and leave everything else unchanged?
micah686
@micah686
I figured out what the problem was. I was doing a new EntryInfo(), which was adding a new item with a higher index, not editing the one I wanted
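As a side note, a hedged sketch: if EntryId is meant to be the document key, mapping it with [BsonId] makes Upsert match on it instead of on an auto-generated _id (the EntryInfo members come from the snippets above; everything else is an assumption):
using LiteDB;

public class EntryInfo
{
    [BsonId]                        // use EntryId as the document's _id
    public int EntryId { get; set; }
    public string Title { get; set; }
    public string Original { get; set; }
}

// With this mapping, the second Upsert with EntryId = 99 updates the
// existing document instead of inserting a new one with a new id.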
micah686
@micah686
So, I'm having trouble getting a collection out of the database after adding it
using (var db = new LiteDatabase(App.DatabasePath))
{
    var dbTags = db.GetCollection<VnTagData>("vntagdump");
    List<Tag> tagDump = (await VndbUtils.GetTagsDumpAsync()).ToList();
    List<VnTagData> tagsToAdd = new List<VnTagData>();
    foreach (var item in tagDump)
    {
        var data = dbTags.Query().ToList(); //succeeds here on first add to db, crashes anytime after the first run
        var prevEntry = data.FirstOrDefault(x => x.TagId == item.Id) ?? new VnTagData();
        var entry = prevEntry ?? new VnTagData();
        entry.TagId = item.Id;  //uint
        entry.Name = item.Name; //string
        entry.Description = item.Description; //string
        entry.Category = item.TagCategory; //enum {Content, Technical, Other}
        entry.Parents = item.Parents; //ReadOnlyCollection<uint>
        tagsToAdd.Add(entry);
    }
    dbTags.Upsert(tagsToAdd);
    Debug.WriteLine("Done");
}
on the line "var data = dbTags.Query().ToList()", it crashes the second time I run it
the first time I execute it, there is no database or collection/table
after I execute it once, there are 2560 elements in the table/collection
and I get the following error:
LiteDB.LiteException: Failed to create instance for type 'System.Collections.ObjectModel.ReadOnlyCollection.....
System.ArgumentException: Type 'System.Collections.ObjectModel.ReadOnlyCollection`1[System.UInt32]' does not have a default constructor (Parameter 'type')
micah686
@micah686
Here are the relevant files that are showing the error
Leonardo Nascimento
@lbnascimento
@micah686 The issue is with the ReadOnlyCollection<T> fields in your classes. Our deserializer tries to instantiate them, but fails because they don't have a default constructor
@micah686 If you really need to use ReadOnlyCollection, you must create a custom constructor for your data classes and use the BsonCtor attribute to tell LiteDB to use it
@micah686 Something like this:
[BsonCtor]
public Tag(UInt32 _id, String name, String description, ..., BsonArray aliases, BsonArray parents)
{
    Id = _id;
    Name = name;
    Description = description;
    ...
    Aliases = aliases.Select(x => x.AsString).ToList().AsReadOnly();
    Parents = parents.Select(x => (UInt32)x.AsInt64).ToList().AsReadOnly();
}
James Allen
@jjallen37

Hello LiteDb Friends! I'm running into a performance challenge dealing with long-running queries. Is there a way to cancel a query? We are loading a list as a user sets filter criteria and I would like to cancel the previous operation when the filter is updated.

I'm on LiteDb 5.0.8. As a note, LiteDb 5 w/ partial deserialization has allowed us to increase performance by a substantial amount and we're really enjoying using it.

Leonardo Nascimento
@lbnascimento
@jjallen37 I think you could achieve this by disposing an IBsonDataReader
@jjallen37 You can get an IBsonDataReader either by calling LiteDatabase.Execute(string sql) or by calling LiteCollection.Query().ExecuteReader()
James Allen
@jjallen37
I think that is probably what I'm looking for, thanks! I would be using it in the latter case LiteCollection.Query(). In my case, we go all the way through like this: Query().Where().OrderBy().Select().Limit().Offset() then finally to ToList(). Would there be any functional difference between the reader returned from Query() and Offset()?
Leonardo Nascimento
@lbnascimento
@jjallen37 No, they both work the same
@jjallen37 The only caveat to this solution is that you'll have to call ExecuteReader() after these calls, and now you're dealing with an IBsonDataReader
And you'll have to iterate through it manually, similarly to how you would iterate manually through an IEnumerable
Something like:
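A minimal sketch of that manual iteration; the collection name is a placeholder, and the Where/OrderBy/Select/Limit/Offset calls from the chain above would go before ExecuteReader():
using LiteDB;

var col = db.GetCollection("items");   // placeholder collection

using (var reader = col.Query().ExecuteReader())
{
    while (reader.Read())
    {
        var doc = reader.Current;      // current result as a BsonValue
        // map/collect the result here
    }
}
// Disposing the reader early (e.g. when the filter changes) stops the query.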
Leonardo Nascimento
@lbnascimento
Yes, except you add another variable continueReading or something like that to the while condition
James Allen
@jjallen37
Very cool. Thanks for the help! I'll report back on how things go.
I was about to submit a GitHub issue about something, but I feel I should go ahead and check while I'm here. I want to clear all my Indexes for a collection, but I can't find out which indexes I already have set. Is there any functionality to return indexes or to drop all existing ones?
Leonardo Nascimento
@lbnascimento
db.Execute("select name from $indexes where collection = 'colname'") returns the name of every index for the collection colname
Marcus
@ColonelBundy
Is it possible to provide mapping without a backing class? Creating a collection by name is already possible, but is there a way to tell the mapper that some field is actually a dbref etc?
Leonardo Nascimento
@lbnascimento
@ColonelBundy Collections can be opened either in "typed mode", in which it takes and returns objects of type T, or in "untyped mode", in which it takes and returns BsonDocuments
@ColonelBundy So, if you are working with an untyped collection, you need to do everything manually, including setting the reference subdocument correctly
Marcus
@ColonelBundy
I see, thanks :)
Leonardo Nascimento
@lbnascimento
@ColonelBundy A reference subdocument is of the form:
{
    '$id': 1,
    '$ref': 'customers'
}
Where the $ref attribute stores the name of the collection to which it refers and $id stores the id of the referenced document
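A hedged sketch of setting that reference subdocument by hand in an untyped collection; the orders/customers collections and field names are made up for illustration:
using LiteDB;

var orders = db.GetCollection("orders");

var order = new BsonDocument
{
    ["total"] = 99.9,
    ["customer"] = new BsonDocument   // the reference subdocument described above
    {
        ["$id"] = 1,
        ["$ref"] = "customers"
    }
};

var id = orders.Insert(order);

// Include resolves the reference when reading the document back
var withCustomer = orders.Include("$.customer").FindById(id);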