DavidKonicek
@DavidKonicek
Hello. Where do I report bugs I found in SQLiteBrowser? Here? Somewhere else? Thanks.
Chris Locke
@chrisjlocke
In DB4S if you click 'Help' and 'Bug Report' it'll take you to the issues page on GitHub. You'll need a GitHub account.
https://bit.ly/submitBug
It's not recommended to submit it here, as it can't be easily followed up or referred to later on. The 'Issues' section on GitHub is the best place.
brightinnovator
@brightinnovator
I have a question on JavaScript. Can someone help me with a JavaScript issue?
brightinnovator
@brightinnovator

I want to read 30 million (3 crore) CSV rows from a 2 GB CSV file and insert them into MySQL via Java.

Could someone please suggest the fastest and most memory-efficient way to load the data quickly while avoiding out-of-memory exceptions?

Please kindly advise.
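One memory-efficient approach, sketched here in Python with the stdlib sqlite3 module purely as an illustration (the question asks about MySQL via Java, but the same batching pattern applies with JDBC's addBatch/executeBatch), is to read the CSV lazily and insert in fixed-size batches, one transaction per batch, so memory stays bounded regardless of file size. The table name and three-column schema below are hypothetical.

```python
import csv
import sqlite3
from itertools import islice

def load_csv(csv_path, db_path, batch_size=10_000):
    """Stream a CSV into SQLite in fixed-size batches so memory use
    stays bounded no matter how large the file is."""
    conn = sqlite3.connect(db_path)
    # Hypothetical three-column schema -- adjust to the real CSV layout.
    conn.execute("CREATE TABLE IF NOT EXISTS rows (a TEXT, b TEXT, c TEXT)")
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)  # iterates lazily, never loads the whole file
        while True:
            batch = list(islice(reader, batch_size))
            if not batch:
                break
            with conn:  # one transaction per batch: far faster than autocommit
                conn.executemany("INSERT INTO rows VALUES (?, ?, ?)", batch)
    conn.close()
```

Per-batch transactions are the main lever: committing every row is orders of magnitude slower than committing every 10,000.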

David Reilly
@greyblue9
MySQL or sqlite?
Selvaganesh, what is the result you get? Does it not match your expectation?
David Reilly
@greyblue9
I think you'll have trouble if the columns in the two tables aren't exactly the same unless you explicitly select the common columns.
flywire
@flywire
Is there a demo for creating a report with a GUI?
Chris Locke
@chrisjlocke
There are no reporting capabilities within DB4S - it manages the database only.

I have a question on javascript

This is primarily a support chat area for DB4S, which is a database management tool. I don't believe the users familiar with it are also familiar with JavaScript.

flywire
@flywire
Can you recommend a GUI report tool(s) that might go with it?
btw, can I get Gitter to send me an email for responses?
Chris Locke
@chrisjlocke
https://www.helicalinsight.com/open-source-bi-tool-for-sqlite/ ? There are paid-for solutions, but I always tend to drift towards open-source offerings first.

Can I get Gitter to send me an email for responses?

If you click the settings (assuming you're on the desktop browser version) there are notification options in there. Click 'Notifications' and not 'Settings' (once you've clicked the settings icon, I mean...)

flywire
@flywire
:thumbs up: x2
Aaron Mason
@thirdwheel
Is there a way to turn off the suggestion box that pops up as you're entering data in the Browse Data tab? It is useful from time to time, but there are times when it slows me down.
Chris Locke
@chrisjlocke
If you click 'Edit' and 'Preferences' then the 'Data Browser' tab, there is a 'threshold for completion' option. Set this to zero and the functionality should be disabled.
Carlos W. Mercado
@carloswm85
Hello
I was writing some queries for the first time using sqlb
I was wondering, what's the difference between 'String', "String" and String when running a query? I noticed they all work when querying data from the DB, using those for table names and column names.
Chris Locke
@chrisjlocke

Carlos - You have a number of SQL engines (MySQL, Microsoft, etc.) and SQLite tries to help out by being compatible with them, as they all work a little differently. This means you could use an apostrophe or a quotation mark and SQLite tries to work out what you really mean.
Generally, strings are enclosed in apostrophes, so select * from table where field1='Fred' means it'll find data that matches the string 'Fred'. If you use field1="fred" then it tries to find a column (or table, view, etc.) that matches "fred". BUT, if it doesn't find one, SQLite thinks, "aah, did you mean 'fred'?" so that might also work. You shouldn't rely on this, though, and should ideally get the quotation marks correct.
Ideally, all field and table names should be a single word without spaces. So don't call a table "table 1". If you do, you have to put quotation marks around it (not apostrophes) when using it in SQL - eg, select * from "table 1" where field1='Blah blah'

Does any of this make any sense?
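The quoting rules above are easy to verify. A minimal sketch using Python's stdlib sqlite3 module (the table and column names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (field1 TEXT)")
conn.execute("INSERT INTO people VALUES ('Fred')")

# Apostrophes always make a string literal: matches rows where field1 is 'Fred'
by_literal = conn.execute(
    "SELECT * FROM people WHERE field1 = 'Fred'").fetchall()

# Quotation marks name an identifier: "field1" is the column itself,
# so the comparison is column-equals-column and every row matches
by_identifier = conn.execute(
    'SELECT * FROM people WHERE field1 = "field1"').fetchall()

# A table name containing a space must be wrapped in quotation marks
conn.execute('CREATE TABLE "table 1" (field1 TEXT)')
conn.execute("INSERT INTO \"table 1\" VALUES ('Blah blah')")
spaced = conn.execute(
    "SELECT * FROM \"table 1\" WHERE field1 = 'Blah blah'").fetchall()
```

The double-quoted-string fallback Chris mentions (SQLite silently treating an unknown "identifier" as a string) exists in default builds but can be compiled out, which is another reason not to rely on it.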

jessie-github
@jessie-github
Hello, it seems that I am running into this issue with version 3.12.1 on Windows with a big database (100GB, one big table): sqlitebrowser/sqlitebrowser#584
I don't have any issue with a 5GB database.
On the "Browse Data" tab, the row count displays after a few seconds at the bottom, and after that it is frozen with the bottom-right message "Busy (reading rows)".
I have set the prefetch value to 255, but it doesn't help.
CPU usage is high but I still have enough RAM available.
Can somebody tell me if I have missed something? I don't want to re-open the issue and bother the developers if I made a mistake... Thanks!
Chris Locke
@chrisjlocke

I have set the prefetch value to 255

I don't believe this helps in this instance - it only helps when scrolling through the table - it'll grab 255 records at once. Usual values are ~50,000 so it grabs 50,000 records in a go.

Without being rude, why are you using the data browser tab for such a large number of records? At 100 GB, this is going to be a lot of rows ... why would you want these all displayed? Usually, once the data is in the database, you perform actions on that data - how many records with a value of X in field Y, how many records with a value of Y in field Z, etc. You can perform that using SQL - eg, you don't view all million rows to find out there are a million rows. If you're scrolling through data, you can use 'limit' to make this manageable.
In my eyes, this is like opening a 900,000 page word document, then wondering why it takes a while to scroll up and down. Some will argue that DB4S should handle 80 million rows, but I fail to see the advantage of viewing this on screen... (just my personal opinion, of course!)

I don't want to re-open the issue and bother the developers if I made a mistake...

I would recommend opening a new issue and making reference to the other one - don't re-open an older one. The older one referred to a different version of DB4S, plus commenting on it will notify the original poster, when really you need your own shiny new issue that belongs just to you. Open it up, as it might highlight something wrong with DB4S that can be fixed.
There is already an open issue where the number of rows visible in the grid is limited - possibly due to the control we use in DB4S. This comes back to my (waaaaaaaay too long) point above that displaying a huge number of results is a nuts thing to do anyway.
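The "ask the database instead of scrolling" approach can be sketched with Python's stdlib sqlite3 module (table name and contents are hypothetical): count with SQL, then browse a manageable page at a time with LIMIT, keyed on the last id seen.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # a real database file in practice
conn.execute("CREATE TABLE big (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO big (payload) VALUES (?)",
                 [(f"row {i}",) for i in range(1_000)])

# Ask for the answer rather than scrolling to it
total = conn.execute("SELECT count(*) FROM big").fetchone()[0]

# Browse in manageable chunks with LIMIT, keyed on the last id seen
# (keyset pagination: faster than OFFSET on big tables, as it uses the index)
last_id = 0
page = conn.execute(
    "SELECT id, payload FROM big WHERE id > ? ORDER BY id LIMIT 100",
    (last_id,)).fetchall()
```

To fetch the next page, set last_id to the id of the final row returned and re-run the same query.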

jessie-github
@jessie-github
This is a valid question. The database is a dump of important information that we want preserved and available for query (i.e. an archive) and/or to be used as input for other tools (i.e. an alternative to CSV files). However, in some cases we want to be able to query the database (it is indexed, of course) to look at a particular record. For instance, if we have a bug somewhere, we may want to look at the record and its surroundings.
Perhaps this is an edge use case, but I am trying to promote SQLite over CSV files (for large exchange files), and people need to have a look at the data before it is processed.
(for instance on emacs I can open a 100GB CSV file without loading the complete file in memory, so this is instantaneous)
Chris Locke
@chrisjlocke
My point is (or was...) is someone going to load a grid with twenty million records and scroll through the lot? Or are they going to narrow this down into chunks.
You can't compare DB4S to emacs - it's like my example above - Word wouldn't open a 90,000 page document.
An SQLite database is a million times better than a CSV for querying - a database is built for querying, while a CSV isn't.
The 'browse data' tab is designed for small tables as it loads every record and relies on a filter. What would be better (for your example) is a grid asking 'what do you want to look at?' (eg, a numeric range or date range) so you can scroll through that. Again, a subset, rather than the complete table.
jessie-github
@jessie-github
Indeed, and this is what I am often doing as well. Sometimes I just want to browse through the first hundred lines to see what the content is, and other times I want to filter to have only the rows corresponding to the criteria. The problem here is that the GUI is frozen and I cannot enter any filter at the top of the column.
I am using (abusing?) it a bit like I would use Excel to browse/filter/display a table with a lot of data.
For debugging it is really useful. When manipulating large CSV files (as input for other tools), it is really useful to be able to open the file (without loading the entire file in memory) and explore the content a bit to see how to parse it (for instance, to check that the dates are in the standard YYYY-MM-DD format instead of MM/DD/YYYY).
Similarly for an SQLite file, I wanted to have a peek without having to (manually) type a query to display the first hundred lines.
jessie-github
@jessie-github
But if DB Browser for SQLite is really built in a way that forcibly loads the entire database when using the "Browse Data" tab, no problem, I'll try to find another way. I just thought that it would be useful to offer the possibility of loading in chunks (like the emacs vlf package), but I can understand that it is not a feature which is often requested.
Chris Locke
@chrisjlocke

But if DB Browser for SQLite is really built in a way to forcibly load the entire database

Not intentionally, it doesn't - there is a lot of work to make it as smooth as possible (the reason for the 'prefetch block size'), but the first thing it has to do is work out how many rows the table has, so it does a 'select count(*) from table', and it's this that can hammer large databases. Even with an index, a 100 GB database is going to take time to get this figure.

jessie-github
@jessie-github
But are you not using max(rowid) to estimate the number of rows? It seems that it is working (I can see the number of rows at the bottom quite quickly)
jessie-github
@jessie-github
Yes only 5-10 seconds to get the row count for a 100GB database, so I am pretty sure that it is using max(rowid)
Chris Locke
@chrisjlocke
Max(rowid) doesn't tell you the number of rows in a table .. that just shows the highest row number. For tables with autoincrement on, where rowids aren't recycled, that's not going to be accurate at all - I have a table with one row whose rowid is in the thousands...
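The divergence Chris describes is easy to demonstrate: after deletes on an AUTOINCREMENT table, max(rowid) and count(*) no longer agree. A minimal sketch with Python's stdlib sqlite3 module (table name is made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# AUTOINCREMENT guarantees rowids are never recycled after deletes
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY AUTOINCREMENT, v TEXT)")
conn.executemany("INSERT INTO t (v) VALUES (?)", [("a",), ("b",), ("c",)])
conn.execute("DELETE FROM t WHERE id < 3")

highest = conn.execute("SELECT max(rowid) FROM t").fetchone()[0]  # still 3
actual = conn.execute("SELECT count(*) FROM t").fetchone()[0]     # only 1
```

max(rowid) is an O(1) index lookup, which is why it's tempting as an estimate, but as shown it only matches the true count if no rows were ever deleted.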
jessie-github
@jessie-github
I am not sure, but otherwise how is DB Browser for SQLite calculating the number of rows so quickly (5 seconds for a 100GB file)? I never deleted any rows, so rowid is accurate. Perhaps I should try deleting some rows to see if the number at the bottom changes. But I thought it was impossible to get an accurate count of the number of rows quickly (i.e. faster than "select count(*)", which takes ages).
jessie-github
@jessie-github
Ok, I tried deleting some rows and the count displayed by DB Browser for SQLite is quick and accurate! How do you manage to count the number of rows so quickly?
Anyway, the reason for the freeze is not the process which counts the number of rows then, because the freeze happens after the count is displayed.
Chris Locke
@chrisjlocke

How do you manage to count the number of rows so quickly ?

I knoweth not. I tend to steer clear of the DB4S code ... here be dragons. I know a lot has gone into making it optimised though... tweaks here and there, etc.
The application log in DB4S might assist in working out where the freeze is occurring.
If you don't need to edit records, the 'execute SQL' tab is useful to grab records, and that should be speedy. The grid is 'read only' which isn't always great.

But I thought it was impossible to have an accurate count of the number of rows quickly (i.e. faster than "select count(*)" which is taking ages

Depends on the version of SQLite. I know SQLite itself (so not DB4S) has sexy internal methods to make this query fast. Depending on the table contents, it can be quicker doing a 'select count(1)' rather than a 'select count(*)', which would pull in a load of data only to discard it.

jessie-github
@jessie-github
Ok, I looked at the Application Log and indeed, to count the number of rows, it is using "SELECT COUNT(*)" and it is returning the result in 3 seconds (I am surprised!).
The next query is "SELECT 'rowID',* from table LIMIT 0, 253" and the GUI is frozen from there (the first lines do not appear). I tried executing this query outside and it returns the result instantly.
So there is something wrong around this last query, but no means for me to debug this further (nothing in the Error Log either).
Chris Locke
@chrisjlocke

So there is something wrong around this last query but no means for me to debug this further

That's useful to know - thanks for digging that out.
I would have suggested raising an official issue for it, as it's something we'd like to get fixed, but development has ... er, kinda stalled at the moment and the issue count is >470, so it's going to get lost in the noise for a while and won't get solved in the short term.

jessie-github
@jessie-github
No problem, I'll open it and at least we'll have a trace. It is very easy to reproduce. Thanks for your answers
Justin Clift
@justinclift
Cool. Yeah, DB4S isn't optimised for large databases, even though SQLite itself is a decent enough format for transporting the data between systems.
At present, DB4S loads things into RAM and doesn't clear out old loaded data from RAM. So, as you scroll through stuff, RAM usage just goes higher and higher.
With a (say) 1GB database, you could be using 4GB of RAM to view it. And ditto, with a 2GB database, you could be using some multiple of 2GB of RAM to keep it in memory. So, a 100GB database is going to go badly with DB4S if you try to scroll through the whole thing without several hundred GBs of RAM.
... and that's not even counting the SQLite backend's processing time for running queries, though indexes can definitely help there. :)
Justin Clift
@justinclift
... and Qt (the library we use for displaying data) seems to have a maximum number of rows it'll handle in its widgets (16 million or so, from super rough memory). Someone reported a bug about it a few years ago, to which the Qt team responded that they weren't interested in changing it - a "you're doing it wrong" kind of response (my wording). ;)
mseaworthy
@mseaworthy
Wondering if there is a way to import multiple CSVs simultaneously in DB Browser? When I attempt to do this, it appears to combine them and they don't end up as separate tables in the db.