TheDemLabs
@TheDemLabs
@chrisjlocke Hi Chris, any suggestions on the easiest way to share data with non-technical users who will be canvassing offline in the field using their phones? The file could be up to 1M records but only has six columns. Search by Name & Date of Birth. Display Address, Phone, Voting status. Volunteers will make any corrections and then upload them when they have connectivity. Right now we are using SQLite to put the data on the phone, but I need a simpler interface for volunteers to work with the data. Thanks.
Chris Locke
@chrisjlocke
The first problem I see is confidential data - surely this would need to be encrypted somehow? If someone left a phone somewhere it would be easy to access this data.
When you say 'corrections', how much data is being entered?
You mention you're using a SQLite database already, but what app is accessing this on the phone currently?
Selvaganesh
@selvaganesh3m
Someone please help me
Anyone here?
Please, I have a doubt
Someone please clarify my doubt
TheDemLabs
@TheDemLabs
The data we are using is publicly available data, but we would like to keep it encrypted on the phone if possible. Corrections will be minor, such as an incorrect address or phone number. Or they might add a record for a person who is an eligible voter but not currently on the rolls. The information that the volunteers collect will be uploaded when they do have connectivity. Other people will screen and verify the volunteers' input before doing anything with that information. We are using the query tool that comes with SQLite (Filter). But I think this will be hard for many non-technical volunteers and am hoping that there is a more user-friendly app or front end we can use with SQLite.
Chris Locke
@chrisjlocke

Someone please clarify my doubt

Apologies - this chat isn't monitored 24/7, but feel free to comment your doubts and one of the team will respond when they can. 👍

Chris Locke
@chrisjlocke

We are using the query tool that comes with SQLite

I'm still not sure what application you're using on your phones to view/amend the database. SQLite Browser (or DB Browser for SQLite, to use its official title) isn't available for Android/iOS.

There is a handy 'repository' called dbhub.io which allows databases to be uploaded and supports revisions, but again, this works within DB4S which isn't available on a mobile device.

This would be a simple app to create - mainly a search form, then select the entry to view the record, allow any updates which would set a 'date updated' flag, so any records updated in the last X days could then be uploaded. Simple in theory...
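A minimal sketch of that 'date updated' idea in SQLite - all of the table and column names below, and the 7-day window, are illustrative rather than anything from the actual project:

-- illustrative schema: one row per voter, plus a last-modified timestamp
CREATE TABLE voters (
  id            INTEGER PRIMARY KEY,
  name          TEXT,
  date_of_birth TEXT,
  address       TEXT,
  phone         TEXT,
  voting_status TEXT,
  date_updated  TEXT            -- stamped whenever a volunteer edits the row
);

-- when a volunteer amends a record, stamp it
UPDATE voters
SET phone = '555-0100',
    date_updated = datetime('now')
WHERE id = 42;

-- when connectivity returns, upload everything touched in the last 7 days
SELECT *
FROM voters
WHERE date_updated >= datetime('now', '-7 days');
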
While my .NET skills are good, I don't have the skills to create a Xamarin application, which would fit your bill nicely.
As an aside, what's the urgency of this - 'we want something now, now, now!!!' or 'we'll need something quick in a couple of weeks', etc.? Could be an interesting project if you want to have a specific application written for you!

Would the volunteers be using Android phones, tablets, or Apple hardware? Or a mix?
Selvaganesh
@selvaganesh3m

My doubt is

# fetch all orders that have 'pepperoni' as one of the toppings (either in pizza-topping or in additional-topping)

I've written a query

SELECT
  o.*
FROM
  tom_orders AS o
  INNER JOIN tom_pizza_toppings AS pt ON pt.pizza_id = o.pizza_id
  INNER JOIN tom_toppings AS t ON t.id = pt.topping_id
WHERE
  t.name = 'Pepperoni'
UNION
SELECT
  o.*
FROM
  tom_orders AS o
  INNER JOIN tom_orders_addl_toppings AS addt ON addt.order_id = o.id
  INNER JOIN tom_toppings AS t ON t.id = addt.topping_id
WHERE
  t.name = 'Pepperoni'

This is my query

TheDemLabs
@TheDemLabs
I found SQLite Mobile Client on the Apple App Store and downloaded that. There is a version ($3 one-time) that lets you import CSV files. I've hired a developer to help me with the simple form that volunteers would need to run queries and add their notes while in offline mode. Volunteers will be using both Apple and Android devices. All good for now. Thanks.
Selvaganesh
@selvaganesh3m
image.png
Chris Locke
@chrisjlocke
So an order (tom_orders) can have one or more tom_toppings, which can be either pizza_toppings or addl_toppings. I assume tom_toppings is the main list, while the other two relate to specific orders? I don't think you need a union, but I assume that works.
So you've done the query - what is the issue?
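For comparison, here is a sketch of the same pepperoni lookup without the UNION, using EXISTS subqueries - it assumes exactly the table and column names from the query above:

-- one pass over tom_orders; an order qualifies if 'Pepperoni' appears
-- among its pizza's standard toppings or its additional toppings
SELECT o.*
FROM tom_orders AS o
WHERE EXISTS (
        SELECT 1
        FROM tom_pizza_toppings AS pt
        JOIN tom_toppings AS t ON t.id = pt.topping_id
        WHERE pt.pizza_id = o.pizza_id
          AND t.name = 'Pepperoni'
      )
   OR EXISTS (
        SELECT 1
        FROM tom_orders_addl_toppings AS addt
        JOIN tom_toppings AS t ON t.id = addt.topping_id
        WHERE addt.order_id = o.id
          AND t.name = 'Pepperoni'
      );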
Chris Locke
@chrisjlocke

All good for now. Thanks.

Glad you found something that works.

DavidKonicek
@DavidKonicek
Hello. Where do I report bugs I found in SQLiteBrowser? Here? Somewhere else? Thanks.
Chris Locke
@chrisjlocke
In DB4S if you click 'Help' and 'Bug Report' it'll take you to the issues page on GitHub. You'll need a GitHub account.
https://bit.ly/submitBug
It's not recommended to submit it here, as it can't be easily followed up or referred to later on. The 'Issues' section on GitHub is the best place.
brightinnovator
@brightinnovator
I have a question on javascript. Can someone help me with the javascript issue?
brightinnovator
@brightinnovator

I want to read 3 crore (30 million) CSV rows from a 2 GB CSV file and insert them into MySQL via Java.

Could someone please tell me the fastest and most memory-efficient way to do this, so it avoids out-of-memory exceptions and loads in less time?

Please kindly advise.

David Reilly
@greyblue9
MySQL or sqlite?
Selvaganesh, what is the result you get? Does it not match your expectation?
David Reilly
@greyblue9
I think you'll have trouble if the columns in the two topping tables aren't exactly the same, unless you explicitly select the common columns.
flywire
@flywire
Is there a demo for creating a report with a GUI?
Chris Locke
@chrisjlocke
There are no reporting capabilities within DB4S - it manages the database only.

I have a question on javascript

This is primarily a support chat area for DB4S, which is a database management tool. I don't believe the users familiar with it are also familiar with JavaScript.

flywire
@flywire
Can you recommend a GUI report tool (or tools) that might go with it?
Btw, can I get Gitter to send me an email for responses?
Chris Locke
@chrisjlocke
https://www.helicalinsight.com/open-source-bi-tool-for-sqlite/ ? There are paid-for solutions, but I always tend to drift towards open-source offerings first.

Can I get Gitter to send me an email for responses?

If you click the settings (assuming you're on the desktop browser version) there are notification options in there. Click 'Notifications' and not 'Settings' (once you've clicked the settings icon, I mean...)

flywire
@flywire
:thumbs up: x2
Aaron Mason
@thirdwheel
Is there a way to turn off the suggestion box that pops up as you're entering data in the Browse Data tab? It is useful from time to time but there are times when it slows me down
Chris Locke
@chrisjlocke
image.png
If you click 'Edit' and 'Preferences' then the 'Data Browser' tab, there is a 'threshold for completion' option. Set this to zero and the functionality should be disabled.
Carlos W. Mercado
@carloswm85
Hello
I was writing some queries for the first time using sqlb
I was wondering, what's the difference between 'String', "String" and String when running a query? I noticed they all work when asking for data from the DB, using those DB names and row names.
Chris Locke
@chrisjlocke

Carlos - You have a number of SQL engines (mySQL, Microsoft, etc) and SQLite tries to help out by being compatible with them as they all work a little differently. This means you could use an apostrophe or quote but SQLite tries to work out what you really mean.
Generally, strings are enclosed in apostrophes, so select * from table where field1='Fred' means it'll find data that matches the string 'Fred'. If you use field1="fred" then it tries to find a column (or table, view, etc) that matches "fred". BUT, if it doesn't find one, SQLite thinks, "aah, did you mean 'Fred'?" so that might also work. You shouldn't rely on this though, and should ideally get the quotation marks correct.
Ideally, all fields and tables should be a single word without spaces. So don't call a table "table 1". If you do, you have to put quotation marks around it (not apostrophes) when using it in SQL - eg, select * from "table 1" where field1='Blah blah'

Does any of this make any sense?
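To make that concrete, a short example against a hypothetical people table (the table and column names here are made up):

-- 'Fred' in apostrophes (single quotes) is always a string literal
SELECT * FROM people WHERE first_name = 'Fred';

-- "first_name" in double quotes is an identifier - a column, table or view name
SELECT "first_name" FROM people;

-- double quotes around something that isn't a known identifier: SQLite falls
-- back to treating "Fred" as the string 'Fred', but don't rely on that
SELECT * FROM people WHERE first_name = "Fred";

-- identifiers containing spaces must be double-quoted
SELECT * FROM "table 1" WHERE field1 = 'Blah blah';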

jessie-github
@jessie-github
Hello, it seems that I am running into this issue with version 3.12.1 in Windows with a big database (100GB, one big table): sqlitebrowser/sqlitebrowser#584
I don't have any issue with a 5GB database.
On the "Browser Data" tab, the row count is displaying after a few seconds at the bottom and after that it is frozen with the bottom right message "Busy (reading rows)".
I have set the prefetch value to 255, but it doesn't help.
CPU usage is high but I still have enough RAM available.
Can somebody tell me if I have missed something? I don't want to re-open the issue and bother the developers if I made a mistake... Thanks!
Chris Locke
@chrisjlocke

I have set the prefetch value to 255

I don't believe this helps in this instance - it only helps when scrolling through the table - it'll grab 255 records at once. Usual values are ~50,000 so it grabs 50,000 records in a go.

Without being rude, why are you using the data browser tab for such a large number of records? At 100 GB, this is going to be a lot of rows ... why would you want these all displayed? Usually, once the data is in the database, you perform actions on that data - how many records with a value of X in field Y, how many records with a value of Y in field Z, etc. You can perform that using SQL - eg, you don't view all million rows to find out there are a million rows. If you're scrolling through data, you can use 'limit' to make this manageable.
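For example (the table and column names here are only placeholders):

-- count matching rows without displaying them
SELECT count(*) FROM big_table WHERE field_y = 'X';

-- peek at a manageable slice instead of loading the whole table
SELECT * FROM big_table ORDER BY id LIMIT 100;

-- or page through it a chunk at a time
SELECT * FROM big_table ORDER BY id LIMIT 100 OFFSET 200;
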
In my eyes, this is like opening a 900,000 page Word document, then wondering why it takes a while to scroll up and down. Some will argue that DB4S should handle 80 million rows, but I fail to see the advantage of viewing this on screen... (just my personal opinion, of course!)

I don't want to re-open the issue and bother the developers if I made a mistake...

I would recommend opening a new issue and making reference to the other issue - don't re-open an older one. The older one referred to a different version of DB4S, plus it will notify the original poster, when really you need your own shiny new issue that belongs just to you. Open it up, as it might highlight something wrong with DB4S that can be fixed.
There is already an open issue where the number of rows visible in the grid is limited - possibly due to the control we use in DB4S. This comes back to my (waaaaaaaay too long) point above that displaying a large number of results is a nuts thing to do anyway.

jessie-github
@jessie-github
This is a valid question. The database is a dump of important information that we want preserved and available for query (i.e. an archive) and/or to be used as an input for other tools (i.e. an alternative to CSV files). However, in some cases we want to be able to query the database (it is indexed, of course) to look at a particular record. For instance, if we have a bug somewhere, we may want to look at the record and its surroundings.
Perhaps this is an edge case, but I am trying to promote SQLite over CSV files (for large exchange files), and people need to have a look at the data before it is processed.
(For instance, in Emacs I can open a 100 GB CSV file without loading the complete file in memory, so this is instantaneous.)
Chris Locke
@chrisjlocke
My point is (or was...): is someone going to load a grid with twenty million records and scroll through the lot? Or are they going to narrow it down into chunks?
You can't compare DB4S to emacs - it's like my example above - Word wouldn't open a 90,000 page document.
An SQLite database is a million times better than a CSV for querying - a database is built for querying, while a CSV isn't.
The 'browse data' tab is designed for small tables as it loads every record and relies on a filter. What would be better (for your example) is a grid asking 'what do you want to look at?' (eg, a numeric range or date range) so you can scroll through that. Again, a subset, rather than the complete table.
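A sketch of that kind of subset query, with purely illustrative names and a date range as the filter:

-- show only the slice the user asked for, not the whole table
SELECT *
FROM events
WHERE event_date BETWEEN '2021-01-01' AND '2021-01-31'
ORDER BY event_date
LIMIT 1000;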
jessie-github
@jessie-github
Indeed, and this is what I am often doing as well. Sometimes I just want to browse through the first hundred lines to see what the content is, and other times I want to filter so that I only have the rows corresponding to the criteria. The problem here is that the GUI is frozen and I cannot enter any filter at the top of the column.
I am using (abusing?) it a bit like I would use Excel to browse/filter/display a table with a lot of data.
For debugging it is really useful. When manipulating large CSV files (as input for other tools), it is really useful to be able to open the file (without loading the entire file in memory) and explore the content a bit to see how to parse it (for instance, to check that the dates are in the standard YYYY-MM-DD format instead of MM/DD/YYYY).
Similarly for SQLite files, I wanted to have a peek without having to (manually) type a query to display the first hundred lines.
jessie-github
@jessie-github
But if DB Browser for SQLite is really built in a way to forcibly load the entire database when using the "Browse Data" tab, no problem, I'll try to find another way. I just thought that it would be useful to offer the possibility to load in chunks (like the Emacs vlf package), but I can understand that it is not a feature which is often requested.
Chris Locke
@chrisjlocke

But if DB Browser for SQLite is really built in a way to forcibly load the entire database

Not intentionally, no - there is a lot of work to make it as smooth as possible (the reason for the 'prefetch block size'), but the first thing it has to do is work out how many rows the table has, so it does a 'select count(*) from table', and it's this that can hammer large databases. Even with an index, a 100 GB database is going to take time to get this figure.