Nicolas CARPi
@NicolasCARPi
we need to show them the light!
alexander-haller
@alexander-haller
:smile: :+1:
Nicolas CARPi
@NicolasCARPi
this could be cool to add: https://github.com/staabm/phpstan-dba
BTW: current state of the APIV2 branch vs hypernext: 243 files changed, 4264 insertions(+), 6300 deletions(-)
nmontua
@nmontua
I have a bit of a problem using Borgbackup: everything appears to be configured correctly, no error in the terminal or anything. Borg list also shows me that I do have a backup. It is just that elabctl backup only creates the SQL dump but not the uploaded files zip archive anymore :(. Or does the backup process work fundamentally differently now?
alexander-haller
@alexander-haller
@nmontua your second guess - the backup process works differently now. It uses incremental backups with borg instead of full backups with zips.
alexander-haller
@alexander-haller
just verify that the borg backups contain your newly uploaded files and you should be good
nmontua
@nmontua
@alexander-haller Thank you for reassuring me. I guess I will have to familiarize myself with borg a little better. If the Elabftw docs could reflect the changes in more detail, it would be greatly appreciated. Incremental backups are a great feature though!
paullysa
@paullysa
Hi :) I've heard that you're working on a new API version. Will it be possible to change next_step and locked status of items and add them to blockchain via REST? Kind regards
Nicolas CARPi
@NicolasCARPi
@paullysa yes. You can PATCH api/v2/experiments/257/steps/53 with an action: finish in the request body to toggle the finished state of a step. You can PATCH /items/432 with an action: lock to toggle the lock, and you can PATCH with an action: bloxberg to send it to the blockchain. Basically the web UI now uses the API directly, so 99% of the actions that can be done through the web interface can also be done through the REST API, because it's using the same endpoints ;)
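For illustration, a minimal sketch of what such calls could look like from Python with the requests library. The base URL, API key, header name and payload shape are assumptions; check the APIv2 documentation for your instance before relying on them.

```python
import requests

# Assumptions: placeholder server URL and API key; header name and payload
# shape should be verified against the APIv2 docs.
BASE = "https://elab.example.org/api/v2"
HEADERS = {"Authorization": "YOUR_API_KEY", "Content-Type": "application/json"}

# Toggle the "finished" state of step 53 of experiment 257
requests.patch(f"{BASE}/experiments/257/steps/53", headers=HEADERS,
               json={"action": "finish"})

# Toggle the lock on item 432
requests.patch(f"{BASE}/items/432", headers=HEADERS, json={"action": "lock"})

# Send item 432 to the bloxberg blockchain
requests.patch(f"{BASE}/items/432", headers=HEADERS, json={"action": "bloxberg"})
```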
paullysa
@paullysa
sounds great, thank you for the quick reply!
trineldave
@trineldave
Hello everyone, I have installed elabftw on a local server, and I had to point to the certificate path by changing the /ssl line in the YAML. The tutorial says to run elabctl restart: do I have to stop the container first, or is restarting while the container is running enough for the modifications to be taken into account? Thanks a lot, I'm not an expert in docker ;)
Nicolas CARPi
@NicolasCARPi
@trineldave elabctl restart will take care of stopping containers for you, no worries.
mbqb
@mbqb
Hi everyone! I am trying to set up eLab for a research lab with different sites. We would like to share some database entries between sites (which means we want to link the same database entry in experiments on each site). At the moment we decided to model a site as a 'team'.
Now it's possible to see details of attached database entries - but it's neither possible to see the database entries from the other team in the database section nor to link them to an experiment.
Is this a general design decision or did I do something wrong in the setup?
Nicolas CARPi
@NicolasCARPi
Users need to check the box in ucp to see external items.
mbqb
@mbqb
@NicolasCARPi kudos!
trineldave
@trineldave
@NicolasCARPi OK thanks ;)
besselfunct
@besselfunct
Is it possible to get more detail on setting up SMTP in the install guide? I'm running into errors when I try to send the test email even though I've got SMTP2GO set up. I'm fairly certain that the issue is the Sender Address: field, but that's not covered in the install guide
(I'm sure this is probably assumed knowledge for most sysadmins but it's new to me)
There could also be issues related to the University firewall (in fact there are almost certainly issues related to that), but I'm not certain without a bit more detail on the expected config
Nicolas CARPi
@NicolasCARPi
@besselfunct your first step must be to look at the error in the log instead of guessing what could be the issue. See https://doc.elabftw.net/debug.html
sysopls
@sysopls
Hello all! I have a problem updating elab to the latest version.
We are running elabftw in Docker on Ubuntu. I did the update via elabctl update, and it completed successfully. After that, we needed to add the SITE_URL variable to the docker yml so that the web UI works again.
The web UI shows that we have to run the "bin/console db:update" command to finish the update, but when we run this command on the Ubuntu CLI, we get the following permission denied error:
"1227 Access denied; you need (at least one of) the SUPER, SYSTEM_VARIABLES_ADMIN or SESSION_VARIABLES_ADMIN privilege(s) for this operation."
Does anyone know what's going on here? Everything is set up with the default config.
sakuraabcdef
@sakuraabcdef
I upgraded elabftw from ver 4.2.4 to 4.3.10 on Ubuntu 18.04. In this new version (not in the old version), attached files with Japanese filenames are downloaded with the Japanese characters stripped from the name. For example, the filename is changed to "protocol_.docx" (where "_" should be Japanese characters). What's wrong? How can I solve the problem?
alexander-haller
@alexander-haller
@sysopls I think you run into this: https://github.com/elabftw/elabftw/discussions/3592#discussioncomment-3268365 try the solution and report back
Nicolas CARPi
@NicolasCARPi
@sakuraabcdef this is because the filename fallback has to contain only ASCII characters. If you want to download with the original name you'll need to export as a zip. In the zip the file will have the correct characters.
@sakuraabcdef Wait, I figured something out. I was not using the filename fallback: only the fallback must be ASCII! So now the correct filename will be downloaded, and only the fallback filename is in ASCII!
besselfunct
@besselfunct
Regarding the error logs: I've switched to trying to use the internal SMTP server that the university runs. I get the following (anonymized) error:
2022/09/12 16:57:05 [error] 133#133: *4796 FastCGI sent in stderr: "PHP message: [2022-09-12T18:57:05.871483+02:00] elabftw.ERROR: {"exception":"[object] (Symfony\\Component\\Mailer\\Exception\\TransportException(code: 0): Unable to connect with STARTTLS: stream_socket_enable_crypto(): Peer certificate CN=`*.pphosted.com' did not match expected CN=`relay-hostname' at /elabftw/vendor/symfony/mailer/Transport/Smtp/Stream/SocketStream.php:171)"} []" while reading response header from upstream, client: IPADDRESS, server: hostname, request: "POST /app/controllers/SysconfigAjaxController.php HTTP/2.0", upstream: "fastcgi://unix:/run/php-fpm.sock:", host: "hostname"
The SMTP server the university uses for this instance is supposed to allow unauthenticated mail sending from internal IP addresses
besselfunct
@besselfunct
From what I can see, elab is trying to verify the cert of the SMTP relay, but it's not getting the appropriate response. Is there a way to do an unauthenticated send?
Nicolas CARPi
@NicolasCARPi
I don't think so. It's bad practice anyway so this won't be fixed.
besselfunct
@besselfunct
OK. I'm investigating other options on this front within the University network. Hopefully I'll find something that works.
besselfunct
@besselfunct
I managed to figure something out. The University was able to make an authorized SMTP account just for the server
Nicolas CARPi
@NicolasCARPi
great :)
noblelabs1
@noblelabs1
Having trouble with letsencrypt and the yml config file. Now I get this error in the log:

2022/09/22 06:14:41 [emerg] 1462#1462: cannot load certificate "/ssl/live/lab-notebook.nobles.app/fullchain.pem": BIO_new_file() failed (SSL: error:02001002:system library:fopen:No such file or directory:fopen('/ssl/live/lab-notebook.nobles.app/fullchain.pem','r') error:2006D080:BIO routines:BIO_new_file:no such file)
nginx: [emerg] cannot load certificate "/ssl/live/lab-notebook.nobles.app/fullchain.pem": BIO_new_file() failed (SSL: error:02001002:system library:fopen:No such file or directory:fopen('/ssl/live/lab-notebook.nobles.app/fullchain.pem','r') error:2006D080:BIO routines:BIO_new_file:no such file)
2022/09/22 06:14:42 [emerg] 1464#1464: cannot load certificate "/ssl/live/lab-notebook.nobles.app/fullchain.pem": BIO_new_file() failed (SSL: error:02001002:system library:fopen:No such file or directory:fopen('/ssl/live/lab-notebook.nobles.app/fullchain.pem','r') error:2006D080:BIO routines:BIO_new_file:no such file)

Please help
noblelabs1
@noblelabs1
here is the yml config line for the path
  • /etc/letsencrypt/live/lab-notebook.nobles.app:/ssl
@noblelabs1 use a bind mount /etc/letsencrypt:/ssl
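For reference, a sketch of how that volume line could look in docker-compose.yml; the service name and surrounding keys are assumptions, only the /etc/letsencrypt:/ssl mapping comes from the advice above. Mounting the whole /etc/letsencrypt directory (rather than the live/ subfolder) matters because the files under live/ are typically symlinks into /etc/letsencrypt/archive, which otherwise don't resolve inside the container.

```yaml
# Sketch only: adapt the service name to your own docker-compose.yml
services:
  web:
    volumes:
      - /etc/letsencrypt:/ssl
```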
besselfunct
@besselfunct
@NicolasCARPi I'm finally at a point where I want to start using the server and pushing data to it, but I'm running into some organizational snags.
  1. Is it possible to tag uploaded files? I see that we can make comments, but is it possible to associate a file with a given type of information, say a UV-Vis spectrum for instance?
  2. Does information have to be within the extra_fields parameter of a JSON in order to be searchable with quick search? We'd like to push metadata to the server for each experiment entry in such a way that it will be searchable, but it's not necessary that it be editable by the end user.
  3. Is there a way to run search queries from the python api? I checked the docs and it seems like there should be, but when I check the function definitions, the get_experiment() function doesn't seem to be able to take a search argument
Nicolas CARPi
@NicolasCARPi
@besselfunct
  1. no but it is an interesting idea.
  2. The quicksearch doesn't search in the metadata column. Searching JSON with mysql is not straightforward. Only on the search page can you search for key/value in extra_fields, and yes, it needs to be in extra_fields.
  3. Yes, look at the search parameter in Query parameters https://doc.elabftw.net/api/#api-Entity-GetExperiment
besselfunct
@besselfunct
Hey @NicolasCARPi thanks for the response.
Regarding 1, it would make a lot of what I'm trying to do with elab a lot easier if this was implemented, but I think I might be able to find a workaround
  1. If you can't search JSON, is it possible to add our own mysql columns/data for a given experiment type?
  2. I've tried a couple of different syntax variations, but I still can't determine what's meant by the query parameters. Would you be able to provide a syntax example? I tried the following:
    exp.search("string") exp = manager.get_experiment(id=3, search="string") exp = manager.get_experiment(3, search="string")
Nicolas CARPi
@NicolasCARPi
  1. Absolutely not. Letting users mess with the database structure is a recipe for disaster. The better approach would be to add the metadata in the searched columns, but that doesn't seem straightforward to do: https://stackoverflow.com/questions/56753870/mysql-full-text-search-on-extracted-json-column
  2. I'm not sure you can do it with elabapy. But it isn't difficult to bypass it and directly use the requests lib to query the endpoint. After all, it really is just a simple GET request with the token as the Authorization header.
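A minimal sketch of that direct approach with the requests library, using the search query parameter from the GetExperiment docs linked earlier; the server URL, API key and response fields are placeholders/assumptions.

```python
import requests

# Assumptions: placeholder URL and API key; the endpoint is assumed to return
# a JSON list of experiments (see the API docs linked above).
BASE = "https://elab.example.org/api/v1"
HEADERS = {"Authorization": "YOUR_API_KEY"}

resp = requests.get(f"{BASE}/experiments/", headers=HEADERS,
                    params={"search": "string"})
resp.raise_for_status()
for exp in resp.json():
    print(exp.get("id"), exp.get("title"))
```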
PoRayChen
@PoRayChen
Hello, I am trying to run elabftw in docker on Windows, but when I try to run the container there is an error message saying "Could not find mandatory site_url variable! Please have a look at the changelog." It takes me to version 4.3.0, which says something about editing the configuration file (/etc/elabftw.yml); however, I can't find the configuration file.
Nicolas CARPi
@NicolasCARPi
@PoRayChen it's the same file as docker-compose.yml.
besselfunct
@besselfunct
@NicolasCARPi One of the main goals of my thesis project is to auto-generate experimental records from instruments, with rich metadata that we can then go back to and perform large-scale analysis on. To that end, it would be nice for us to be able to execute search queries based on this rich metadata.
For example, say I want all experiments where the material is CdSe and they have a UV-Vis spectrum, and then I want to grab the UV-Vis spectrum and do some data analysis on it to get a high level understanding (extract peak position and FWHM for example).
As I understand it this is only kind of possible with the current scheme, and I have to do it in a somewhat hacky way. I think I would have to add a material parameter, a hasSpectrum parameter, and a filename parameter to extra_fields in order to accomplish this, and then I would have to execute multiple queries, since you can only do exact matches(?) with the extra_fields query.
Is there a better way to do this, or is this the current "best way"?
(also, to be clear, I'm not trying to say, "can you please add feature X so we can do things my way?", I'm just not a developer so I'm not sure what is and isn't possible in this system)
Nicolas CARPi
@NicolasCARPi
Another approach would be to simply use the body, but in a structured way that you define. The body allows full text search, so if you're consistent with how you input data it'll work fine. You could put material:BLAH and hasSpectrum:YES in the body and directly search for that.
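A small sketch of that convention, writing self-defined key:value pairs into the body so they stay full-text searchable. The elabapy Manager/post_experiment call and its parameters are assumptions taken from the elabapy documentation; verify them against the version you have installed.

```python
from elabapy import Manager

# Assumptions: Manager(endpoint=..., token=...) and post_experiment(id, params)
# as documented for elabapy; endpoint, token and experiment id are placeholders.
manager = Manager(endpoint="https://elab.example.org/api/v1/",
                  token="YOUR_API_KEY")

# Structured, self-defined key:value lines following the suggestion above.
metadata = {"material": "CdSe", "hasSpectrum": "YES"}
body = "<br>".join(f"{k}:{v}" for k, v in metadata.items())

manager.post_experiment(3, {"body": body})
# Later, a search for "material:CdSe" will find this entry via full text search.
```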
sakuraabcdef
@sakuraabcdef
@NicolasCARPi Thank you. I replaced elabftw/src/controllers/DownloadController.php with the new one and restarted the docker container. I can download the files with Japanese filenames.
garrettcurley
@garrettcurley
@besselfunct @NicolasCARPi Regarding recording data from instruments and rich metadata, I've come to the conclusion that, for our experimental purposes at least, a tabular format is more appropriate/intuitive than the notebook format of eLabFTW. The combination of notebook entries + spreadsheet recording of parameters and data is how most experimental research is done in labs without electronic notebooks, so combining eLabFTW with a tabular application makes sense for me at the moment. Spreadsheets are cumbersome, however, so we are using a database application developed by a colleague in Russia that allows for easy filtering and linking to data (spectra, etc). As much as I would love for eLabFTW to be able to handle this and have just one application to deal with, I haven't found a workflow that makes it possible. Using templates with fields in the 'body' or 'extra_fields', combined with API interfacing, could probably solve this, but I haven't had time to test it and I think it would need quite a lot of time.
So we are using eLabFTW for general notes (setup, modifications, observations) and manually referencing where in our database software we record the metadata and instrument data. I hope to eventually use the API to better integrate both approaches in our workflow. I hope this makes sense and that I'm not coming across as too critical/dismissive of eLab's capabilities.
besselfunct
@besselfunct
@garrettcurley I'm interested in the approach you're talking about, but I'm also not sure how easy it would be to integrate into my current model. One of the main reasons that I want to switch to elabFTW with automation is to eliminate issues caused by..."non-compliance" within the laboratory. So my current vision is to use NodeRED for instrument control/UI, with several other applications on the back end, and then push all of the experimental data to a new elabFTW experiment once an experiment_finished parameter evaluates to true. I don't strictly need all of the metadata tagging that I'm talking about, but it would be nice for facilitating ML and trend analysis on our data as no one has really done it very well in our field (datasets are much too small).