    Joe Koberg
    I can't seem to get a clean run without some error like the following popping up, but it proceeds after that point so I assume it retries
    debugger invoked on a PGLOADER.CONNECTION:DB-CONNECTION-ERROR in thread    
    #<THREAD "lparallel" RUNNING {10064C0FD3}>:
      Failed to connect to mysql at ... The condition Socket error in "connect": EINTR (Interrupted system call) occurred with errno: 0.
    The current thread is not at the foreground,
    SB-THREAD:RELEASE-FOREGROUND has to be called in #<SB-THREAD:THREAD "main thread" RUNNING {10005284C3}>
    for this thread to enter the debugger.
    Joe Koberg
    well, this time it went through. thanks anyway!
    Jérémy Leherpeur
    Is it possible to install version 3.6.1 on Ubuntu without compiling?
    Christoph Berg
    @amenophis: 3.6.1 builds on eoan only; you can install it from apt.postgresql.org (and on focal once we start supporting that)
    Jérémy Leherpeur
    @df7cb thanks
    I compiled it with a patch for PG12 on each machine that needs pgloader ;)
    Jérémy Leherpeur

    I tried to run this command

    ./pgloader/build/bin/pgloader -v \
    --context ./pgloader.ini \
    --summary ./summary.log \
    file1.load \
    file2.load \

    The issue is that the summary.log file contains only the summary for file3.
    Did I miss something?
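A hedged workaround, assuming a single pgloader run keeps only the last file's summary when several .load files are passed at once: invoke pgloader once per load file, each with its own --summary path (file names here are placeholders taken from the command above):

```shell
# Sketch, not a confirmed fix: one pgloader invocation per load file,
# so each run writes a separate summary instead of overwriting one log.
for f in file1.load file2.load; do
  ./pgloader/build/bin/pgloader -v \
    --context ./pgloader.ini \
    --summary "summary-${f%.load}.log" \
    "$f"
done
```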

    Emre Küçük
    sb-impl::default-external-format :UTF-8
    tmpdir: #P"/tmp/pgloader/"
    2020-02-28T14:08:08.031000Z NOTICE Starting pgloader, log system is ready.
    2020-02-28T14:08:08.055000Z INFO Starting monitor
    2020-02-28T14:08:08.063000Z INFO Stopping monitor
    I am executing the pgloader command, but no table is created in the PostgreSQL database,
    and I get the above logs.
    I am converting a legacy database (dBASE III) with IBM (German) encoding using:

    FROM 'daten.dbf' WITH ENCODING CP850
    INTO postgresql:///staging
    WITH truncate, create table

    and it works well with only a small glitch. Besides the table 'public.daten', I also get a schema 'daten' with no dependencies at all.
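If that extra schema really has no dependent objects, one possible cleanup (an assumption, assuming the target database is 'staging' as in the connection string above) is to drop it after the load:

```shell
# RESTRICT (the default) makes DROP SCHEMA fail if anything still lives
# in the schema, so this is safe to try against the stray empty schema.
psql staging -c 'DROP SCHEMA IF EXISTS daten RESTRICT;'
```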

    has the --config option been deprecated?
    I am trying to use a .load file, but every time I run pgloader csv-writer.load I get an error saying: No such file or directory
    I am using pgloader version "3.6.2"
    We are having an issue where we have some images stored in a database that we're moving from MSSQL into pg, and the fields storing the image data seem to be truncating at 4096 bytes. Any ideas if there is an option we need to set to allow larger chunks of data to pass through?
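One thing worth checking (a guess, not a confirmed fix): FreeTDS caps how much of a text/image column it returns via the "text size" setting in freetds.conf, and data beyond that limit is silently truncated. Raising it would look something like:

```
; freetds.conf -- hedged guess at the truncation cause:
; "text size" limits the bytes returned for text/image columns.
[global]
    text size = 64512
```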
    Hello everyone and hello @dimitri
    I have a question about pgloader. I want to import a table from MSSQL database to a PGSQL.
    I use pgloader version 3.6.3~devel from the docker image dimitri/pgloader:latest, with a modified freetds conf as explained in the pgloader readthedocs
    I launch pgloader like this
    According to the debug log, it creates the table and reads the source data, but imports nothing.
    In the log I found
    Finished processing WRITER for "dbo.csection" 31.060000s
    writers-counts[dbo.csection] = 0
    The summary says it read x rows but imported 0.
    It may be difficult for you to help me with these details, but if you have any clue it would be great!
    It may also be a common issue for which I haven't found a solution.
    Thanks in advance
    Hi, is it possible to limit the number of copied rows?

    Hello everyone,
    I have a question about an issue I am facing. I am very new to pgloader, so any input will really help me.

    1) I am getting the error "the octet sequence #(195) cannot be decoded." It is because of the string in the source file: "Women’s, Girls’, and Infants’ Cut and Sew Apparel Manufacturing". Is there a way to handle this in the load file and still load the data, without replacing the character in the source file?

    2) Also, the file has 1000 rows and this error happened on the 425th row. When the error happened, it just stopped processing and did not continue. Is it possible to force-ignore the bad record and continue to the next row?

    My Load file below

    FROM INLINE with encoding 'ascii'
    Name, Value
    INTO postgresql://user:pwd@server:port/db?schema.table
    Name, Value

    WITH skip header = 1,
          batch rows = 200,
          prefetch rows = 200,
          batch size = 1024 kB,
          batch concurrency = 3,
          fields escaped by '\',
          fields terminated by '|'
    SET client_encoding to 'utf8',
        standard_conforming_strings to 'on'

    drop table if exists schema.table;
    drop INDEX if exists schema.table_idx;
    create table if not exists schema.table ( Name varchar(80), Value varchar(255) );

    029327952|"Women’s, Girls’, and Infants’ Cut and Sew Apparel Manufacturing "
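For what it's worth, the decode error above is consistent with a UTF-8 file being read under the declared 'ascii' encoding: any byte above 127, such as 0xC3 (decimal 195, a common first byte of two-byte UTF-8 sequences), is invalid ASCII. A minimal Python illustration of the mismatch:

```python
# 'café' encoded as UTF-8 contains the byte 0xC3 (decimal 195).
raw = "café".encode("utf-8")
assert 0xC3 in raw

# Decoding it as ASCII fails, mirroring pgloader's
# "the octet sequence #(195) cannot be decoded."
try:
    raw.decode("ascii")
except UnicodeDecodeError:
    pass  # expected: byte 0xc3 is not valid ASCII

# Declaring the real encoding makes the same bytes decode cleanly.
assert raw.decode("utf-8") == "café"
```

If the source file is in fact UTF-8, declaring 'utf-8' instead of 'ascii' in the load file should avoid the error entirely.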

    Rifhan Akram

    Hello everyone,

    I'm trying a migration from MSSQL to PostgreSQL. I'm getting an error as in issue dimitri/pgloader#1018. I noticed the fix is in v3.6.2, but when I do apt-get install on Debian I get v3.6.1.

    Am I missing something? Could someone please help me with this? Should I build from source? :)

    Update:
    I was able to build version 3.6.2 from source and that resolved the issue. I'm not sure if 3.6.2 should be available in the apt repository, or if it's intended not to have it?

    mase - meat popsicle
    Does anyone know if in a command file, I am able to choose which src database columns are included ?

    @rifhanakram I'm in the process of trying to do the same. :)

    New versions of software typically won't appear in APT until the maintainer pulls in the change

    mase - meat popsicle
    Hey all, using the latest docker image of pgloader drops me into ldb.
    I am not really sure how to proceed from here.
    Can anyone give any guidance?
    mase - meat popsicle
    @mcgri Yes, I'm currently battling another issue, but I got past that stage
    is it possible for me to choose which mssql (src) columns I want to include ?
    mase - meat popsicle
    anyone.... ?
    Dario Frongillo
    Hi all
    @dimitri I love your postgres book :D
    hey guys.. I'm reading a CSV field that has the format 999999999D99; it uses ',' as the thousands separator and '.' as the decimal point.
    Is there a way to transform it into a number during a CSV import?
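pgloader does support per-field transformation functions written in Common Lisp; I'm not sure of a built-in for this exact format, but the logic is just stripping the grouping commas before the numeric cast. A sketch in Python (the function name is mine, for illustration only):

```python
def parse_grouped_number(s: str) -> float:
    """Parse e.g. '999,999,999.99' where ',' groups thousands
    and '.' is the decimal point."""
    return float(s.replace(",", ""))

print(parse_grouped_number("999,999,999.99"))
```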
    Dario Frongillo
    hi all, can someone help me with this issue dimitri/pgloader#1192?
    Keerthikan Ratnarajah
    is this active?
    Benjamin David
    Hey! I'm having the error "there is no unique constraint matching given keys for referenced table" when trying to load from SQLite to Postgres,
    but the proper constraint is there in SQLite,
    so I don't understand.
    Benjamin David
    It seems that the primary keys are not migrated,
    hence causing the issues,
    or the unique constraint of the primary key is not passed.
    Is there any way to force that?
    I looked over the whole doc and can't find it.
    Benjamin David
    here's my empty DB
    just migrating the schema fails
    whereas it's pretty straightforward
    Hi, I am trying to load data from MS SQL Server to Postgres. Both passwords have special characters. For Postgres I am using the PGPASSWORD environment variable; what is the environment variable for the MSSQL Server password? I tried PWD and PASSWORD, and it does not work.
    Manoj Jadhav

    Hi all,

    I am facing this issue:

    31: ("foreign function: call_into_lisp")
    Can anyone guide me here?