    Brian McMahon
    @bmackattack
    First, happy holidays to all! Has anybody else had an issue keeping default values from the original DB? I've tried casting multiple times, but the default doesn't get built into the DDL in the destination DB. Has anybody else had trouble getting default values carried over into the destination DDL?
    sasaraf
    @sasaraf
    hi, we want to load a 'bytea' column with a hex string; however, instead of inserting 9116452417 we get 3931313645324137. Would you know why? (We are migrating from Oracle to PostgreSQL and converting the 'raw' data type to 'bytea' was suggested; should we define the bytea column differently?)
    sasaraf
    @sasaraf
    we have tried altering the database to set bytea_output to 'hex', but we couldn't get past the above without adding '\x<hexstring>'. Is there a way to prepend a fixed '\x' in the pgloader script?
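    Not from the thread, but one possible direction (a sketch, untested; the file, table, and field names are hypothetical): pgloader's CSV target columns accept a `using` expression, which is evaluated as a Common Lisp form, so a literal '\x' prefix could be added there before PostgreSQL parses the value as hex bytea:

```
LOAD CSV
     FROM 'raw_dump.csv' ( id, rawhex )
     INTO postgresql:///target?public.t
          (
            id,
            -- prepend the literal \x so PostgreSQL reads the text as hex bytea
            rawhex bytea using (concatenate 'string "\\x" rawhex)
          )
     WITH fields terminated by ',';
```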
    Max
    @Bonn93
    @dimitri I commented on #998 on Github, which possibly solves a few SSL errors I had, wondering your thoughts on adding the extra dev libs to the build?
    Joe Koberg
    @jkoberg
    Hello folks. Seems like I can't get pgloader to finish without hanging at the end of the process
    I am copying a mysql DB to a local postgres DB, and everything seems to go great, but then all CPU and network activity stops and nothing seems to happen
    I can't seem to get a clean run without some error like the following popping up, but it proceeds after that point so I assume it retries
    debugger invoked on a PGLOADER.CONNECTION:DB-CONNECTION-ERROR in thread    
    #<THREAD "lparallel" RUNNING {10064C0FD3}>:
      Failed to connect to mysql at ... The condition Socket error in "connect": EINTR (Interrupted system call) occurred with errno: 0.
    
    The current thread is not at the foreground,
    SB-THREAD:RELEASE-FOREGROUND has to be called in #<SB-THREAD:THREAD "main thread" RUNNING {10005284C3}>
    for this thread to enter the debugger.
    Joe Koberg
    @jkoberg
    well, this time it went through. thanks anyway!
    Jérémy Leherpeur
    @amenophis
    Hi,
    Is it possible to install version 3.6.1 on Ubuntu without compiling?
    Christoph Berg
    @df7cb
    @amenophis: 3.6.1 builds on eoan only, you can install it from apt.postgresql.org (and on focal once we start supporting that)
    Jérémy Leherpeur
    @amenophis
    @df7cb thanks
    I compiled it with a patch for PG12 on each machine that needs pgloader ;)
    Jérémy Leherpeur
    @amenophis

    Hi,
    I tried to run this command

    ./pgloader/build/bin/pgloader -v \
    --context ./pgloader.ini \
    --summary ./summary.log \
    file1.load \
    file2.load \
    file3.load

    The issue is that summary.log contains only the summary for file3.
    Did I miss something?

    Emre Küçük
    @emrephi
    sb-impl::default-external-format :UTF-8
    tmpdir: #P"/tmp/pgloader/"
    2020-02-28T14:08:08.031000Z NOTICE Starting pgloader, log system is ready.
    2020-02-28T14:08:08.055000Z INFO Starting monitor
    2020-02-28T14:08:08.063000Z INFO Stopping monitor
    I am executing the pgloader command and no table is created in the PostgreSQL database
    and I get the above logs
    AndreasLuka
    @AndreasLuka
    I am converting a legacy database (dBASE III) with IBM (German) encoding, using:
    AndreasLuka
    @AndreasLuka

    LOAD DBF
    FROM 'daten.dbf' WITH ENCODING CP850
    INTO postgresql:///staging
    WITH truncate, create table

    and it works well, with only a small glitch: besides the table 'public.daten', I also got a schema 'daten' with no dependencies at all.

    iff133
    @iff133
    has the --config option been deprecated?
    I am trying to use a .load file, but every time I run pgloader csv-writer.load I get an error saying: No such file or directory
    I am using pgloader version "3.6.2"
    Jame
    @jamemackson
    we are having an issue where some images are stored in a database we're moving from MSSQL into PG, and the fields storing the image data seem to be truncated at 4096 bytes. Any ideas if there is an option we need to set to allow larger chunks of data to pass through?
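    Not stated in the thread, but a plausible suspect (an assumption worth checking): FreeTDS, which pgloader uses to talk to MSSQL, caps TEXT/IMAGE data via its text size setting, and 4096 is a classic truncation point. The cap can be raised in freetds.conf, for example:

```
[global]
    # raise the TEXT/IMAGE transfer cap (in bytes); a small default
    # is a well-known cause of blobs truncating at 4096
    text size = 20971520
```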
    utbdef
    @utbdef
    Hello everyone and hello @dimitri
    I have a question about pgloader. I want to import a table from MSSQL database to a PGSQL.
    I use pgloader version 3.6.3~devel from the docker image dimitri/pgloader:latest with a modified freetds conf, as explained in the pgloader readthedocs
    I launch pgloader like this
    LOAD DATABASE
    FROM {{DB_SOURCE}}
    INTO {{DB_TARGET}}
    INCLUDING ONLY TABLE NAMES LIKE '{{TABLE_NAME}}' IN SCHEMA 'dbo';
    According to the debug log, it creates the table and reads the source data, but imports nothing
    In the log i found
    Finished processing WRITER for "dbo.csection" 31.060000s
    writers-counts[dbo.csection] = 0
    The summary says it read x rows but imported 0.
    It may be difficult for you to help me with only these details, but if you have any clue it would be great!
    It may also be a common issue for which I just haven't found the solution
    Thanks in advance
    yacovg
    @yacovg
    Hi, is it possible to limit the number of copied rows?
    Karthick
    @karthickcs06

    Hello everyone,
    I have a question about an issue I am facing. I am very new to pgloader, so any input will really help me.

    1) I'm getting the error "the octet sequence #(195) cannot be decoded." It is caused by this string in the source file: "Women’s, Girls’, and Infants’ Cut and Sew Apparel Manufacturing ". Is there a way to handle this in the load file and load the data without replacing the character in the source file?

    2) Also, the file has 1000 rows and the error happened on the 425th row. When the error happened, processing just stopped and did not continue. Is it possible to ignore the bad record and continue with the next row?

    My Load file below

    LOAD CSV
    FROM INLINE with encoding 'ascii'
    (
    Name, Value
    )
    INTO postgresql://user:pwd@server:port/db?schema.table
    (
    Name, Value
    )

    WITH skip header = 1,
          batch rows = 200,
          prefetch rows = 200,
          batch size = 1024 kB,
          batch concurrency = 3,
          fields escaped by '\',
          fields terminated by '|'
    
    SET client_encoding to 'utf8',
        standard_conforming_strings to 'on'

    BEFORE LOAD DO
    $$ drop table if exists schema.table; $$,
    $$ drop INDEX if exists schema.table_idx; $$,
    $$ create table if not exists schema.table ( Name varchar(80), Value varchar(255) ); $$

    029327952|"Women’s, Girls’, and Infants’ Cut and Sew Apparel Manufacturing "
    029327962|TEST
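    A possible revision of the load file above (a sketch, untested): the sample data contains ’, which is not ASCII, so declaring the file's real encoding (likely utf-8) avoids the decode error, and pgloader's documented on error resume next option lets the load continue past rejected rows:

```
LOAD CSV
     FROM INLINE with encoding 'utf-8'
          ( Name, Value )
     INTO postgresql://user:pwd@server:port/db?schema.table
          ( Name, Value )
     WITH skip header = 1,
          fields escaped by '\',
          fields terminated by '|',
          on error resume next
     SET client_encoding to 'utf8';
```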

    Rifhan Akram
    @rifhanakram

    Hello everyone,

    I'm trying a migration from MSSQL to PostgreSQL. I'm getting the error described in dimitri/pgloader#1018. I noticed the fix is in v3.6.2, but when I do apt-get install on Debian I get v3.6.1.

    Am I missing something? Could someone please help me with this? Should I build from source? :)

    Rifhan Akram
    @rifhanakram

    Update: I was able to build version 3.6.2 from source and that resolved the issue. I'm not sure whether 3.6.2 should be available in the apt repository or whether it's intentionally not there.

    mase - meat popsicle
    @mase_nocturnal_twitter
    Does anyone know if, in a command file, I am able to choose which source database columns are included?

    @rifhanakram I'm in the process of trying to do the same. :)

    New versions of software typically won't appear in APT until the maintainer pulls in the change
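    On the column-selection question: pgloader does not document a per-column include list for database sources, but its MATERIALIZE VIEWS clause (documented for MySQL sources; whether the MSSQL source supports it too is an assumption to verify) migrates a view that projects only the wanted columns. All names below are hypothetical:

```
LOAD DATABASE
     FROM mssql://user:pass@host/source_db
     INTO postgresql:///target_db

     MATERIALIZE VIEWS v_orders AS
     $$ select id, customer_id, total from dbo.orders $$;
```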

    mase - meat popsicle
    @mase_nocturnal_twitter
    [image attachment: image.png]
    Hey all, using the latest docker image of pgloader drops me into ldb
    I am not really sure how to proceed from here
    can anyone give any guidance?
    mase - meat popsicle
    @mase_nocturnal_twitter
    @mcgri Yes, I'm currently battling another issue, but I got past that stage
    is it possible for me to choose which mssql (src) columns I want to include ?
    mase - meat popsicle
    @mase_nocturnal_twitter
    anyone.... ?
    Dario Frongillo
    @esoni
    Hi all
    @dimitri I love your Postgres book :D
    hey guys, I'm reading a CSV field that has the format 999999999D99; it uses ',' as the thousands separator and '.' as the decimal point
    is there a way to transform it into a number during a CSV import?
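    One way this might be done (a sketch, untested; field and table names are hypothetical): strip the ',' thousands separators in a using clause, which pgloader evaluates as a Common Lisp form, so PostgreSQL can parse what remains as numeric:

```
LOAD CSV
     FROM 'numbers.csv' ( id, amount )
     INTO postgresql:///target?public.t
          (
            id,
            -- (remove #\, amount) deletes every ',' character,
            -- turning "1,234,567.89" into "1234567.89"
            amount numeric using (remove #\, amount)
          )
     WITH fields terminated by ';';
```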
    Dario Frongillo
    @esoni
    hi all, can someone help me with this issue: dimitri/pgloader#1192?
    Keerthikan Ratnarajah
    @Keerthikan
    is this active?
    Benjamin David
    @benmod_gitlab
    Hey! I'm getting the error "there is no unique constraint matching given keys for referenced table" when trying to load from SQLite to Postgres
    but the proper constraint is there in SQLite
    so I don't understand
    Benjamin David
    @benmod_gitlab
    It seems that the primary keys are not migrated
    hence causing the issue
    or the unique constraint of the primary key is not carried over
    is there any way to force that?
    I looked over the whole doc and can't find it
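    If the primary keys genuinely aren't carried over, one workaround (a sketch; the table and column names are hypothetical) is to add them back in an AFTER LOAD DO section:

```
LOAD DATABASE
     FROM sqlite:///path/to/source.db
     INTO postgresql:///target

     AFTER LOAD DO
     $$ alter table some_table add primary key (id); $$;
```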
    Benjamin David
    @benmod_gitlab