    jbwaclawski
    @jbwaclawski
    I'm trying to perform an upsert to MySQL using on_duplicate_key_update... how do I do this for bulk data? Does it accept bulk datasets in the call or do I need to manually loop through everything in the dataset?
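
    (A minimal sketch of the bulk form, not from the thread itself: the MySQL dialect's insert() can be executed with a list of parameter dictionaries for an executemany-style call, and on_duplicate_key_update() can reference the values each row would have inserted. The users table and connection URL below are hypothetical.)

    import sqlalchemy as sa
    from sqlalchemy.dialects.mysql import insert

    # hypothetical table and connection URL, for illustration only
    metadata = sa.MetaData()
    users = sa.Table(
        "users", metadata,
        sa.Column("id", sa.Integer, primary_key=True),
        sa.Column("name", sa.String(50)),
    )
    engine = sa.create_engine("mysql+pymysql://user:password@localhost/mydb")

    rows = [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]

    stmt = insert(users)
    # "inserted" refers to the row values that would have been inserted (VALUES())
    stmt = stmt.on_duplicate_key_update(name=stmt.inserted.name)

    with engine.begin() as conn:
        # passing a list of dicts runs the statement executemany-style,
        # so there is no need to loop over the dataset manually
        conn.execute(stmt, rows)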
    Erwin
    @erwinkinn
    Dear community, could you please help me clarify one thing... Is it possible to load an entire database from YAML using the SQLAthanor lib? I've been through the documentation but I'm still confused. This seems like it should be a basic feature, but the only options I can find are along the lines of "load a model from..." or "update a model from...". Maybe the documentation is just lacking, or I'm missing something...
    IsHaltEchtSo
    @IsHaltEchtSo
    Is it possible to build an undirected many-to-many relationship with only one table and an association table? Like a UserModel where two Users can be friends: if User A is a friend of User B, then User B is also a friend of User A. I can only do it directed - like follower/following - not undirected, as of now... for more info, here's the full issue: https://stackoverflow.com/questions/73099995/undirected-many-to-many-relationship-with-one-table
    Federico Caselli
    @CaselIT
    Not sure. I suggest you open a discussion
    IsHaltEchtSo
    @IsHaltEchtSo
    Gotcha, thanks! :)
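
    (Regarding the undirected friendship question above - not resolved in the thread, but a minimal sketch of one common approach, using hypothetical User / friendship names: a self-referential many-to-many through an association table, kept symmetric by writing both directions of each friendship.)

    from sqlalchemy import Column, ForeignKey, Integer, String, Table
    from sqlalchemy.orm import declarative_base, relationship

    Base = declarative_base()

    # association table linking users to users
    friendship = Table(
        "friendship", Base.metadata,
        Column("user_id", ForeignKey("users.id"), primary_key=True),
        Column("friend_id", ForeignKey("users.id"), primary_key=True),
    )

    class User(Base):
        __tablename__ = "users"
        id = Column(Integer, primary_key=True)
        name = Column(String(50))
        friends = relationship(
            "User",
            secondary=friendship,
            primaryjoin=id == friendship.c.user_id,
            secondaryjoin=id == friendship.c.friend_id,
        )

    def befriend(a, b):
        # store both directions so the relationship reads as undirected
        a.friends.append(b)
        b.friends.append(a)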
    Montana Burr, Software Engineer
    @moonman239_twitter

    So, I have a bit of an interesting problem. For reasons I won't get into here, I want to store binary data in a database. When a client sends me a value, my program should check whether that value, encoded in binary data, is stored in the database.

    Here is somewhat of an MCVE:

    from http.server import BaseHTTPRequestHandler, HTTPServer
    import json
    
    from sqlalchemy import BINARY, Column, create_engine, func, select
    from sqlalchemy.orm import declarative_base, Session
    
    Base = declarative_base()
    engine = create_engine(
        "mysql+pymysql://user:password@localhost/api_database")
    
    
    class Token(Base):
        __tablename__ = "Tokens"
        token = Column(BINARY(64), primary_key=True, nullable=False)
    
    
    Base.metadata.create_all(engine)
    
    with Session(bind=engine) as session:
        # add token
        session.add(Token(token="$2b$2t.e".encode("utf8")))
        session.commit()
    
    
    class Server(BaseHTTPRequestHandler):
    
        def do_POST(self):
            input_length = int(self.headers.get("Content-Length"))
            input = self.rfile.read(input_length)
            input_json = json.loads(input)
            # check if passed token exists in table
            resultset = engine.execute(
                select(func.count(Token.token)).where(
                    Token.token == input_json["token"].encode("utf8")))
            result_count = resultset.first()
            if (result_count[0] > 0):
                # found the token
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(
                    json.dumps({
                        "success": True
                    }).encode("iso-8859-1"))
            else:
                # token not found
                self.send_response(400)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(
                    json.dumps({
                        "success": False
                    }).encode("iso-8859-1"))
    
    
    if __name__ == "__main__":
        web_server = HTTPServer(("0.0.0.0", 8080), Server)
        print("Server started at {0}:{1}".format("0.0.0.0", 8080))
        try:
            web_server.serve_forever()
        except KeyboardInterrupt:
            pass
        web_server.server_close()
        print("Server stopped.")
    Oh, I forgot the problem:
    When the client passes "$2b$2t.e" to the server, the server doesn't see that the value is in the database.
    Alex Grönholm
    @agronholm
    @moonman239_twitter where are you saving the new token in the database?
    I see you do that in one place at the beginning – I suppose you only save a static token in the example?
    so does resultset always come back with a result of 0?
    Montana Burr, Software Engineer
    @moonman239_twitter
    @agronholm yeah, only a static token in the example. It would seem that SQLAlchemy has a hard time finding binary values in databases; I have a more complex program that uses LargeBinary columns and I get the same problem.
    Alex Grönholm
    @agronholm
    sqlalchemy is hardly the issue there
    are you sure the problem isn't that you're using a fixed length binary field but only using a fraction of that space?
    I've never used those column types myself
    Montana Burr, Software Engineer
    @moonman239_twitter
    @agronholm maybe?
    Alex Grönholm
    @agronholm
    have you tried finding the token with a raw SQL query?
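
    (A minimal sketch of such a check, not from the thread itself, reusing the engine and the Tokens table from the example above; it assumes pymysql, whose driver-level paramstyle accepts %(name)s placeholders with a dict.)

    with engine.connect() as conn:
        row = conn.exec_driver_sql(
            "SELECT COUNT(*) FROM Tokens WHERE token = %(tok)s",
            {"tok": "$2b$2t.e".encode("utf8")},
        ).first()
        # a count of 0 here would point at the column/value comparison
        # rather than at SQLAlchemy
        print(row[0])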
    Montana Burr, Software Engineer
    @moonman239_twitter
    So far, the only solution that I know of is to store the token as TEXT instead of BINARY.
    Gord Thompson
    @gordthompson
    @moonman239_twitter re: "It would seem that SQLAlchemy has a hard time finding binary values in databases" — Not as such:
    import sqlalchemy as sa
    
    engine = sa.create_engine("mysql+pymysql://scott:tiger@127.0.0.1:3307/mydb")
    
    table_name = "moonman239"
    
    with engine.begin() as conn:
        conn.exec_driver_sql(f"DROP TABLE IF EXISTS {table_name}")
        conn.exec_driver_sql(f"CREATE TABLE {table_name} (token MEDIUMBLOB)")
        conn.exec_driver_sql(
            f"INSERT INTO {table_name} (token) VALUES (X'476F7264')"
        )
    
    tbl = sa.Table(table_name, sa.MetaData(), autoload_with=engine)
    
    engine.echo = True
    with engine.begin() as conn:
        result = conn.execute(sa.select(tbl).where(tbl.c.token == b"\x47\x6f\x72\x64")).one()
        print(result)  # (b'Gord',)
        """SQL emitted:
        SELECT moonman239.token 
        FROM moonman239 
        WHERE moonman239.token = %(token_1)s
        [generated in 0.00077s] {'token_1': b'Gord'}
        """
    Gord Thompson
    @gordthompson

    @moonman239_twitter - BINARY(64) columns are right-padded with nulls, so you'd need to pad out your search term:

        token = b"\x47\x6f\x72\x64"
        token = token + (b"\00" * (64 - len(token)))
        result = conn.execute(sa.select(tbl).where(tbl.c.token == token)).one()

    VARBINARY might be a better choice.
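
    (For instance, sticking with the Token model from the earlier example, a sketch of the VARBINARY variant:)

    from sqlalchemy import Column, VARBINARY
    from sqlalchemy.orm import declarative_base

    Base = declarative_base()

    class Token(Base):
        __tablename__ = "Tokens"
        # VARBINARY stores the bytes as-is, so there is no right-padding
        # to account for when comparing against the original value
        token = Column(VARBINARY(64), primary_key=True, nullable=False)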

    morenonatural
    @morenonatural:envs.net

    hey, y'all … I'm trying to use the async API, but register_composites() is throwing an error with a postgresql+asyncpg engine

    down the stack, it complains TypeError: AsyncAdapt_asyncpg_connection.cursor() got an unexpected keyword argument 'cursor_factory'

    Montana Burr, Software Engineer
    @moonman239_twitter
    @gordthompson or, alternatively, planning the size a bit better
    morenonatural
    @morenonatural:envs.net
    any tips would help. thanks
    I even tried sqlalchemy_utils.register_composites(conn.sync_connection), with conn.sync_engine.begin() as sync_conn: …, and sqlalchemy_utils.register_composites(await conn.get_raw_connection()), without any luck
    (more or less the same error: the async connection doesn't implement XXX correctly / a sync connection isn't being passed)
    Alex Grönholm
    @agronholm
    @morenonatural:envs.net what code raised that error?
    morenonatural
    @morenonatural:envs.net
    @agronholm:
    the engine was created with create_async_engine(…, future=True); traceback below. I left the filenames from my private repo out
    ….py:110: in create_configuration
        async with app.db.begin() as conn:
    ../../homebrew/Cellar/python@3.10/3.10.4/Frameworks/Python.framework/Versions/3.10/lib/python3.10/contextlib.py:199: in __aenter__
        return await anext(self.gen)
    app/db.py:80: in begin
        await conn.run_sync(sqlalchemy_utils.register_composites)
    ../../../.local/share/virtualenvs/sqlalchemy-asyncio-8jUFBPyQ/lib/python3.10/site-packages/sqlalchemy/ext/asyncio/engine.py:546: in run_sync
        return await greenlet_spawn(fn, conn, *arg, **kw)
    ../../../.local/share/virtualenvs/sqlalchemy-asyncio-8jUFBPyQ/lib/python3.10/site-packages/sqlalchemy/util/_concurrency_py3k.py:115: in greenlet_spawn
        result = context.switch(*args, **kwargs)
    ../../../.local/share/virtualenvs/sqlalchemy-asyncio-8jUFBPyQ/lib/python3.10/site-packages/sqlalchemy_utils/types/pg_composite.py:325: in register_composites
        register_psycopg2_composite(
    ../../../.local/share/virtualenvs/sqlalchemy-asyncio-8jUFBPyQ/lib/python3.10/site-packages/sqlalchemy_utils/types/pg_composite.py:271: in register_psycopg2_composite
        psycopg2.extras.register_composite(
    ../../../.local/share/virtualenvs/sqlalchemy-asyncio-8jUFBPyQ/lib/python3.10/site-packages/psycopg2/extras.py:1136: in register_composite
        caster = factory._from_db(name, conn_or_curs)
    ../../../.local/share/virtualenvs/sqlalchemy-asyncio-8jUFBPyQ/lib/python3.10/site-packages/psycopg2/extras.py:1073: in _from_db
        conn, curs = _solve_conn_curs(conn_or_curs)
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    
    conn_or_curs = <AdaptedConnection <asyncpg.connection.Connection object at 0x1068e6a40>>
    
        def _solve_conn_curs(conn_or_curs):
            """Return the connection and a DBAPI cursor from a connection or cursor."""
            if conn_or_curs is None:
                raise psycopg2.ProgrammingError("no connection or cursor provided")
    
            if hasattr(conn_or_curs, 'execute'):
                conn = conn_or_curs.connection
                curs = conn.cursor(cursor_factory=_cursor)
            else:
                conn = conn_or_curs
    >           curs = conn.cursor(cursor_factory=_cursor)
    E           TypeError: AsyncAdapt_asyncpg_connection.cursor() got an unexpected keyword argument 'cursor_factory'
    
    ../../../.local/share/virtualenvs/sqlalchemy-asyncio-8jUFBPyQ/lib/python3.10/site-packages/psycopg2/extras.py:787: TypeError
    ============================================================================================== short test summary info ==============================================================================================
    ERROR app/catalog_search_service/test/test_config_api.py::test_update_config - TypeError: AsyncAdapt_asyncpg_connection.cursor() got an unexpected keyword argument 'cursor_factory'
    Alex Grönholm
    @agronholm
    why is psycopg2 involved in that traceback?
    morenonatural
    @morenonatural:envs.net
    (or at least I tried to leave private filenames out)
    Alex Grönholm
    @agronholm
    you left out the URL from the command you used to create the async engine
    morenonatural
    @morenonatural:envs.net
    oh, the url is f"postgresql+asyncpg://{db_user}:{db_password}@{db_host}:{db_port}/{db_name}"
    sorry
    this is code that was working before converting everything to async calls
    Alex Grönholm
    @agronholm
    are you sure that register_composites() isn't psycopg2 specific?
    morenonatural
    @morenonatural:envs.net
    it's from sqlalchemy_utils module ... it calls psycopg2 stuff underneath
    Alex Grönholm
    @agronholm
    that doesn't work if your driver is asyncpg and not psycopg2
    morenonatural
    @morenonatural:envs.net
    I'm using postgresql+asyncpg + create_async_engine() and propagated that ... am I missing something?
    (mypy pointed out the incompatibilities and I started by correcting those; this is basically the first run of this code)
    Alex Grönholm
    @agronholm
    as I said, register_composites() seems specific to psycopg2 so it won't work if you use another driver
    morenonatural
    @morenonatural:envs.net
    will look for psycopg2 code in my repo. thx for the pointers
    Alex Grönholm
    @agronholm
    is register_composites() your code?
    or why are you looking for psycopg2 code in your repo?
    morenonatural
    @morenonatural:envs.net
    (about 10 people working in this repo... I'm asking around for any info as nobody covers the whole code)
    morenonatural
    @morenonatural:envs.net

    oh, to anyone interested in sqlalchemy_utils.register_composites() + async

    kvesteri/sqlalchemy-utils#611

    Alex Bender
    @alex-bender
    Hello, is it possible to add some option during Sequence creation so that upon deletion 'drop sequence CASCADE' is issued instead of 'drop sequence' by ddl.compile? The dialect is PostgreSQL.
    I'm getting the error cannot drop sequence sequence_ticket_number because other objects depend on it
    Alex Grönholm
    @agronholm
    I don't think so
    Alex Bender
    @alex-bender
    is there a way to set this cascade behaviour if the Sequence is being used outside of a relationship?
    ticket_number = db.Column(db.Integer, sequence_ticket_number, server_default=sequence_ticket_number.next_value(), autoincrement=True, nullable=False)
    Alex Grönholm
    @agronholm
    can you find such a thing in the documentation? https://www.postgresql.org/docs/current/sql-createsequence.html
    Alex Bender
    @alex-bender
    this is not exactly about setting that in SQL; it's about generating the SQL statement for the deletion
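
    (One possible approach, not confirmed in the thread: SQLAlchemy's compiler extension lets a DDL construct be re-rendered per dialect, so DROP SEQUENCE could be given a CASCADE suffix for PostgreSQL. A minimal sketch:)

    from sqlalchemy.ext.compiler import compiles
    from sqlalchemy.schema import DropSequence

    @compiles(DropSequence, "postgresql")
    def _drop_sequence_cascade(element, compiler, **kw):
        # render the default DROP SEQUENCE statement, then append CASCADE
        return compiler.visit_drop_sequence(element, **kw) + " CASCADE"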