Feel free to hang out here; note that some of the "chat" happens on libera.chat #sqlalchemy.
import aiofiles
from aiocsv import AsyncDictReader
from datetime import datetime

async with aiofiles.open(
    directory, mode="r", encoding="utf-8", newline=""
) as afp:
    async for new in AsyncDictReader(afp):
        query = ScheduleService(
            **{
                "id": new["id"],
                "name": new["name"],
                "type": new["type"],
                "title": new["title"],
                "description": new["description"],
                "date": datetime.strptime(
                    new["date"], settings.DATE
                ).date(),
                "there_is": datetime.strptime(
                    new["there_is"], settings.DATETIME_FORMAT
                ),
                "created_at": datetime.now(),
                "sch_s_owner": new["sch_s_owner"],
                "sch_s_service_id": new["sch_s_service_id"],
            }
        )
        # ..
        session.add_all(query)
        session.refresh([query])
        await session.commit()
self.merge_result(..., load=False)
I get: AttributeError: '<MyModel>' object has no attribute '_sa_instance_state'
does anyone know if this is me making a mistake/wrong assumption, or is the tutorial missing something?
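Hard to say without the model definition, but that AttributeError usually means the Session received an object that isn't actually a mapped instance (is ScheduleService the ORM model, or a service/schema class?). Two other things stand out in the snippet: Session.add_all() expects a list (or other iterable) of instances, and Session.refresh() takes a single instance, not a list, and only makes sense once the row exists. A minimal sketch of the flow, assuming query is a mapped instance:
```
session.add(query)            # or session.add_all([query]) for several objects
await session.commit()        # flushes and emits the INSERT
await session.refresh(query)  # optional: re-load server-generated defaults
```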
Can I call db.session.commit() without needing an active DB connection?
I have class MyModel(sqlalchemy.Model) with type = db.Column(db.Text()), and I assign myModelObj.type = MyTypeEnum.MY_TYPE1. Before I do db.session.commit(), I see myModelObj.type == MyTypeEnum.MY_TYPE1 -- but after I do db.session.commit(), a transformation has happened and myModelObj.type == "MY_TYPE1".
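That "transformation" is the default expire_on_commit behavior: after commit() the instance's attributes are expired and reloaded from the database, and the Text column only holds the string. If the goal is to get MyTypeEnum members back, one option (a sketch, assuming Flask-SQLAlchemy's db object and a standard enum.Enum) is to declare the column with the Enum type rather than Text:
```
import enum

class MyTypeEnum(enum.Enum):
    MY_TYPE1 = "MY_TYPE1"

class MyModel(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    # db.Enum persists the member name and rebuilds the MyTypeEnum member
    # when the expired attribute is reloaded after commit()
    type = db.Column(db.Enum(MyTypeEnum))
```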
Hello SQLAlchemy community, I am hoping to get some guidance on the issue below:
the metadata catalog tool OpenMetadata (https://docs.open-metadata.org/openmetadata/connectors/database/db2) is using SQLAlchemy to connect to a Db2 LUW database. The packages below are installed:
apt-get install libxml2
pip install "ibm-db==3.1.3"
pip install "ibm-db-sa==0.3.8"
But when it tries to connect, it fails with the error below:
Failed to test connection due to {"error": "Connection error from Engine(db2+ibm_db://dt224374:***@db2databaseurl:50000/TPJ) - Unknown error connecting with Engine(db2+ibm_db://dt224374:***@db2databaseurl:50000/TPJ) - (ibm_db_dbi.ProgrammingError) ibm_db_dbi::ProgrammingError: SQLNumResultCols failed: [IBM][CLI Driver][DB2/AIX64] SQL0104N An unexpected token "END-OF-STATEMENT" was found following "1.3.1"} */ SELECT 42". Expected tokens may include: "<table_expr>". SQLSTATE=42601 SQLCODE=-104\n[SQL: /* {"app": "OpenMetadata", "version": "0.11.3.1"} */\nSELECT 42]\n(Background on this error at: https://sqlalche.me/e/14/f405)."}
It is pointing to the SQLAlchemy page https://sqlalche.me/e/14/f405 but this does not help in identifying what the issue could be.
If we use these packages ibm-db, ibm-db-sa directly in a Python script to connect to this DB2, we get the same error. It seems as though the generated query 'SELECT 42' here is wrong. DB2 expects that query to be 'SELECT 42 FROM SYSIBM.SYSDUMMY1;'
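For what it's worth, Db2 has no FROM-less SELECT, which is why the bare "SELECT 42" probe fails while targeting SYSIBM.SYSDUMMY1 works. A sketch to reproduce/verify outside OpenMetadata, using the same DSN shape as above (credentials masked):
```
from sqlalchemy import create_engine, text

# same DSN shape as in the error message; *** stands in for the real password
engine = create_engine("db2+ibm_db://dt224374:***@db2databaseurl:50000/TPJ")
with engine.connect() as conn:
    # Db2 requires a FROM clause, so probe SYSIBM.SYSDUMMY1 instead of "SELECT 42"
    print(conn.execute(text("SELECT 42 FROM SYSIBM.SYSDUMMY1")).scalar())
```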
Hello all.
I hit a runtime error with the message "deque mutated during iteration", raised from sqlalchemy/event/attr.py line 255 (with SQLAlchemy ver. 1.4.39).
The error has shown up since I added an event listener on the "before_commit" event, as below:
event.listen(my_scoped_session, "before_commit", my_func, propagate=True)
But I have no idea how to solve this. Does anyone know the root cause?
This is my first time leaving a message here; any pointers would be much appreciated.
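Not certain without more context, but "deque mutated during iteration" in event/attr.py generally means the listener collection was modified while that same event was being dispatched, e.g. event.listen() or event.remove() called from inside a handler, or from another thread while a commit is running. A sketch of the usually-safe pattern, registering exactly once at startup (my_session_factory standing in for however the scoped_session is built):
```
from sqlalchemy import event

def my_func(session):
    ...  # before_commit logic goes here

# register once, at import/startup time -- not from inside another
# event handler and not per-request/per-thread
event.listen(my_session_factory, "before_commit", my_func)
```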
Hi folks, I'm facing a very high latency (200s/call) while doing INSERTs into a specific table in my DB. I did not face this issue while using Django ORM, but when I switched to SQLAlchemy ORM, it cropped up.
Here's the code doing the inserts:
class MyRepository():
    def __init__(self):
        _engine = create_engine(dburl, executemany_mode='batch', executemany_batch_page_size=10000, pool_size=20, max_overflow=10)
        _session_maker = sessionmaker(bind=_engine)
        self.session = _session_maker()

    def bulk_create_records(self, objects):  # max 50 objects passed
        with self.session.begin():
            self.session.bulk_save_objects(objects, return_defaults=True)
            self.session.expunge_all()
        return objects
Invocation: records = MyRepository().bulk_create_records(record_objects)
I'm connecting to a PostgreSQL RDS instance, and the highest wait events with these inserts ("INSERT INTO mytable (cola, colb, ...) VALUES (?, ?, ...) RETURNING mytable.id;") are caused by SubtransControlLock and subtrans events.
What subtransactions are getting triggered here, and how can I eliminate this issue?
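Two observations on the snippet, hedged since the rest of the code isn't visible: MyRepository() builds a brand-new Engine (and connection pool) on every invocation, whereas engines are meant to be created once per process; and return_defaults=True forces row-by-row INSERT ... RETURNING, which defeats the executemany batching configured on the engine. A sketch of the module-level reuse pattern, keeping the same create_engine arguments:
```
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# one engine + session factory per process, not per repository instance
_engine = create_engine(dburl, executemany_mode='batch',
                        executemany_batch_page_size=10000,
                        pool_size=20, max_overflow=10)
_SessionMaker = sessionmaker(bind=_engine)

class MyRepository():
    def __init__(self):
        self.session = _SessionMaker()
```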
Hi, I'm new to SQLAlchemy. I'm trying to connect to SQL Server with the following URL:
connection_url = URL.create("mssql+pymssql://Username:Password@xx.103.xxx.xx/dbname")
engine = db.create_engine(connection_url)
connection = engine.connect()
and I get this error:
dialect, driver = name.split(".")
ValueError: too many values to unpack (expected 2)
URL.create() builds a URL from individual components, so it can't parse a full URL string like that -- use the make_url() function instead.
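To spell out the distinction (a sketch, same placeholder credentials as above):
```
from sqlalchemy import create_engine
from sqlalchemy.engine import URL, make_url

# parse a complete URL string:
connection_url = make_url("mssql+pymssql://Username:Password@xx.103.xxx.xx/dbname")

# or build it from individual components:
connection_url = URL.create(
    "mssql+pymssql",
    username="Username",
    password="Password",
    host="xx.103.xxx.xx",
    database="dbname",
)

engine = create_engine(connection_url)
```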
session.bulk_save_objects(objects, return_defaults=True) causes very high DB latency for the INSERT operation.
I also tried session.bulk_save_objects(objects) and then selecting the objects back from the DB using a hash key (which is indexed), but the SELECT operations are also long-running.
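One thing worth trying, sketched under the assumption that server-generated defaults aren't needed per object: a Core executemany insert, which sends the whole batch without per-row RETURNING (cola/colb are the placeholder columns from the INSERT quoted earlier):
```
from sqlalchemy import insert

rows = [{"cola": 1, "colb": "x"}, {"cola": 2, "colb": "y"}]  # plain dicts
session.execute(insert(MyModel.__table__), rows)  # one executemany batch
session.commit()
```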
I'm passing json_serializer and json_deserializer to the create_async_engine function, but the workflow never goes into those functions when I try to insert/query a JSON column of a model. Are they only for psycopg2?
File "/Users/thibaultvallois/etl-automation/service/etl_automation/persistors/base.py", line 32, in create_all
async with self._get_session() as session:
File "/Users/thibaultvallois/.asdf/installs/python/3.10.4/lib/python3.10/contextlib.py", line 206, in __aexit__
await anext(self.gen)
File "/Users/thibaultvallois/etl-automation/service/etl_automation/persistors/base.py", line 29, in _get_session
raise err
File "/Users/thibaultvallois/etl-automation/service/etl_automation/persistors/base.py", line 26, in _get_session
await session.commit()
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/ext/asyncio/session.py", line 580, in commit
return await greenlet_spawn(self.sync_session.commit)
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/util/_concurrency_py3k.py", line 128, in greenlet_spawn
result = context.switch(value)
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 1451, in commit
self._transaction.commit(_to_root=self.future)
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 829, in commit
self._prepare_impl()
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 808, in _prepare_impl
self.session.flush()
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 3383, in flush
self._flush(objects)
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 3522, in _flush
with util.safe_reraise():
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
compat.raise_(
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/util/compat.py", line 208, in raise_
raise exception
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 3483, in _flush
flush_context.execute()
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 456, in execute
rec.execute(self)
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 630, in execute
util.preloaded.orm_persistence.save_obj(
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 212, in save_obj
for (
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 398, in _organize_states_for_save
and instance_key in uowtransaction.session.identity_map
File "/Users/thibaultvallois/Library/Caches/pypoetry/virtualenvs/etl-automation-GVE63ZMH-py3.10/lib/python3.10/site-packages/sqlalchemy/orm/identity.py", line 100, in __contains__
if key in self._dict:
TypeError: unhashable type: 'dict'
```
class Postings(Base):
    __tablename__ = "postings"

    stamp = Column(Date, primary_key=True)
    dat = Column(JSON, primary_key=True)

postings = Postings(stamp=my_date, dat={"a": "b"})
async_session.add(postings)  # add() is synchronous; only commit() is awaited
await async_session.commit()
```
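The TypeError at the bottom of that traceback is consistent with dat being part of the primary key: the Session's identity map keys instances by their primary-key values, and a Python dict isn't hashable. A sketch of one workaround, assuming a surrogate key is acceptable so the JSON payload stays out of the identity:
```
from sqlalchemy import Column, Date, Integer, JSON

class Postings(Base):
    __tablename__ = "postings"

    # scalar surrogate key keeps the identity map happy;
    # the unhashable JSON value is no longer part of the primary key
    id = Column(Integer, primary_key=True)
    stamp = Column(Date, index=True)
    dat = Column(JSON)
```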
Should I use DDL(raw_sql), or is there a way to leverage UniqueIndex, as I likely have some fkeys pointing to these?
UniqueConstraint
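i.e. declare it declaratively (there is no UniqueIndex construct; the options are UniqueConstraint or Index(..., unique=True)). A named UniqueConstraint is emitted as part of CREATE TABLE and can be the target of foreign keys, with no raw DDL() needed. A sketch with hypothetical names:
```
from sqlalchemy import Column, Integer, String, UniqueConstraint

class Widget(Base):  # hypothetical model, just for illustration
    __tablename__ = "widgets"

    id = Column(Integer, primary_key=True)
    code = Column(String(32), nullable=False)

    # the explicit name makes it easy to reference in fkeys or migrations
    __table_args__ = (UniqueConstraint("code", name="uq_widgets_code"),)
```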
session.bulk_save_objects(object_list, return_defaults=True) doesn't scale up to highly concurrent inserts -- it causes some latency. Therefore I'm trying to implement the above, where I generate the defaults, i.e. the id field, beforehand.
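A sketch of that approach, under the assumption that the id column can be client-generated (e.g. a UUID) rather than a server-side sequence; with the ids pre-filled, return_defaults can be dropped and bulk_save_objects stays on the executemany fast path:
```
import uuid

# assumes MyModel.id is a UUID/string column with no server-side default
for obj in object_list:
    obj.id = uuid.uuid4()

session.bulk_save_objects(object_list)  # no return_defaults -> no per-row RETURNING
session.commit()
```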