Closed
Description
Migrated issue, originally created by Adrian (@thiefmaster)
Using a CTE as a column property fails with:

```
CompileError: Multiple, unrelated CTEs found with the same name
```

Snippet to reproduce it:
```python
from sqlalchemy import *
from sqlalchemy.dialects.postgresql import ARRAY, array
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import *

Base = declarative_base()


class Category(Base):
    __tablename__ = 'categories'
    id = Column(Integer, primary_key=True)
    parent_id = Column(
        Integer,
        ForeignKey('categories.id'),
        index=True,
        nullable=True
    )
    is_deleted = Column(
        Boolean,
        nullable=False,
        default=False
    )
    children = relationship(
        'Category',
        primaryjoin=(id == remote(parent_id)) & ~remote(is_deleted),
        lazy=True,
        backref=backref(
            'parent',
            primaryjoin=(remote(id) == parent_id),
            lazy=True
        )
    )

    @staticmethod
    def _create_column_prop():
        cat_alias = aliased(Category)
        cte_query = (select([cat_alias.id, cast(array([]), ARRAY(Integer)).label('parents')])
                     .where(cat_alias.parent_id.is_(None) & ~cat_alias.is_deleted)
                     .cte(recursive=True))
        parent_query = (select([cat_alias.id, cte_query.c.parents.op('||')(cat_alias.parent_id)])
                        .where((cat_alias.parent_id == cte_query.c.id) & ~cat_alias.is_deleted))
        cte_query = cte_query.union_all(parent_query)
        query = select([func.count()]).where(cte_query.c.parents.contains(array([Category.id])))
        Category.deep_children_count = column_property(query, deferred=True)


Category._create_column_prop()

e = create_engine('postgresql:///test', echo=True)
Base.metadata.create_all(e)
s = Session(e)

root = Category(id=0, children=[
    Category(id=1),
    Category(id=2, children=[
        Category(id=3),
        Category(id=4),
        Category(id=5, children=[
            Category(id=6),
            Category(id=7)
        ])
    ])
])
s.add(root)
s.flush()
s.expire_all()

print()
cat = s.query(Category).get(2)
print(cat)
print(cat.deep_children_count)  # should be 5
```
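For reference, the count the recursive CTE is meant to produce can be sketched as a plain-Python walk over the same tree (a toy model, not SQLAlchemy or SQL; it ignores `is_deleted` since no row in the example is deleted):

```python
# Toy model of the category tree from the snippet above: {id: parent_id}
categories = {0: None, 1: 0, 2: 0, 3: 2, 4: 2, 5: 2, 6: 5, 7: 5}


def deep_children_count(cat_id):
    """Count all descendants of cat_id, at any depth."""
    direct = [c for c, parent in categories.items() if parent == cat_id]
    return len(direct) + sum(deep_children_count(c) for c in direct)


print(deep_children_count(2))  # -> 5 (ids 3, 4, 5, 6, 7)
```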
Related: https://groups.google.com/forum/#!topic/sqlalchemy/TvAg3hIoKps
Activity
sqlalchemy-bot commented on Jun 10, 2016
Adrian (@thiefmaster) wrote:
This workaround (wrapping the `column_property()` call in that contextmanager) avoids the issue. Do you see any cases where this would be likely to cause breakage?

sqlalchemy-bot commented on Jun 10, 2016
Michael Bayer (@zzzeek) wrote:
it will work as long as no other concurrent threads are setting up mappers at the same time.
sqlalchemy-bot commented on Jun 10, 2016
Michael Bayer (@zzzeek) wrote:
Also, the deannotate does have a purpose; I'm not sure if it matters for the CTE here, I'd have to review the cases where this deannotation is necessary.
sqlalchemy-bot commented on Jun 10, 2016
Michael Bayer (@zzzeek) wrote:
https://gerrit.sqlalchemy.org/#/q/I9aeac9cd24d8f7ae6b70e52650d61f7c96cb6d7e
sqlalchemy-bot commented on Jun 10, 2016
Michael Bayer (@zzzeek) wrote:
Ensure CTE internals are handled during clone
The CTE construct was missing a _copy_internals() method
which would handle CTE-specific structures including _cte_alias,
_restates during a clone operation.
Change-Id: I9aeac9cd24d8f7ae6b70e52650d61f7c96cb6d7e
Fixes: #3722
(cherry picked from commit 7189d0b)
→ 33e2a02
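The mechanism the commit message describes can be illustrated with a toy clone (a deliberately simplified sketch, not SQLAlchemy's actual implementation): without a `_copy_internals()` hook, a shallow clone keeps sharing internal references with the original, which is the kind of shared state that lets two supposedly independent copies collide under one name.

```python
import copy


class ToyCte:
    """Toy stand-in for a clonable SQL element with internal references."""

    def __init__(self, name, restates=None):
        self.name = name
        self._restates = restates if restates is not None else []

    def _clone(self):
        # A generic clone produces a shallow copy...
        cloned = copy.copy(self)
        # ...and the element-specific hook then clones its internals,
        # so the copy no longer shares mutable state with the original.
        cloned._copy_internals()
        return cloned

    def _copy_internals(self):
        self._restates = [elem._clone() for elem in self._restates]


base = ToyCte("cte_query")
derived = ToyCte("cte_query", restates=[base])
clone = derived._clone()
assert clone._restates[0] is not derived._restates[0]
```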
sqlalchemy-bot commented on Jun 10, 2016
Changes by Michael Bayer (@zzzeek):
sqlalchemy-bot commented on Jun 10, 2016
Michael Bayer (@zzzeek) wrote:
Ensure CTE internals are handled during clone
The CTE construct was missing a _copy_internals() method
which would handle CTE-specific structures including _cte_alias,
_restates during a clone operation.
Change-Id: I9aeac9cd24d8f7ae6b70e52650d61f7c96cb6d7e
Fixes: #3722
→ 7189d0b
sqlalchemy-bot commented on Jun 10, 2016
Changes by Michael Bayer (@zzzeek):
sqlalchemy-bot commented on Jun 13, 2016
Adrian (@thiefmaster) wrote:
Thanks for the quick fix! Do you have any ETA planned for 1.0.14?
sqlalchemy-bot commented on Jun 13, 2016
Michael Bayer (@zzzeek) wrote:
1.0.13 was...about a month ago, so figure in how long it takes new seeds of wheat to be harvestable in Minecraft plus how many issues are in the release...it's a little soon but I could cut a release this week sometime I suppose