Querying current_communities takes over 50ms in production #1495
Comments
@mw44118 I see you have contributed to db optimizations in the past. Do you have any ideas?
I'm looking into it now. Sometimes queries just take a while. >50ms is …

W. Matthew Wilson
First, add an index on the is_member column on the communities table. Second, do you really need ALL the data? Right now you're pulling 610 rows back. Are you displaying all of them on the same screen? Maybe an offset and a limit would help; pulling 610 rows back from the database costs time.
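As a rough illustration of the offset/limit idea (a minimal sketch only; the real query lives in community.py, and everything here other than communities.is_member is an assumption):

-- Hypothetical shape of a paginated community listing. Only is_member is
-- known from this thread; the name column and boolean type are illustrative.
SELECT name
  FROM communities
 WHERE is_member
 ORDER BY name
 LIMIT 50     -- page size
OFFSET 0;     -- first page; bump by 50 for each following page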
Also, add an index on is_suspicious on the participants table.
I've tried to add indexes:

CREATE INDEX participants_is_suspicious
    ON participants
    USING btree
    (is_suspicious);

CREATE INDEX communities_is_member
    ON communities
    USING btree
    (is_member);

However, the EXPLAIN has not changed: it still does a full scan both times. Could it be that I am testing locally against an almost-empty db? I guess I'll just leave it be until we get #1502.
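For what it's worth, on a nearly empty database the planner will usually prefer a sequential scan even when a matching index exists, because reading a handful of pages directly is cheaper than going through the index, so an unchanged local EXPLAIN doesn't prove much. One rough way to check whether the new indexes can be used at all (a sketch, assuming PostgreSQL and boolean columns; the setting change is session-local):

ANALYZE participants;
ANALYZE communities;
-- Discourage sequential scans for this session only, then re-check the plans.
SET enable_seqscan = off;
EXPLAIN SELECT * FROM participants WHERE is_suspicious;
EXPLAIN SELECT * FROM communities WHERE is_member;
RESET enable_seqscan;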
Do you get a >50ms query time on your local box?
It could be helpful to modify fake_data.py to generate a lot of fake data if asked to. Reticket if you like.
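As an interim workaround until fake_data.py can do this, a throwaway SQL sketch along these lines could bulk-insert rows locally so the planner's choices look more like production. The column names below (username, is_suspicious) are assumptions, and the insert will only work if the table's remaining columns have defaults:

-- Purely illustrative: bulk-insert fake participants so local query plans
-- resemble production. Adjust the column list to match the real schema.
INSERT INTO participants (username, is_suspicious)
SELECT 'fake_user_' || i, (i % 20 = 0)
  FROM generate_series(1, 100000) AS i;
-- A similar generate_series insert could populate the communities table.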
Dropping from Infrastructure, per #1417 (comment). |
What page does this hit on? /for/? |
Closing in favor of #1549. |
Queries of this kind are the only ones that currently take over 50ms (up to 250ms):

The query comes from community.py. See the production EXPLAIN ANALYZE: there are two sequential scans, one over participants and one over communities.
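The query text itself isn't reproduced above; as a hedged illustration only, a plan like the one described could be captured along these lines (the join column and overall query shape are assumptions, not the actual current_communities query from community.py):

-- Illustrative only; substitute the real query from community.py.
EXPLAIN ANALYZE
SELECT c.name, count(*) AS nmembers
  FROM communities c
  JOIN participants p ON p.username = c.participant
 WHERE c.is_member
   AND NOT p.is_suspicious
 GROUP BY c.name;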