This repository has been archived by the owner on Dec 26, 2022. It is now read-only.

✨ Adding message history #380

Merged: 6 commits merged into Pincer-org:main on Jan 13, 2022

Conversation

Sigmanificient (Member)

c = await self.get_channel(...)

async for m in c.history(limit=1000):
    # Do your stuff
    ...

codecov bot commented Jan 10, 2022

Codecov Report

Merging #380 (83f1d02) into main (9db7e1b) will not change coverage.
The diff coverage is n/a.

Impacted file tree graph

@@           Coverage Diff           @@
##             main     #380   +/-   ##
=======================================
  Coverage   90.54%   90.54%           
=======================================
  Files           8        8           
  Lines          74       74           
=======================================
  Hits           67       67           
  Misses          7        7           

Continue to review full report at Codecov.

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 9db7e1b...83f1d02. Read the comment docs.

Comment on lines +820 to +821
while limit > 0:
retrieve = min(limit, 100)
Member

limit HAS to be capped at 100. If they choose a number that is too big, there will be rate limits, and that will cause it to stall because of an "invisible error".

@Sigmanificient (Member Author) commented Jan 10, 2022

I don't agree; we need to provide a user-friendly way to get all messages within a channel, even if it is extremely slow. Discord.py does it as well, see channel.history(limit=...).flatten().

It also deals very nicely with rate limiting. I tested it with 4000 messages: 41.19 seconds. For 8000 messages, it took 84.87 seconds.
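For context, the pagination loop being discussed can be sketched roughly as below. `fetch_page` and the message ids are made up for illustration, standing in for the real REST call; the real code would also have to wait out rate limits between pages:

```python
import asyncio


async def fetch_page(before, retrieve):
    # Hypothetical stand-in for one REST call that returns up to
    # `retrieve` raw message payloads with ids older than `before`.
    newest = before - 1
    return [{"id": newest - i} for i in range(retrieve) if newest - i > 0]


async def history(limit=1000):
    # Each request is capped at 100 messages (Discord's per-call maximum);
    # the loop keeps paginating until `limit` is reached or the channel
    # runs out of messages.
    before = 251  # hypothetical id just above the newest message
    while limit > 0:
        retrieve = min(limit, 100)
        raw_messages = await fetch_page(before, retrieve)
        if not raw_messages:
            break
        before = raw_messages[-1]["id"]
        for message in raw_messages:
            yield message
        limit -= len(raw_messages)


async def main():
    return [m async for m in history(limit=1000)]


messages = asyncio.run(main())
```

With 250 fake messages available and limit=1000, the loop makes three requests (100, 100, 50 messages) and then stops on the empty page.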

@Lunarmagpie (Member) commented Jan 11, 2022

I see your point, but you need to change a few things. There can be a lot of API calls, so you should use gather. You can even use itertools.chain around gather; I bet that would be useful here.
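As a rough illustration of that suggestion (`fetch_page` here is a hypothetical stand-in, not a pincer API), firing requests concurrently and flattening the results could look like:

```python
import asyncio
import itertools


async def fetch_page(page):
    # Hypothetical stand-in for one REST call returning a list of payloads.
    await asyncio.sleep(0)
    return [{"id": page * 100 + i} for i in range(3)]


async def fetch_all(pages):
    # The suggested pattern: run the requests concurrently with
    # asyncio.gather, then flatten the list-of-lists with itertools.chain.
    results = await asyncio.gather(*(fetch_page(p) for p in pages))
    return list(itertools.chain.from_iterable(results))


payloads = asyncio.run(fetch_all(range(3)))
```

One caveat: history pagination derives each request's `before` parameter from the previous page's last id, so the pages are not fully independent; gather only helps if the page boundaries can be known up front.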


if not raw_messages:
break

for _message in raw_messages[:-1]:
@Lunarmagpie (Member) commented Jan 11, 2022

Sorry, I'm on mobile.

before = raw_messages[-1].id
for message in raw_messages:
    yield UserMessage.from_dict(message)

@Sigmanificient (Member Author)

raw_messages is the not-yet-converted payload, so you can't access .id; I guess we can do ["id"] though.

Member

I think that would be better.
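Put together, the key-access version of the snippet would look something like this (the `UserMessage` class here is a minimal stand-in for pincer's, just for illustration):

```python
from dataclasses import dataclass


@dataclass
class UserMessage:
    # Minimal stand-in for pincer's UserMessage, for illustration only.
    id: int
    content: str

    @classmethod
    def from_dict(cls, data):
        return cls(id=data["id"], content=data["content"])


# raw_messages holds unconverted dict payloads, so the pagination
# cursor is read with key access rather than attribute access.
raw_messages = [{"id": 3, "content": "a"}, {"id": 2, "content": "b"}]

before = raw_messages[-1]["id"]
messages = [UserMessage.from_dict(m) for m in raw_messages]
```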

pincer/objects/guild/channel.py Outdated Show resolved Hide resolved
pincer/objects/guild/channel.py Outdated Show resolved Hide resolved
pincer/objects/guild/channel.py Outdated Show resolved Hide resolved
@Sigmanificient Sigmanificient merged commit 5ca1ebd into Pincer-org:main Jan 13, 2022
Labels: None yet
Projects: None yet
3 participants