
[Do not merge] Feat/chat moderation with tensorflow js #17

Draft · wants to merge 5 commits into main

Conversation

lazeratops (Contributor)

Ran into some errors running this branch, so I'm making a temporary draft PR to dig in and leave comments if needed.

netlify bot commented May 2, 2023

Deploy Preview for vue-call-object ready!

🔨 Latest commit: 9dc764e
🔍 Latest deploy log: https://app.netlify.com/sites/vue-call-object/deploys/6452711e250fc4000894e0df
😎 Deploy Preview: https://deploy-preview-17--vue-call-object.netlify.app/

beforeCreate() {
  const threshold = 0.9;
  // Load the moderation model asynchronously; `threshold` is the prediction
  // confidence cutoff passed to the model package's load() function.
  load(threshold).then(model => {
    // Freeze the returned model (likely to keep Vue from making it reactive).
    this.model = Object.freeze(model);
@lazeratops (Contributor, Author) commented May 2, 2023

@adamichelle - when I run this locally, this promise seems to take a very long time to resolve (to the point where I initially thought it wasn't resolving at all). Meanwhile the rest of the app loads and chat becomes available, which results in an exception down the line because the model still isn't initialized when we try to use it.

I think we should make the follow-up logic contingent on the model existing (e.g., disabling chat completely until the model loads, roughly as in the sketch below, and maybe even starting the load process earlier?). What do you think?
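A minimal sketch of that kind of guard (hypothetical; it reuses the load() call already in this file and assumes a modelReady flag that the ChatTile template could bind to in order to disable its input):

  data() {
    return {
      modelReady: false, // template can disable the chat input while this is false
    };
  },
  beforeCreate() {
    const threshold = 0.9;
    load(threshold).then(model => {
      this.model = Object.freeze(model);
      this.modelReady = true; // component is created by the time this resolves, so the flag is reactive
    });
  },
  methods: {
    sendMessage(text) {
      if (!this.model) return; // guard: bail out if the model hasn't loaded yet
      // ...run moderation with this.model, then send the message...
    },
  },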

Another thing I noticed is that this seems to run twice.

@adamichelle commented May 2, 2023

I can't reproduce this issue, so I'm not sure how to go about it. In my local dev environment the model is always present even before the loading spinner disappears.

As for why it runs twice, I don't think that has anything to do with the model. I tried adding a beforeCreate hook on the main branch and it also runs twice there. Maybe it has something to do with how the components are set up?

[Screenshots: the duplicate beforeCreate hook call and the corresponding code]


I also don't think we can start the loading process any earlier than the beforeCreate hook. When I was building the demo I considered whether to put it there or in the beforeMount hook, and decided on beforeCreate since it is the earliest hook in the ChatTile component's lifecycle (see the ordering sketch below). The alternative would be to move it higher in the component hierarchy, which I don't think is ideal because it makes more sense to keep all chat-input-management logic within the ChatTile component.
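For reference, a minimal Vue 2 options-API sketch of the per-component hook ordering being discussed (logging only, no project code assumed):

  export default {
    beforeCreate() { console.log('1. beforeCreate'); }, // earliest; reactive data not set up yet
    created()      { console.log('2. created'); },
    beforeMount()  { console.log('3. beforeMount'); },
    mounted()      { console.log('4. mounted'); },
  };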

@lazeratops (Contributor, Author) commented May 3, 2023

Alright, let's leave the double-call as is!

Re the exception and model load: I think it makes sense that this issue is intermittent, since it looks like a timing issue around how long the model load takes. For example, locally I get the exception below - the model doesn't load until about 20-30 seconds into the call, and if I try to send a chat message before then, the logic uses a model that is still null:

[Screenshot: exception thrown when sending a chat message while the model is still null]

Which makes sense, since from what I understand we never actually await the model load anywhere before proceeding. I've got some ideas to possibly mitigate this (a rough sketch below) and will file a PR against this branch for your review in a bit!
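One hypothetical way to mitigate it, sketched only to illustrate the idea: keep the load promise on the component instance and await it at the point of use instead of assuming it has already resolved. The names modelPromise and moderate are made up, and classify assumes the TensorFlow.js toxicity model API:

  beforeCreate() {
    const threshold = 0.9;
    // Keep the promise itself around as a non-reactive instance property.
    this.modelPromise = load(threshold).then(model => Object.freeze(model));
  },
  methods: {
    async moderate(text) {
      const model = await this.modelPromise; // waits only if the model is still loading
      return model.classify([text]);
    },
  },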
