Hi, new user here! Thanks for making this, very excited to use it. I haven't been able to get chat to work yet and wonder if it's because of these settings. Which should I be using? If it makes a difference, I'm a Smart Connect subscriber and keen to keep my notes off the cloud.
Smart Connect currently works with ChatGPT through Custom GPTs; integration with Smart Chat (the chat available in the Obsidian plugin) is a WIP.
Re: your question: the Custom Local option lets you configure a local model endpoint, but it requires that you already have a local model running (advanced, and unfortunately out of the scope of what I can help with). That said, as long as you're using a local embedding model (the default), only the notes relevant to your current conversation are sent to the AI platform's servers.
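If you do want to try the Custom Local route, a typical setup runs an OpenAI-compatible server locally and points the plugin's endpoint at it. Purely as an illustration (this assumes Ollama, which by default serves an OpenAI-compatible API on port 11434; the model name is an example):

```shell
# Assumes Ollama is installed and a model has already been pulled, e.g.:
#   ollama pull llama3
ollama serve &   # start the local server (listens on localhost:11434 by default)

# Sanity-check the OpenAI-compatible chat endpoint before wiring up the plugin:
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "hello"}]}'
```

If that curl returns a completion, the Custom Local endpoint setting would point at `http://localhost:11434/v1` — but the exact fields vary by setup, so treat this as a sketch, not a supported config.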
A local embedding model is the 80/20 for privacy concerns: even if you use a cloud provider like Anthropic/OpenAI for the chat model (my personal preference), local embedding significantly decreases the surface area of what is exposed to the cloud, since embedding processes all of your notes (not just what is relevant to the conversation).
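To make the privacy trade-off concrete, here is a minimal sketch of the retrieval idea: all notes are embedded locally, and only the few most similar to the current chat message would ever be sent to a cloud chat model. The note names and 3-dimensional vectors are hypothetical (real embeddings have hundreds of dimensions), and this is an illustration of the general technique, not the plugin's actual code:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k_notes(query_vec, note_vecs, k=2):
    # Rank notes by similarity to the query embedding; only the
    # top-k note bodies would ever leave the machine.
    ranked = sorted(note_vecs, key=lambda n: cosine(query_vec, note_vecs[n]),
                    reverse=True)
    return ranked[:k]

# Hypothetical pre-computed local embeddings, one vector per note.
notes = {
    "recipes.md": [0.9, 0.1, 0.0],
    "taxes.md":   [0.0, 0.8, 0.2],
    "journal.md": [0.1, 0.2, 0.9],
}
query = [0.85, 0.15, 0.05]  # local embedding of the user's chat message

print(top_k_notes(query, notes))  # only these notes go to the chat provider
```

The embedding step touches every note, which is why keeping it local matters; the cloud provider only ever sees the handful of notes that win the similarity ranking.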
Lastly, if you were confused about what Smart Connect Official Service was for, you can reply to the welcome email and I'll make it right 🌴