KNNClassifier #330
Conversation
Seems fine to me, and all my suggestions are nitpicks.
We could loop around in summer to make a visualisation of how this one works.
KNNClassifier is a supervised machine learning algorithm for classifying data points to learned categories. It uses an internal [KDTree](/reference/kdtree) to find the _k_ nearest neighbours of a point that needs classification (where _k_ is an integer >= 1). Whichever category, or "class", is most common among the neighbours is predicted as the category for that point. If an even number of `numNeighbours` is requested and there is a tie, the label with the closer point will be predicted. The parameter `weight` indicates whether or not the prediction should be weighted by the neighbours' distances.
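The voting, tie-breaking, and weighting behaviour described above can be sketched in Python. This is an illustrative stand-in, not FluCoMa's implementation: it uses brute-force distances rather than a KDTree, and `knn_predict` and its arguments are hypothetical names.

```python
import numpy as np

def knn_predict(train_points, train_labels, query, k=3, weight=False):
    """Classify `query` by majority vote among its k nearest training points."""
    dists = np.linalg.norm(train_points - query, axis=1)
    nearest = np.argsort(dists)[:k]  # indices of the k nearest neighbours
    votes = {}
    for i in nearest:
        # with weight=True, closer neighbours count for more (eps avoids /0)
        votes[train_labels[i]] = votes.get(train_labels[i], 0.0) + (
            1.0 / (dists[i] + 1e-9) if weight else 1.0
        )
    # ties are broken in favour of the label whose member is closest to the query
    return max(
        votes,
        key=lambda lbl: (
            votes[lbl],
            -min(dists[i] for i in nearest if train_labels[i] == lbl),
        ),
    )

# Two clusters: label "a" near the origin, label "b" near (1, 1)
pts = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
labels = ["a", "a", "b", "b"]
print(knn_predict(pts, labels, np.array([0.2, 0.1]), k=3))  # → a
print(knn_predict(pts, labels, np.array([0.2, 0.1]), k=4))  # 2-2 tie → a (closer point)
```

With `weight=True`, each neighbour's vote is scaled by the inverse of its distance, so a single very close neighbour can outvote two distant ones.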
k nearest should be k-nearest
FluCoMa includes another object for classification, the [MLPClassifier](/reference/mlpclassifier), which also uses supervised learning for classification. The KNN object works quite differently from the MLP object, each having their strengths and weaknesses. The main differences to know are that:
1. the flexibility of the MLP objects make them generally more capable of learning complex relationships between inputs and outputs,
Would capitalise "The" in each instance
2. the MLP objects involve more parameters and will take much longer to `fit` (aka. train) than the KNN objects, and |
Would remove the trailing ", and"
Sounds good. While making this I imagined it but didn't get to implementing it. I think it could be a not-too-complicated extension of the KDTree GUI.
That's what I imagined as well. I'll make an issue and we can loop back when we are less pressed?