Bug in expanding cluster? #52
Comments
@jasonbaldridge @gabeos could you please take a look at this? If you have any doubts, don't hesitate to ask.
It's been a very long time since I wrote this. If you could provide a failing test that verifies your assumption, accepting a PR would be a lot easier.
Hey @carlos-verdes! I know this is quite old, but did you manage to verify that issue? I'm running into another issue now, and it might be related. At some point the append `neighbourhood ++ newNeighbours` throws.
Scala: 2.11. From what I could tell looking at the Vector code, it is running out of 'levels'. Any clues? @muuki88 Thanks!
Thanks for reporting this @btafel 😄. Unfortunately I can't help in any regard, as I haven't used this code for 7 years now. Also, I mostly work on Scala 2.13 projects.
lol, it makes sense! I know it is old, very old, but I have to say that it works :) at least in most cases. I was inspired by this article https://www.oreilly.com/content/clustering-geolocated-data-using-spark-and-dbscan/ and got it running smoothly. I'm also trying another approach using UserDefinedAggregateFunction and Smile.
Wow! I'm really happy that this is still in use and useful to others 😍
Hi, in GDBScan, the comment on the last line of the expand method says "if new point is a cluster", but in the if clause the point is evaluated against the current neighbourhood, not against the new neighbours.
I think the check on the last line should use the new neighbours instead.
If you agree, I can do a PR.
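To make the distinction concrete, here is a minimal, self-contained sketch of a DBSCAN-style expand step. It is not the library's actual code: `expand`, `neighbours`, `eps`, and `minPoints` are illustrative names I'm assuming for this example. The point is the commented check: each newly reached point must be tested for core-point density against *its own* neighbourhood (`newNeighbours`), not against the neighbourhood of the point it was reached from.

```scala
// Hypothetical sketch, not the repo's API: simplified DBSCAN cluster expansion.
object ExpandSketch {
  type Point = (Double, Double)

  def distance(a: Point, b: Point): Double =
    math.hypot(a._1 - b._1, a._2 - b._2)

  // All points within eps of p (excluding p itself).
  def neighbours(p: Point, points: Seq[Point], eps: Double): Seq[Point] =
    points.filter(q => q != p && distance(p, q) <= eps)

  // Grow a cluster outward from a starting core point.
  def expand(start: Point, points: Seq[Point], eps: Double, minPoints: Int): Set[Point] = {
    var cluster  = Set(start)
    var visited  = Set(start)
    var frontier = neighbours(start, points, eps).toList
    while (frontier.nonEmpty) {
      val p = frontier.head
      frontier = frontier.tail
      if (!visited(p)) {
        visited += p
        cluster += p
        val newNeighbours = neighbours(p, points, eps)
        // The density check for p uses p's OWN neighbourhood (newNeighbours),
        // not the neighbourhood of the point we expanded from -- which is the
        // distinction this issue is raising.
        if (newNeighbours.size >= minPoints)
          frontier = frontier ++ newNeighbours.filterNot(visited)
      }
    }
    cluster
  }

  def main(args: Array[String]): Unit = {
    // Four chained points plus one far-away outlier.
    val pts = Seq((0.0, 0.0), (0.0, 1.0), (0.0, 2.0), (0.0, 3.0), (10.0, 10.0))
    val c = expand(pts.head, pts, eps = 1.5, minPoints = 1)
    assert(c.size == 4) // the chain expands step by step; the outlier is excluded
    println(c.size)
  }
}
```

With the check done against the wrong (inherited) neighbourhood, a chain like the one above could stop expanding early or keep expanding past non-core points, which is exactly the kind of subtle misbehaviour a failing test for this issue would pin down.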