Problems related to exercise evolution #157
I had thought we could push the sha of the current HEAD of git as one of the params.

@kytrinyx Does this make sense, or do I have assumptions I don't know I have? :)
@iHiD there are some assumptions—let me write this all out so we have a better basis for discussion!
@iHiD Here are the problems that I see:
@kytrinyx Thanks. FYI, I think there are three separate questions here:
I feel comfortable with working out (3) as we go along if we get (1) and (2) right. Again, there may be other things I don't know about.
Sorry, forgot to answer this part: No, the CLI doesn't use git. The CLI calls the API, which uses trackler to build the implementation based on files from x-common and the track. Trackler is a library (gem) which gets updated fairly regularly with new content. It's updated as a dependency in the app and the API. This means that what is deployed on the site doesn't necessarily match what is most recent on GitHub.
What do you mean about the bits being uncommented? Is that the skipped/pending etc. thing that some tracks do?
For clarity, I'm talking about the SHA of the git commits, not a SHA of the content of the files. I'm not sure that was clear in what I said before.
Yes, that's what I mean. Specifically: should we save the tests people ran, or the tests people were given?
To check my understanding: Trackler pieces together the README, tests, and template, and sends them to the user's machine. Is that correct?
I think this is a separate question from what we store. We can have code that knows how to serve up the same test suite either in the "todo" state or the "solved" state. I think that we should always deliver the "solved" version of the test suite when people download, restore, or look at the test suite on the website. We should only deliver the "todo" version of the test suite when people are actually fetching the exercise to work on it.
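To make that concrete, here is a minimal sketch (not existing Exercism code; the method name and conventions are assumptions) of deriving the "todo" state from a stored "solved" minitest suite by marking every test after the first as skipped:

```ruby
# Hypothetical sketch: store only the "solved" test suite, and derive the
# "todo" variant on demand by inserting `skip` into all but the first test.
def todoify(solved_source)
  first_test = true
  solved_source.lines.map do |line|
    match = line.match(/^(\s*)def test_/)
    if match.nil?
      line
    elsif first_test
      first_test = false
      line
    else
      # Make the test pending by inserting `skip` as its first statement.
      line + "#{match[1]}  skip\n"
    end
  end.join
end
```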
When the API receives a […]
@kytrinyx OK, so how about this workflow:
That's trivial to implement on the API/web side, and I imagine it's only a small change to the CLI. What have I missed this time? :)
This would work. I'm going to discuss some drawbacks afterwards, but for now I'm going to rewrite your workflow the way I understand it, given (essentially) no changes in our current setup. In other words: tracks would store the data in the same way, the problem specifications would be what they are now, Trackler would work in the same way, we would publish new versions of Trackler and update the trackler dependency as we do now, and the CLI would do the same thing. OK. :)
This would work with absolutely no changes to the CLI. An optimization: the UUID on S3 could be a hash of the generated exercise, which means that we would only store exercises when they've changed. Potential concern: we'd still have to generate the exercise each time in order to compute the hash.
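A minimal sketch of that content-hash idea, assuming the generated exercise is a hash of file paths to file contents (names here are illustrative, not existing code):

```ruby
require 'digest'

# Hypothetical sketch: derive a stable identifier from the generated
# exercise so identical content always maps to the same S3 key.
def exercise_digest(files)
  # Sort by path so the digest is independent of generation order.
  canonical = files.sort.map { |path, body| "#{path}\n#{body}" }.join("\0")
  Digest::SHA256.hexdigest(canonical)
end
```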
Another option is to store a list of implementations with UUIDs in the database, and only update them when the implementations change. Then, when someone fetches, we skip the generate step and go straight to the most recent implementation.
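A sketch of that variant, assuming an ActiveRecord-style model (the `Implementation` model and its columns are assumptions for illustration):

```ruby
# Hypothetical sketch: fetching reads the most recent stored implementation
# directly, skipping generation; rows are refreshed out of band whenever
# the underlying content actually changes.
def fetch_implementation(track_id, slug)
  Implementation.where(track_id: track_id, slug: slug)
                .order(created_at: :desc)
                .first
end

def refresh_implementation!(track_id, slug, files)
  digest = exercise_digest(files)  # from the sketch above
  Implementation.find_or_create_by!(track_id: track_id, slug: slug, digest: digest) do |impl|
    impl.files = files.to_json
  end
end
```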
Agree with what you said. In the nextercism prototype, we create a DB entry when someone unlocks/starts an exercise. We could generate it all at that stage, and then it's locked for that set of iterations. If you're happy with this general approach, I suggest we move/defer this to a reboot implementation discussion. (if :))
OK, let's say that the underlying mechanism is the following:
There are still open questions and concerns that I'd like to raise. In particular, the current solution with Trackler is problematic for a number of reasons.
So, we could continue to use trackler, and either update all implementations on the fly as described above, or update everything when the version of trackler changes. Or, we could take a different approach: not deploy the files that implementations are generated from, but rather use an external tool to update implementations via an API. This would mean that changes would be reflected on the live site. We would not have to redeploy the site to get fresh exercises. We would get less noise in the app history, since we'd no longer have to commit the new version of the dependency to the Gemfile.lock file. Deploys would be faster (but we would have to have the tool that submits the content running somewhere; that might be manual, a webhook, or a cron job).
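For the webhook variant, a sketch of what the receiving end might look like (a Sinatra route; the payload shape and the helper are assumptions, not existing Exercism code):

```ruby
require 'sinatra'
require 'json'

# Hypothetical sketch: a track repository pushes changes, a webhook fires,
# and we refresh the stored implementations without redeploying the site.
post '/hooks/track-updated' do
  payload  = JSON.parse(request.body.read)
  track_id = payload.fetch('track_id')
  rebuild_implementations_for(track_id)  # assumed helper, not existing code
  status 204
end
```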
Now that I think about it, Trackler vs. some other tool is actually irrelevant to the fetching, downloading, and restoring of exercises. The actual workflow still solves all of the problems listed, and we can tweak how things end up in the datastore later.
Given the above, we have:

1. the workflow for storing and serving the exercise that each person fetched
1. how implementations get updated in the datastore (Trackler vs. an external tool)
1. whether we store the tests people ran or the tests people were given
For (1) I'm happy with the workflow that is proposed. For (2) I want to think about it more. For (3) it raises the question of whether the CLI should submit the tests along with the solution. Then we would store their exact file—including any changes to the test suite that they made, additional tests written, comments, etc. I don't know if this would add enough value to justify the complexity it adds.
Another thought: if we decide to go with (1), then given the way things are currently set up, we could even implement an early version of it that stores things in PostgreSQL instead of S3, then trivially switch to S3 when we're ready.
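One way to keep that switch trivial is to hide the backend behind a small shared interface, sketched below (class names and the `ExerciseBlob` model are assumptions; the S3 calls follow the aws-sdk-s3 client API):

```ruby
# Hypothetical sketch: two interchangeable stores with the same interface,
# so callers don't care whether snapshots live in Postgres or S3.
class PostgresExerciseStore
  def put(uuid, blob)
    ExerciseBlob.create!(uuid: uuid, data: blob)  # assumed ActiveRecord model
  end

  def get(uuid)
    ExerciseBlob.find_by!(uuid: uuid).data
  end
end

class S3ExerciseStore
  def initialize(s3_client, bucket)
    @s3     = s3_client
    @bucket = bucket
  end

  def put(uuid, blob)
    @s3.put_object(bucket: @bucket, key: uuid, body: blob)
  end

  def get(uuid)
    @s3.get_object(bucket: @bucket, key: uuid).body.read
  end
end
```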
TL;DR: This is not ideal, but it's a step in a slightly better direction, taken in a way that I think will cause the least amount of pain for people using the CLI.

The most important thing this will enable is a fairly straightforward change in the data model that will let us completely get rid of the "version" tests that some tracks use in their exercises, where a test in the test suite specifies a version and the person solving it needs to add a hard-coded value in their solution to match that version. For more about this problem and how we envision solving it, see the discussion in exercism/discussions#157.

---

To make this change, I checked which x-api endpoints the CLI actually talks to, and then copied these into a new Sinatra app within this repository. I put it inside the 'api' directory, but I'm mounting it alongside it. I also ported over the tests relevant to these endpoints.

I had to make a dirty hack around some stuff in the approvals library. Long story short: I wrote approvals as an experiment and made some pretty bad design choices, but the library has been helpful, so I still use it. In this case I'm working around a singleton that never should have seen the light of day.

This works with the existing CLI with the following change:

```
exercism configure --api=http://exercism.io/xapi
```

Or, to test locally:

```
exercism configure --host=http://localhost:4567 --api=http://localhost:4567/xapi
```

---

In order to make this change live, I think we could:

1. create an FYI issue in discussions explaining the change
1. update all the documentation about configuring the CLI to point to the new URL
1. create a dismissible alert that we can put on everyone's dashboard as well as on their notifications page (if they dismiss it in either place, it will disappear from both)
1. after a couple of weeks, update the endpoints in x-api to report an error (which will be displayed by the CLI) that points to the FYI issue and shows the command to update the configuration.

---

This whole API thing has a long and storied history. In the earliest days of Exercism, when I didn't realize it would turn into a real thing, I had all the problem specifications and exercise implementations living within exercism.io. Then we started getting more tracks, and it was really painful to couple the changes in the exercises to the changes in the website. I tried splitting out all the tracks into their own repositories (a good choice) and splitting the API that delivered those exercises into its own app (somewhat less of a good choice). This became painful because it turned out I hadn't found the correct seams, and needed cross-talk between the two APIs. Extracting all of the track-related stuff into the Trackler library was a good choice, which cleaned up some painful cross-talk, but didn't eliminate it. It also has some deploy-related pains (but those are unrelated to this pull request).

Anyway—by moving the x-api back into the main website app, we get rid of the final bit of cross-talk. Eventually I want to rewrite the entire API from scratch, taking into account all the things we've learned in the past four years. I think we have a much better idea of which abstractions we have and what the API needs to do.
I probably have some opinions about all this; I will need a few days to find the time to properly read through and add my comments.
Very useful! I think this would best reflect what the user expects. I myself would at least expect the same result when I fetch an exercise; most users might not even be aware that exercises can change. However, we should display a warning if an updated version of the exercise is available, to allow the user to update to the latest version when they choose to. The warning message should say how the user can update (i.e. which CLI command to use).
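A sketch of how such a warning could be produced server-side, assuming we store the digest of what the user originally fetched (the method name and the CLI command in the message are illustrative assumptions):

```ruby
# Hypothetical sketch: compare the snapshot the user has against the
# current implementation and attach an update hint when they differ.
def update_warning(stored_digest, current_digest, slug)
  return nil if stored_digest == current_digest

  "A newer version of '#{slug}' is available. " \
    "Run `exercism fetch` again to update your copy."  # assumed CLI usage
end
```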
I like this a lot. Being able to link what is submitted to the tests the code was written against is very useful, especially when a track/exercise changes regularly.
I think that makes sense.
What is the current status of this? Reading through all the above, I was concerned that this was making things terribly complicated, and the solution I was going to propose was moving the README into the tracks. This seems to have already been suggested here: #158
This is dead and superseded by #158.
When the test suite of an exercise changes, we run into three potential problems:

1. Reviewers can't easily tell which version of the test suite a solution was written against.
1. When someone downloads an existing solution, it comes with the current test suite, which may not be the one the solution was written for.
1. When someone restores their previous work, the tests they get back may have changed since they solved the exercise.
These are all symptoms of the same problem, I think:
It is currently non-trivial (or perhaps impossible) to know exactly which version of a test suite someone wrote their solution against.
Some tracks have implemented exercise versioning by adding a version value and a test that checks it. That kind of fixes the first problem: reviewers can see that this solution was written against a particular version of the tests.
It does not solve either of the problems with downloading or restoring solutions.
It also introduces a problem: this version has nothing to do with learning the language or doing the exercises—it's a workaround that puts a burden on the learner.
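For readers unfamiliar with the workaround, it looks roughly like this (modeled on the Ruby track's `BookKeeping::VERSION` convention; details vary per track), and the learner must bump the hard-coded constant whenever the tests change:

```ruby
# In the test suite shipped to the learner:
def test_bookkeeping
  assert_equal 2, BookKeeping::VERSION
end

# In the learner's solution file, alongside their actual solution:
module BookKeeping
  VERSION = 2  # must be updated by hand when the exercise changes
end
```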
I would like to fix this in a way that solves all three of our problems and puts no burden on the people solving the exercise.
I think this means that we need to store, in some way, exactly the exercise that the person fetched, so that we can refer to it later. While the biggest problem is the test suite, I think it is worth also storing the exact README that they received, since instructions can change and that could confuse matters in subtle ways.
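Concretely, that could mean persisting a snapshot at fetch time, along the lines of this sketch (the struct and field names are assumptions for illustration):

```ruby
require 'securerandom'

# Hypothetical sketch: record exactly what the user received, so reviews,
# downloads, and restores can refer back to it later.
Snapshot = Struct.new(:uuid, :track_id, :slug, :readme, :test_files, :fetched_at)

def snapshot_fetch(track_id, slug, files)
  Snapshot.new(
    SecureRandom.uuid,
    track_id,
    slug,
    files['README.md'],                                # assumed file layout
    files.select { |path, _| path.include?('test') },
    Time.now.utc
  )
end
```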
How would you suggest solving this? Don't worry too much about how the site is currently doing things. We can change that. We won't be able to do anything about existing solutions, but we will be able to fix it for all solutions going forward.