
RFC: Add support for "#! /usr/bin/env cabal" #3843

Closed
hvr opened this issue Sep 15, 2016 · 6 comments · Fixed by #5483
Comments

@hvr
Member

hvr commented Sep 15, 2016

When I start hacking on a new program, I often start with a single module, and maybe lazily upgrade to a proper full .cabal+cabal.project project when a single module becomes too inconvenient. Or not at all. Also, a single small .hs file is easier to share, move around, or post in blog posts than a project requiring multiple files.

One common issue with one-module runghc scripts is that we can't easily associate dependency information with them, as they implicitly reference the global/user-pkg dbs. But in the light of new-build, it would be quite convenient to have one-module scripts carry a subset of build-info meta-data with them so that cabal could bring the necessary packages into scope for that script before invoking runghc.

Therefore I'd suggest adding support for this workflow to cabal. Here's a simple example of what I envision:

#! /usr/bin/env cabal
{- cabal:
index-state: 2016-09-15T12:00:00Z
with-compiler: ghc-8.0.1
build-depends: base ^>= 4.9
             , shelly ^>= 1.6.8
-}

main :: IO ()
main = do
   ...
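For concreteness, here is a complete, runnable variant of the envisioned script format (the dependency bound is just an example, and the shelly dependency from the sketch above is dropped so the body stays base-only):

```haskell
#! /usr/bin/env cabal
{- cabal:
build-depends: base ^>= 4.9
-}

-- The whole program, including its dependency metadata, lives in this
-- single .hs file; the idea is that cabal would read the {- cabal: -}
-- block, bring the listed packages into scope, and run the file.
main :: IO ()
main = putStrLn "hello from a single-file cabal script"
```

GHC treats a leading `#!` line as a comment, so the same file works both as an executable script and as an ordinary module.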

One design question is whether such a script should auto-install the required dependencies if they're not already cached, or instead abort, telling the user which command to execute in order to populate the cache.

Another is whether cabal should compile the script into an executable, store that in some nix-store-like cache (indexed by a computed hash), and run it.
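The nix-store-like cache idea could be sketched as keying compiled artifacts by a content hash of the script's text. A toy version using FNV-1a (a real design would use a cryptographic hash and likely fold in the install plan too; the `~/.cabal/script-cache` path is purely hypothetical):

```haskell
import Data.Bits (xor)
import Data.Char (ord)
import Data.Word (Word64)

-- FNV-1a over the script's text; a stand-in for a real content hash.
-- Word64 arithmetic wraps on overflow, which FNV relies on.
contentHash :: String -> Word64
contentHash = foldl step 0xcbf29ce484222325
  where
    step h c = (h `xor` fromIntegral (ord c)) * 0x100000001b3

-- Hypothetical cache location for a compiled script, indexed by its hash.
cacheEntry :: String -> FilePath
cacheEntry script = "~/.cabal/script-cache/" ++ show (contentHash script)

main :: IO ()
main = putStrLn (cacheEntry "main = putStrLn \"hi\"")
```

Running the script again with unchanged text hashes to the same entry, so the cached executable can be reused without recompiling.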

@dcoutts
Contributor

dcoutts commented Sep 15, 2016

Quite plausible. Just needs a clear design.

@BardurArantsson
Collaborator

BardurArantsson commented Sep 15, 2016

> One design question is whether such a script should auto-install the required dependencies if they're not already cached (and instead abort, telling the user which command to execute in order to populate the cache).

I'm not sure I see why it shouldn't just auto-install (nix-style) the dependencies. Surely that's basically the purpose of this feature in the first place... to Just Do It(TM)?

EDIT: Same thing for the auto-run bit. (I suppose it doesn't have to "compile" it, but could use the interpreter; but if we're talking small scripts, then I don't suppose there's particularly much to be gained by not compiling?)

@hvr
Member Author

hvr commented Sep 15, 2016

Btw, another thing to consider is that we don't want to rerun the expensive solver on each invocation when no new install plan is to be expected: when we specify an index-state, that means whenever the .hs file hasn't changed; and when we don't specify an index-state, whenever neither the .hs file nor 01-index.tar has changed since the last execution.
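The staleness condition above can be stated as a small pure predicate: re-run the solver only if the script changed since the cached plan was produced, or, when no index-state is pinned, if the package index did. A sketch, with an ordered timestamp type standing in for real change detection:

```haskell
-- | Decide whether the solver must be re-run for a script.
needsResolve
  :: Ord t
  => t        -- modification time of the .hs file
  -> Maybe i  -- pinned index-state, if the script specifies one
  -> t        -- modification time of 01-index.tar
  -> t        -- time the cached install plan was produced
  -> Bool
needsResolve scriptMTime pinnedIndex indexMTime planMTime =
  scriptMTime > planMTime
    || case pinnedIndex of
         Just _  -> False                   -- pinned: index updates can't change the plan
         Nothing -> indexMTime > planMTime  -- unpinned: a newer index invalidates the plan

main :: IO ()
main = print (needsResolve (5 :: Int) Nothing 3 4)  -- script newer than plan
```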

@ezyang
Contributor

ezyang commented Sep 15, 2016

> I'm not sure I see why it shouldn't just auto-install (nix-style) the dependencies. Surely that's basically the purpose of this feature in the first place... to Just Do It(TM)?

I think there is one good argument against doing it automatically, which is that it can mask performance problems. In the abstract: suppose that, for whatever reason, you expect all the deps to have already been installed. If you run the script and it goes off and starts building things, a performance invariant has been violated. If the script doesn't error, it's easy not to notice, and now you're paying a nontrivial perf tax to run your script.

It's all about expectations. It's unpleasant to run stack and then suddenly a GHC binary starts downloading to your machine. Same thing here: you wanted to run a script, not start a multi-minute build process to get it going.

@ezyang ezyang added this to the milestone Sep 15, 2016
@ezyang
Contributor

ezyang commented Sep 15, 2016

@hvr, I milestoned this as bottom. When there's a full design OR someone has prototyped it, we can remilestone it.

@dcoutts
Contributor

dcoutts commented Sep 19, 2016

Re the perf expectations question, I think this is a case where a flag may make sense. It seems to me the default ought to be to build packages (but not download GHC), with flags to modify both behaviours: i.e. assume that deps are built already and fail if not; or, if in future we support installing big things like compilers, also do that when needed.

So, just as we have fetch and --offline to control network access, we have build --only-dependencies but no corresponding "assume/require deps to be installed first or fail" flag. Adding that would solve the problem ezyang is talking about (though I think the default should be the other way around vs what ezyang proposes).
