How to interpret stdout as utf8? #83
Comments
Fwiw, I'm currently using the following ugly non-reentrant hack as a workaround:

-- needs: import Shelly (Sh)
--        import Control.Monad.IO.Class (liftIO)
--        import GHC.IO.Encoding (getLocaleEncoding, setLocaleEncoding, utf8, textEncodingName)
withUtf8 :: Sh a -> Sh a
withUtf8 act = do
  oldloc <- liftIO getLocaleEncoding
  if textEncodingName oldloc == textEncodingName utf8
    then act
    else do
      liftIO $ setLocaleEncoding utf8
      r <- act
      liftIO $ setLocaleEncoding oldloc
      return r
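For reference, a minimal sketch of a slightly safer variant of that hack, restoring the previous encoding even if the action throws. It assumes Shelly's finally_sh is available; it is still not reentrant or thread-safe, since the locale encoding is process-global state.

import Shelly (Sh, finally_sh)
import Control.Monad.IO.Class (liftIO)
import GHC.IO.Encoding (getLocaleEncoding, setLocaleEncoding, utf8, textEncodingName)

-- Temporarily force the global locale encoding to UTF-8 around an action,
-- restoring the old encoding afterwards even on exceptions.
-- finally_sh is assumed to be exported by the installed Shelly version.
withUtf8Safe :: Sh a -> Sh a
withUtf8Safe act = do
  oldloc <- liftIO getLocaleEncoding
  if textEncodingName oldloc == textEncodingName utf8
    then act
    else do
      liftIO (setLocaleEncoding utf8)
      act `finally_sh` liftIO (setLocaleEncoding oldloc)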
I have never seen that predicament; I always change the system locale to a UTF-8 one. At the lower level, Shelly is reading from a Handle.
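To illustrate that approach, here is a minimal sketch that uses Shelly's setenv to hand a UTF-8 locale to spawned commands. The locale name "en_US.UTF-8" and the command "some-command" are placeholders; note this only changes the child's environment, while the Handle Shelly decodes the output from still uses the parent's GHC locale encoding.

{-# LANGUAGE OverloadedStrings #-}
import Shelly
import Data.Text (Text)

-- Give child processes a UTF-8 locale. "en_US.UTF-8" and "some-command"
-- are placeholder values for illustration only.
runWithUtf8Locale :: Sh Text
runWithUtf8Locale = do
  setenv "LANG" "en_US.UTF-8"
  run "some-command" []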
You may be able to just use …
I've got a similar problem with the Darcs test suite. It contains a couple of scripts with different encodings to test how Darcs handles that, and they fail when run via Shelly because the output isn't valid UTF-8 (I think). Hacking Shelly to do either … If you could add a hook to allow clients to do that properly, that'd be great. Happy to write the code myself if you can provide some guidance on how you'd like the hook to look.
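As a sketch of what such a hook might look like: newer Shelly versions appear to expose onCommandHandles and initOutputHandles (both assumed here), which can switch a command's output handles to char8 so the bytes come through undecoded. The script path is a placeholder.

{-# LANGUAGE OverloadedStrings #-}
import Shelly
import System.IO (hSetEncoding, char8)
import Data.Text (Text)

-- Run a command whose output may not be valid UTF-8 by reading its
-- stdout/stderr byte-for-byte (char8). onCommandHandles and
-- initOutputHandles are assumed to exist in the Shelly version in use.
runRaw :: Sh Text
runRaw =
  onCommandHandles (initOutputHandles (\h -> hSetEncoding h char8)) $
    run "./some-test-script" []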
@hsenag I believe you are reporting a different issue. This issue is about setting the locale for UTF-8 data; your issue is that you want to work with binary data. Let's make a separate GitHub issue for that.
If I run something like a command whose output is UTF-8 encoded, but the environment has a non-UTF-8 locale set such as LANG=C, then I get a character-decoding error. What's the recommended way to run a command which is known to always output UTF-8 to its stdout/stderr, regardless of any LANG setting?
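To make one possible answer concrete, here is a minimal sketch that forces GHC's locale encoding to UTF-8 before the command runs, so the pipes Shelly creates for the child are decoded as UTF-8 regardless of LANG. The executable name "some-utf8-tool" is a placeholder.

{-# LANGUAGE OverloadedStrings #-}
import Shelly
import GHC.IO.Encoding (setLocaleEncoding, utf8)
import System.IO (hSetEncoding, stdout)
import qualified Data.Text.IO as TIO

-- Force UTF-8 decoding for newly created Handles (including the pipes
-- Shelly attaches to the command), independent of the LANG setting.
-- "some-utf8-tool" stands in for a command that always emits UTF-8.
main :: IO ()
main = do
  setLocaleEncoding utf8
  hSetEncoding stdout utf8   -- re-encode the already-created stdout handle too
  out <- shelly $ run "some-utf8-tool" []
  TIO.putStrLn out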