Reputation: 12283
I want to invoke a process from within a Haskell program and capture stdout as well as stderr.
What I do:
(_, stdout, stderr) <- readProcessWithExitCode "command" [] ""
The problem: this way, stdout and stderr are captured separately, but I want the messages to appear in the order they were produced (otherwise I could simply do stdout ++ stderr,
which separates the error messages from their stdout counterparts).
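For reference, a minimal self-contained version of what I'm doing now, with "command" standing in for the real program, looks like this:

import System.Process (readProcessWithExitCode)

-- Captures stdout and stderr separately; their relative ordering is lost.
main :: IO ()
main = do
    (code, out, err) <- readProcessWithExitCode "command" [] ""
    print code
    putStr (out ++ err)   -- all error messages end up after all of stdout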
I know that I could achieve this by piping the output into a file, i.e.
tmp <- openFile "temp.file" ...
createProcess (proc "command" []) { std_out = UseHandle tmp
                                  , std_err = UseHandle tmp }
So my current workaround is to pipe both outputs to a temp file and read it back in. However, I'm looking for a more direct approach.
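Spelled out, that workaround looks roughly like this (runViaTempFile and temp.file are just placeholder names, and error handling is omitted):

import System.IO
import System.Process

-- Sketch of the temp-file workaround: both streams go into the same file,
-- which is read back once the process has finished.
runViaTempFile :: String -> [String] -> IO String
runViaTempFile cmd args = do
    tmp <- openFile "temp.file" WriteMode
    (_, _, _, ph) <- createProcess (proc cmd args)
        { std_out = UseHandle tmp
        , std_err = UseHandle tmp
        }
    _ <- waitForProcess ph
    hClose tmp              -- a no-op if createProcess already closed it
    readFile "temp.file"    -- lazily read the merged output back in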
If I were only targeting Unix, I'd simply invoke a shell command à la
command 2>&1
and that would be it. However, I'd like to keep this as portable as possible.
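From Haskell that could be done by going through the shell, e.g. as below; this is non-portable, and the choice of sh is just illustrative:

import System.Process (readProcessWithExitCode)

-- Non-portable: have a POSIX shell merge the two streams via 2>&1.
mergedViaShell :: String -> IO String
mergedViaShell cmd = do
    (_, out, _) <- readProcessWithExitCode "sh" ["-c", cmd ++ " 2>&1"] ""
    return out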
What I need this for: I've built a tiny Haskell CGI script (just to play with it) which invokes a certain program and prints the output. I want to HTML-escape the output, so I can't simply pipe it straight to stdout.
I was thinking: maybe it's possible to create an in-memory handle, like a PipedInputStream/PipedOutputStream in Java, or a ByteArrayInputStream/ByteArrayOutputStream, which allows for processing IO streams within memory. I looked around on Hoogle for a function with a Handle in its type, but did not find anything.
Maybe there is another Haskell module out there which allows me to merge two streams?
Upvotes: 11
Views: 1544
Reputation: 633
For POSIX systems you can use createPipe and fdToHandle from System.Posix.IO to create a pair of connected handles (note that the parent's copy of the write end has to be closed, and the output forced, before waiting for the process, or the read side can block forever):
import Control.Exception (evaluate)
import System.Exit (ExitCode)
import System.IO (hClose, hGetContents)
import qualified System.Posix.IO as Posix
import System.Process

readProcessWithMergedOutput :: String -> IO (ExitCode, String)
readProcessWithMergedOutput command = do
    (p_r, p_w) <- Posix.createPipe
    h_r <- Posix.fdToHandle p_r
    h_w <- Posix.fdToHandle p_w
    (_, _, _, h_proc) <- createProcess (proc command [])
        { std_out = UseHandle h_w
        , std_err = UseHandle h_w
        }
    hClose h_w                      -- close the parent's copy so the reader sees EOF
    content <- hGetContents h_r
    _ <- evaluate (length content)  -- force the output before waiting, to avoid a pipe-buffer deadlock
    ret_code <- waitForProcess h_proc
    hClose h_r
    return (ret_code, content)
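A minimal usage sketch, with ls as an example command:

main :: IO ()
main = do
    (code, output) <- readProcessWithMergedOutput "ls"
    putStrLn ("exit code: " ++ show code)
    putStr output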
For Windows, this post implements a cross-platform createPipe.
Upvotes: 5
Reputation: 35089
You can use the pipes package to merge two input streams. The first trick is to read from both streams concurrently, which you can do using the stm package:
import Control.Applicative
import Control.Concurrent
import Control.Concurrent.STM
import Control.Monad (forever)
import Control.Monad.Trans.Class (lift)
import Control.Proxy
import System.Process

toTMVarC :: (Proxy p) => TMVar a -> () -> Consumer p a IO r
toTMVarC tmvar () = runIdentityP $ forever $ do
    a <- request ()
    lift $ atomically $ putTMVar tmvar a

fromTMVarS :: (Proxy p) => TMVar a -> () -> Producer p a IO r
fromTMVarS tmvar () = runIdentityP $ forever $ do
    a <- lift $ atomically $ takeTMVar tmvar
    respond a
I will soon provide the above primitives in a pipes-stm
package, but use the above for now.
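fromTMVarS isn't needed in the example below, but for illustration you would run it like any other producer; consumeFromTMVar is just a made-up name:

-- Hypothetical helper: print every String that arrives in the given TMVar.
consumeFromTMVar :: TMVar String -> IO ()
consumeFromTMVar tmvar = runProxy $ fromTMVarS tmvar >-> putStrLnD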
Then you just feed each Handle into a separate TMVar and read from both concurrently:
main = do
    (_, mStdout, mStderr, _) <- createProcess (proc "ls" [])
        { std_out = CreatePipe
        , std_err = CreatePipe
        }
    case (,) <$> mStdout <*> mStderr of
        Nothing -> return ()
        Just (stdout, stderr) -> do
            out <- newEmptyTMVarIO
            err <- newEmptyTMVarIO
            forkIO $ runProxy $ hGetLineS stdout >-> toTMVarC out
            forkIO $ runProxy $ hGetLineS stderr >-> toTMVarC err
            let combine () = runIdentityP $ forever $ do
                    str <- lift $ atomically $
                        takeTMVar out `orElse` takeTMVar err
                    respond str
            runProxy $ combine >-> putStrLnD
Just swap out putStrLnD for however you want to process each line.
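For the HTML-escaping CGI use case from the question, one option is to map an escaping function over the stream before it is printed; escapeHtml below is just a hypothetical helper, and mapD is the mapping pipe from the pipes prelude:

-- Hypothetical helper: escape the characters that matter in HTML.
escapeHtml :: String -> String
escapeHtml = concatMap escape
  where
    escape '&' = "&amp;"
    escape '<' = "&lt;"
    escape '>' = "&gt;"
    escape c   = [c]

-- Then replace the last stage of the pipeline:
--     runProxy $ combine >-> mapD escapeHtml >-> putStrLnD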
To learn more about the pipes
package, just read Control.Proxy.Tutorial.
Upvotes: 6