haskell - Preventing "getCurrentDirectory: resource exhausted (Too many open files)" error


I'm trying to run a Parsec parser over a whole bunch of small files, and I'm getting an error saying I have too many open files. I understand that I need to use strict IO, but I'm not sure how to do that. This is the problematic code:

files = getDirectoryContents historyFolder

hands :: IO [Either ParseError [Hand]]
hands = join $ sequence <$> parseFromFile (many hand) <<$>> files

Note: the <<$>> function is defined as follows:

(<<$>>) :: (Functor f1, Functor f2) => (a -> b) -> f1 (f2 a) -> f1 (f2 b)
a <<$>> b = (a <$>) <$> b

I don't know what your parseFromFile function looks like right now (it would probably be a good idea to include it in the question), but I'm guessing you're using Prelude.readFile, which, as @Markus1189 points out, uses lazy I/O. For strict I/O, you need a strict readFile, such as Data.Text.IO.readFile.

A streaming data library like pipes or conduit would allow you to avoid reading the entire file into memory at once, although, to my knowledge, Parsec doesn't provide a streaming interface that would allow this to happen. attoparsec, on the other hand, does include such a streaming interface, and both pipes and conduit have attoparsec adapter libraries (e.g., Data.Conduit.Attoparsec). A rough sketch of that route follows.
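As a minimal sketch of the streaming approach (assuming the conduit and conduit-extra packages; handP is a hypothetical stand-in for the question's hand parser, since the real one isn't shown):

import Conduit (runConduitRes, sourceFile, (.|))
import Data.Conduit.Attoparsec (sinkParser)
import qualified Data.Attoparsec.ByteString.Char8 as A

-- Hypothetical stand-in for the question's `hand` parser:
-- here, one line of the file per "hand".
handP :: A.Parser String
handP = A.manyTill A.anyChar A.endOfLine

-- Stream one file through attoparsec; the file is read in
-- chunks and never held in memory all at once.
parseHandsStreaming :: FilePath -> IO [String]
parseHandsStreaming path =
  runConduitRes $ sourceFile path .| sinkParser (A.many' handP)

The file handle is managed by ResourceT and closed as soon as the conduit finishes, so handles can't pile up the way they do with lazy readFile.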

tl;dr: You need the following helper function:

import qualified Data.Text as T
import qualified Data.Text.IO as TIO

readFileStrict :: FilePath -> IO String
readFileStrict = fmap T.unpack . TIO.readFile
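To show how this plugs in (a sketch, not from the original answer), you could build a strict drop-in for Parsec's parseFromFile on top of the readFileStrict helper above; parseFromFileStrict is a hypothetical name:

import Text.Parsec (ParseError, parse)
import Text.Parsec.String (Parser)

-- Hypothetical strict variant of parseFromFile: the whole file is
-- read (and its handle closed) before parsing starts.
parseFromFileStrict :: Parser a -> FilePath -> IO (Either ParseError a)
parseFromFileStrict p path = parse p path <$> readFileStrict path

Substituting this for parseFromFile in the question's hands definition should stop handles from accumulating, since each file is fully read and closed before the next one is opened.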
