Well, I didn't mean an actual MapReduce cluster, that's just silly. I was just pointing out that Haskell compilers can trivially convert these sorts of things to multicore code. Parallel Haskell basically does this, turning ordinary code into multicore code with little effort.
My point wasn't about using Haskell for high performance computing; my point was that Haskell's compiler can convert normal Haskell code to multicore code without (much, if any) work from the programmer. There is a large gap between a sequential single-threaded program and a full MapReduce cluster, and I think a significant amount of software is written in this gap where Haskell-like languages could help with the parallelism/concurrency. The only reason I referenced MapReduce is that it's a well-known example even for people who are not well versed in functional programming (it's a very common concept in functional programming).
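As a rough sketch of what I mean, here's ordinary pure Haskell made parallel with GHC's `Control.Parallel.Strategies` (from the `parallel` package). The `fib` function and the input range are just illustrative; the point is that the code itself is unchanged apart from swapping `map` for `parMap`:

```haskell
import Control.Parallel.Strategies (parMap, rdeepseq)

-- An illustrative expensive pure function (deliberately naive)
fib :: Int -> Int
fib n
  | n < 2     = n
  | otherwise = fib (n - 1) + fib (n - 2)

main :: IO ()
main =
  -- parMap evaluates each list element in parallel; rdeepseq forces
  -- each result fully, so the work actually happens on the sparks.
  print (sum (parMap rdeepseq fib [25 .. 34]))
```

Compiled with `ghc -threaded -O2` and run with `+RTS -N`, the runtime spreads the list elements across cores; because the code is pure, there's no locking or data-race worry for the programmer.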