my personal opinion: there's a lot to like about Haskell. I list some of the cons at [1].
toread:
http://news.ycombinator.com/item?id=4721550
http://neugierig.org/software/blog/2011/10/why-not-haskell.html
---
"
> Haskell and Ruby have continuations
I wonder where this meme of Haskell having continuations started. It pops up here and there, like some mystical incantation that makes the speaker closer to the gods of programming languages.
Haskell has precisely the same continuations JS has, it's just capable of abstracting CPS away. Somewhat.
See http://hackage.haskell.org/pac... . "
"
DanWaterworth 4 days ago
Haskell doesn't have first-class continuations; it has syntactic sugar that allows you to write continuation-passing code in a direct style. "
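The point being made above can be sketched in plain Haskell: continuations are not a built-in language feature, just CPS dressed up in monadic sugar. Below is a minimal, base-only continuation monad (a simplified version of what the mtl package's `Control.Monad.Cont` provides; the names here are written from scratch for illustration):

```haskell
-- A minimal continuation monad built from nothing but base.
-- A value of type `Cont r a` is a computation that, given a
-- continuation (a -> r), produces an r.
newtype Cont r a = Cont { runCont :: (a -> r) -> r }

instance Functor (Cont r) where
  fmap f (Cont c) = Cont $ \k -> c (k . f)

instance Applicative (Cont r) where
  pure a = Cont ($ a)
  Cont cf <*> Cont ca = Cont $ \k -> cf (\f -> ca (k . f))

instance Monad (Cont r) where
  Cont c >>= f = Cont $ \k -> c (\a -> runCont (f a) k)

-- callCC: the "first-class-looking" escape operator, itself just CPS.
-- Invoking the captured continuation discards the local continuation.
callCC :: ((a -> Cont r b) -> Cont r a) -> Cont r a
callCC f = Cont $ \k -> runCont (f (\a -> Cont (\_ -> k a))) k

-- Escape early: the `pure 999` after `exit 20` is never reached.
example :: Int
example = runCont (callCC (\exit -> do
  _ <- exit 20
  pure 999)) id
```

So the "Haskell has continuations" claim reduces to: Haskell can express CPS and abstract over it with do-notation, which is not the same as the first-class continuations of, say, Scheme's `call/cc`.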
--- http://www.quora.com/Haskell/What-are-the-main-weaknesses-of-Haskell-as-a-programming-language
" What are the main weaknesses of Haskell as a programming language?
Jesse Tov, writes Haskell all day. 92 votes by Marc Bodnick, Omer Zach, Matthew Hill, (more) Haskell’s main technical weakness compared to popular, mainstream languages is probably the lack of a comprehensible cost model. Essentially, laziness makes it difficult to reason about when computation happens, and it can result in serious space leaks. While GHC can produce very efficient code, convincing it to do so in cases where that isn’t the default behavior is a dark art with few real masters. I’ve been writing Haskell for over a decade and using it as my main language at work for at least five years, and I’ve never learned to do this beyond simple cases—mainly because I don’t write code that needs to run efficiently.
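The space-leak point above has a classic small demonstration: a lazy left fold over a large list builds a chain of unevaluated thunks before anything is forced, while the strict variant runs in constant space. Both functions below compute the same result; only their memory behavior differs:

```haskell
import Data.List (foldl')

-- Lazy left fold: the accumulator becomes the thunk
-- (((0 + 1) + 2) + 3) + ... , which is only forced at the very end
-- and can exhaust the heap on large inputs.
leakySum :: [Int] -> Int
leakySum = foldl (+) 0

-- Strict left fold: forces the accumulator at every step,
-- so it runs in constant space.
strictSum :: [Int] -> Int
strictSum = foldl' (+) 0
```

This is exactly the "comprehensible cost model" problem: the two definitions are denotationally identical, and nothing in the types warns you that one of them leaks.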
There are some other things that are easy in many other languages but difficult in Haskell. For example, purity makes it slightly painful to discover and propagate run-time configuration information. In most languages it’s easy to read a configuration file at startup and store the result in a global variable, but in Haskell that requires dirty tricks.
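The "dirty tricks" alluded to for global configuration usually mean a top-level `IORef` created with `unsafePerformIO`. A sketch of the idiom (the `NOINLINE` pragma is essential: without it GHC may inline the `unsafePerformIO` call and create several distinct refs):

```haskell
import Data.IORef
import System.IO.Unsafe (unsafePerformIO)

-- A global, mutable configuration slot. NOINLINE prevents GHC from
-- duplicating the unsafePerformIO call, which would silently create
-- multiple independent IORefs.
{-# NOINLINE configRef #-}
configRef :: IORef String
configRef = unsafePerformIO (newIORef "default")

-- Typically called once at startup, after reading a config file.
setConfig :: String -> IO ()
setConfig = writeIORef configRef

getConfig :: IO String
getConfig = readIORef configRef
```

It works, but it is exactly the kind of escape hatch purity was supposed to rule out, which is why the quote calls it a dirty trick rather than a solution.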
Haskell has some other weaknesses compared to other academic/research languages. For example, its module system (if you can even call it that) is clearly inferior to Standard ML’s and OCaml’s module systems. Its compile-time meta-programming facility, Template Haskell, is occasionally useful, but it’s a poor substitute for a real, state-of-the art macro system. And of course, there are many things that some languages can do but others can’t. For example, if you want to prove your programs correct, Haskell’s type system, even with recent extensions in that direction, won’t let you prove as much as Coq or Agda—but that’s asking a language to do something that it wasn’t designed for, so I don’t really consider it a valid criticism.
From a theoretical perspective, some people have complained that call-by-need makes Haskell harder to reason about. It’s true that Haskell’s function space is a bit messier than what you get in a language that’s both pure and total, but I don’t buy that it’s worse than your standard impure, call-by-value language—it’s just different.
Finally, Haskell does suffer some social weaknesses by being weird and researchy. Hiring large numbers of Haskell programmers is going to be difficult (though the ones you do find are likely to be very good). Because it’s so different from the languages that most people know, it presents a potentially steep learning curve. And of course, the community, while helpful and prodigious, is tiny compared to more popular languages. The variety and quality of libraries available on Hackage is impressive given the number of Haskell programmers out there, but it cannot compete with Java for mindshare. Finally, because Haskell is still very research-oriented, it changes quickly and sometimes unpredictably. I’ve found that my Haskell code breaks and needs to be updated not only for major GHC version increments but often for minor versions as well. This is annoying, but it’s a price I’m willing to pay to program in a beautiful language with top-notch abstraction facilities.
--
http://www.yesodweb.com/blog/2011/09/limitations-of-haskell
---
"Namespace clashing, particularly for record fields
data Record = Record { a :: String }
data RecordClash = RecordClash { a :: String }
Compiling this file results in:
record.hs:2:34:
    Multiple declarations of `Main.a'
    Declared at: record.hs:1:24
                 record.hs:2:34 "
solution ideas:
http://ghc.haskell.org/trac/haskell-prime/wiki/TypeDirectedNameResolution
however a comment on http://www.yesodweb.com/blog/2011/09/limitations-of-haskell notes that typeclasses are a better solution (so we should have everythingIsATypeclass, just as i've been saying for jasper)
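A sketch of that typeclass approach: give each record a distinct concrete field name, then recover the shared accessor through a class. (`HasA` and the renamed fields are hypothetical names for illustration, not from the linked post.)

```haskell
-- Two records whose fields would clash if both were named `a`.
data Record      = Record      { recA   :: String }
data RecordClash = RecordClash { clashA :: String }

-- A shared accessor recovered via a typeclass (hypothetical name).
class HasA r where
  a :: r -> String

instance HasA Record      where a = recA
instance HasA RecordClash where a = clashA
```

Now `a` works uniformly on both types, and the compiler resolves the right field by the record's type rather than by name.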
---
history via Peyton-Jones
http://research.microsoft.com/en-us/um/people/simonpj/papers/haskell-retrospective/index.htm
http://research.microsoft.com/en-us/um/people/simonpj/papers/history-of-haskell/
typeclasses, also with a useful intro to how the compiler implements typeclasses http://research.microsoft.com/en-us/um/people/simonpj/papers/haskell-retrospective/ECOOP-July09.pdf
---
" The second problem with Monads is related to their great strength - they are synonymous with Domain Specific Languages. The mission statement for domain specific languages is stupendous - don't write a complex program to solve your problem, write a simple program in a programming language you have designed solely for that task. I've already mentioned the best use of DSL/Monads - Haskell's parsec module. With Parsec the Haskell function to parse a file is identical to the Backus Naur Form description of the parse grammar - how much clearer could it be? They say that imitation is the highest form of flattery, and every parser function I have written outside of Haskell since meeting Parsec has been a shoddy facsimile of Parsec in the chosen language. The success of Parsec and its ilk has filled Hackage (the Haskell module repository) with hundreds of DSLs covering any task you care to mention.
Yes, literally hundreds of them. Hundreds of little programming languages, one for BNF parsing, one for parsing xml, one for creating PDFs, all perfectly suited to their task. Each is different, and each has its own learning curve. Consider a common task such as parsing an XML file, mutating it according to some JSON you pulled out of a web API and then writing to a PDF. In Ruby or a similar object oriented language you expect to find three APIs/gems, all with a similar object oriented syntax, but for three Haskell DSLs designed for three different tasks to share syntax implies that their authors failed to optimise them for those tasks, hence instead of five minutes with API documentation you have hours of DSL tutorials ahead of you before you can begin work. It is this subtle difference that plants Haskell on the academic side of the town and gown divide. With Haskell the focus is on the language, not the problems you are solving with the language. "
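To give a taste of the parser-combinator style Parsec popularised, here is a sketch using `ReadP` from base rather than Parsec itself (the APIs differ, but the BNF-like shape of the code is the same):

```haskell
import Text.ParserCombinators.ReadP
import Data.Char (isDigit)

-- Grammar: pair ::= '(' number ',' number ')'
-- The parser reads almost line-for-line like the BNF production.
number :: ReadP Int
number = read <$> munch1 isDigit

pair :: ReadP (Int, Int)
pair = do
  _ <- char '('
  x <- number
  _ <- char ','
  y <- number
  _ <- char ')'
  pure (x, y)

-- Accept only parses that consume the whole input.
parsePair :: String -> Maybe (Int, Int)
parsePair s = case [r | (r, "") <- readP_to_S pair s] of
  [r] -> Just r
  _   -> Nothing
```

This closeness between grammar and code is exactly the strength the quote credits to Parsec, and also the seed of its complaint: every such DSL is its own little language to learn.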
---
probs with laziness, esp. lazy I/O: http://users.aber.ac.uk/afc/stricthaskell.html
--
"
In Haskell it’s easy to parallelize code, but performance is not only about using your CPU cores, and it’s hard to write code with performance comparable to C++ code.
Today, CPU silicon is invested principally in one thing: cache.
You solve one problem, but you regress on others: memory usage.
Haskell promotes a coding style where your data is sparse (terrible data locality), and even basic data types are boxed.
Use a vector? Not idiomatic Haskell! You have to use some tree!! So your code will execute on all your cores, but all your cores will spend the majority of their time on frontend stalls! Not a great speedup "
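The boxing complaint can be illustrated with the base-only knob Haskell offers for it: strict fields with the `UNPACK` pragma, which ask GHC to store raw values inline in the constructor rather than as pointers to heap-allocated boxes (this is, wholesale, what `Data.Vector.Unboxed` does for arrays). A minimal sketch:

```haskell
-- Default representation: each field is a pointer to a possibly
-- unevaluated, heap-boxed Double -- poor data locality.
data BoxedPoint = BoxedPoint Double Double

-- Strict + UNPACK: GHC stores the two raw Doubles inline in the
-- constructor, removing a layer of indirection per field.
data Point = Point {-# UNPACK #-} !Double {-# UNPACK #-} !Double

dot :: Point -> Point -> Double
dot (Point x1 y1) (Point x2 y2) = x1 * x2 + y1 * y2
```

The two types behave identically for total code; only their memory layout differs, which is why the locality cost is easy to overlook until it shows up in a profile.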
--
" If people write custom allocators, it’s not because they are fools.
And bad cache usage in a multi-threaded environment is even worse than in a single-threaded one: the MESI protocol will take even more time.
And in all of this, not a word on NUMA architectures, which will probably be predominant in the future. How can Haskell be made NUMA-aware?
You are making it sound like “we don’t need to worry about low-level details”… where is my API for CPU affinity? When a thread is scheduled onto another core, the data set has to be read twice from memory. This matters in high-performance code.
If the OS exposes APIs like that, it’s not because they are fools "
" Bartosz Milewski Says:
September 19, 2013 at 6:24 pm
@Nicola: I really recommend Simon Marlow’s book. It will answer your questions. Locality and unboxing is very important indeed and, at least for matrix calculations, is provided by the Repa library in Haskell. GPU code is produced by the Accelerate library. On GPUs you usually have to copy your data to private GPU memory, and Accelerate does it for you. These libraries are all about scalable performance. Parallel programming is pretty useless if it can’t speed programs up. "
" Edward Kmett Says:
September 19, 2013 at 8:17 pm
Nicola,
I’ve been slowly building up a set of cache-oblivious yet still purely functional data structures in Haskell. Ryan Newton has been working towards NUMA-awareness. I have a decent high performance lock-free work-stealing deque and tbb-style task monad. With GHC 7.8 we’re getting primops for working with SIMD, prefetching, etc. and filling in other gaps in the high-performance spectrum.
Moreover I would challenge that the ‘vector isn’t idiomatic haskell’ argument is a few years old. Nowadays vector, repa, and accelerate are all pretty well en-meshed in the culture.
That said, CPU affinity _is_ still something we don’t have a story for. It bites me too. HECs don’t tend to move around, but that isn’t the best guarantee.
I won’t lie and say we have all of these things today, but we’re not ignoring them. Haskell is continuing to evolve and borrow good ideas. "
--
" thirsteh 36 days ago
This is highly anecdotal, but I've built and been part of very practical/non-theoretical, large Haskell projects (100k+ lines, which is a lot for Haskell). The only big complaint I have is that it's somewhat hard to do loose coupling, i.e. for something somewhere to reference a type without either redeclaring the type (when that's possible), or having a huge, centralized Types.hs that declares all the types that are used in different places (to avoid cyclic imports.) (Contrast with e.g. Go or ML where you have interfaces/modules without 'implements'.)
This isn't unique to Haskell by any means, but it's the only real complaint I have about Haskell as a language for non-toy projects. The benefits definitely make it my go-to language. It's hard to list them all, but by far the nicest feeling is the correctness: when your code compiles, 60% of the time your program works every time. (Not to imply that tests aren't necessary--QuickCheck?