Learning Functional Programming through Multimedia

ijones writes "Andrew Cooke recently reviewed for Slashdot Purely Functional Data Structures, which is on my book shelf next to The Haskell School of Expression: Learning Functional Programming through Multimedia by Paul Hudak from the Yale Haskell Group. In his review, Cooke presented an overview of some available functional programming languages, such as OCaml, SML, and of course Haskell. Havoc Pennington once called Haskell 'the least-broken programming language available today.' Haskell is a purely functional, lazy, statically typed programming language. You can read more about Haskell itself here." Read on for ijones' review of The Haskell School of Expression.
The Haskell School of Expression: Learning Functional Programming through Multimedia
author: Paul Hudak
pages: 363
publisher: Cambridge University Press
rating: 9
reviewer: Isaac Jones
ISBN: 0521644089
summary: Learn to program in a functional style in Haskell by implementing graphics, music, and robot simulations.

As the title implies, The Haskell School of Expression introduces functional programming through the Haskell programming language and through the use of graphics and music. It serves as an effective introduction to both the language and the concepts behind functional programming. This text was published in 2000, but since Haskell 98 is the current standard, this is still a very relevant book.

Haskell's standardization process gives us a window into two different facets of the community: Haskell is designed to be both a stable, standardized language (called Haskell 98), and a platform for experimentation in cutting-edge programming language research. So though we have a standard from 1998, the implementations (both compilers and interpreters) are continually evolving to implement new, experimental features which may or may not make it into the next standard.

For instance, the Glasgow Haskell Compiler has implemented a meta-programming environment called Template Haskell. Haskell is also easy to extend in directions that don't change the language itself, through the use of Embedded Domain-Specific Languages (EDSLs) such as WASH for web authoring, Parsec for parsing, and Dance (more of Paul Hudak's work) for controlling humanoid robots.
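
To give a flavor of the EDSL style, here is a small sketch of my own (using the classic Parsec combinator API, not code from the book or from WASH/Dance): a parser for comma-separated integers is just ordinary Haskell built from library combinators.

import Text.ParserCombinators.Parsec

-- A tiny embedded language: a grammar for comma-separated integers.
csvInts :: Parser [Int]
csvInts = number `sepBy` char ','
  where number = do ds <- many1 digit
                    return (read ds)

main :: IO ()
main = print (parse csvInts "(input)" "10,20,30")

Running main prints Right [10,20,30].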

Before we get too far, I should offer a disclaimer: The Haskell community is rather small, and if you scour the net, you may find conversations between myself and Paul Hudak or folks in his research group, since I use some of their software. That said, I don't work directly with Hudak or his research group.

In fact, the small size of the Haskell community is a useful feature. It is very easy to get involved, and folks are always willing to help newbies learn, since we love sharing what we know. You may even find that if you post a question about an exercise in The Haskell School of Expression, you'll get a reply from the author himself.

I consider this book to be written in a "tutorial" style. It walks the reader through the building of applications, but doesn't skimp on the concepts (indeed, the chapters are meant to alternate between "concepts" and "applications"). In some ways, the code examples make it a little difficult to jump around, since you are expected to build upon previous code. The web site provides code, however, so you can always grab that and use it to fill in the missing pieces.

For readers who wish to use this book as a tutorial, and to implement all of the examples (which is highly recommended), I suggest that you grab the Hugs interpreter and read the User's Guide while you're reading the first few chapters of The Haskell School of Expression. Hugs is very portable, free, and easy to use. It also has an interface with Emacs. Unfortunately, some of the example code has suffered from bit-rot, and certain things don't work out-of-the-box for X11-based systems. The bit-rot can be solved by using the "November 2002" version of Hugs. This is all explained on SOE's web page.

The Haskell School of Expression should be very effective for programmers who have experience in more traditional languages, and programmers with a Lisp background can probably move quickly through some of the early material. If you've never learned a functional language, I highly recommend Haskell: Since Haskell is purely functional (unlike Lisp), it will more or less prevent you from "cheating" by reverting to a non-functional style. In fact, if you've never really looked at functional programming languages, it may surprise you to learn that Haskell has no looping constructs or destructive assignment (that is, no x = x + 1). All of the tasks that you would accomplish through the use of loops are accomplished instead through recursion, or through higher-level abstractions upon recursion.
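
For concreteness, here is a tiny sketch of my own (not from the book) of what takes the place of a loop: the same sum written with explicit recursion and then with a fold, one of those higher-level abstractions.

-- Summing 1..n without a loop or a mutable counter.
sumTo :: Int -> Int
sumTo n | n <= 0    = 0
        | otherwise = n + sumTo (n - 1)

-- The same task via a fold built on recursion.
sumTo' :: Int -> Int
sumTo' n = foldr (+) 0 [1 .. n]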

Since I was already comfortable with recursion when I started this book, it is hard for me to gauge how a reader who has never encountered recursion would find this book's explanation of the concept. The Haskell School of Expression introduces recursion early on, in section 1.4. It is used in examples throughout the book, and if you follow along with these examples, you will most certainly be using it a lot. The introduction seems natural enough to me, but I note that Hudak does not give the reader any extra insight or tricks to help them along. Not to worry, though; recursion is very natural in Haskell and the reader may not even notice that they are doing something a little tricky.

The use of multimedia was a lot of fun for me, and should quickly dispel the myth that IO is difficult in Haskell. For instance, Hudak has the reader drawing fractals by page 44, and throughout the book, the reader will be drawing shapes, playing music, and controlling animated robots.

Any book on Haskell must be appraised for its explanation of monads in general and IO specifically. Monads are a purely functional way to elegantly carry state across several computations (rather than passing state explicitly as a parameter to each function). They are a common stumbling block in learning Haskell, though in my opinion, their difficulty is over-hyped.

Since input and output cause side-effects, they are not purely functional, and don't fit nicely into a function-call and recursion structure. Haskell has therefore evolved a way to deal safely and logically with IO through the use of monads, which encapsulate mutable state. In order to perform IO in Haskell, one must use monads, but not necessarily understand them.
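
That last point is easy to illustrate. The standalone program below (my example, not the book's) does ordinary IO with do-notation, and nothing about it requires knowing what a monad is.

main :: IO ()
main = do
  putStrLn "What is your name?"
  name <- getLine
  putStrLn ("Hello, " ++ name ++ "!")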

Some people find monads confusing; I've even heard a joke that you need a Ph.D. in computer science in order to perform IO in Haskell. This is clearly not true, and this book takes an approach which I whole-heartedly agree with. It gets the reader using monads and IO in chapter 3 without explaining them deeply until chapters 16 (IO) and 18 (monads). By the time you get there, if you have heard that monads are confusing, you might be inclined to say "how is this different from what we've been doing all along?" Overall, I was pleased with the explanation of monads, especially state monads in chapter 18, but I felt that the reader is not given enough exercises where they implement their own monads.
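
For readers who want that missing exercise, here is one possible warm-up of my own devising (not from the book): a small logging monad written from scratch. The Functor and Applicative instances are boilerplate demanded by current GHC; the interesting part is the bind.

import Control.Monad (ap, liftM)

-- A result paired with an accumulated log of messages.
newtype Logger a = Logger (a, [String])

instance Functor Logger where
  fmap = liftM

instance Applicative Logger where
  pure x = Logger (x, [])
  (<*>)  = ap

instance Monad Logger where
  Logger (x, w) >>= f = let Logger (y, w') = f x
                        in  Logger (y, w ++ w')

record :: String -> Logger ()
record msg = Logger ((), [msg])

halve :: Int -> Logger Int
halve n = do record ("halving " ++ show n)
             return (n `div` 2)

main :: IO ()
main = let Logger result = halve 100 >>= halve
       in  print result   -- (25,["halving 100","halving 50"])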

If you're worried that drawing shapes and playing music will not appeal to your mathematical side, you will be pleased by the focus on algebraic reasoning for shapes (section 8.3) and music (section 21.2), and a chapter on proof by induction (chapter 11).

After reading this book you will be prepared to take either of the two paths that Haskell is designed for: You can start writing useful and elegant tools, or you can dig into the fascinating programming language research going on. You will be prepared to approach arrows, a newer addition to Haskell which, like monads, have a deep relationship to category theory. Arrows are used extensively in some of the Yale Haskell group's recent work. You will see a lot of shared concepts between the animation in The Haskell School of Expression and Yale's "Functional Reactive Programming" framework, Yampa. If you like little languages, you'll appreciate how useful Haskell is for embedded domain-specific languages. It may be even more useful now that Template Haskell is in the works. Andrew Cooke described Purely Functional Data Structures as a great second book on functional programming. In my opinion, The Haskell School of Expression is the great first book you're looking for.


You can purchase Learning Functional Programming through Multimedia from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.

  • by Savatte ( 111615 ) on Tuesday March 16, 2004 @01:16PM (#8580328) Homepage Journal
    but chugging mountain dews and microwaving hot pockets are things that have to be learned through experience. Books are always a great source, but the lifestyle behind the teachings of the book can't be taught.
  • by Ben Escoto ( 446292 ) on Tuesday March 16, 2004 @01:17PM (#8580346)
    I've read this book and think this is a good book for learning Haskell (perhaps the best one) and that it explains monads well.

    However, it won't get the reader fully up and running as a productive Haskell programmer, because for that it is basically required that you master GHC's (Glorious/Glasgow Haskell Compiler) standard library. Otherwise you won't know how to use a fast unboxed array, etc. This library is actually intelligently designed, but it is poorly documented, and there are lots of quirks for people who aren't totally familiar with Haskell yet. The best way to learn still seems to be to read the Haskell newsgroup and look at other people's code.

    But Haskell is an extremely interesting language and well worth learning IMHO.
    • Very cool! I've been into functional programming a lot, but have never been able to get my mind fully around monads. All of the documentation available seems to assume that you already know what they are.

      Is there any good stuff on monads for beginners on the web?
      • by jovlinger ( 55075 ) on Tuesday March 16, 2004 @02:05PM (#8580852) Homepage
        Ask again in comp.lang.functional; the boffins there are pretty good at explaining this precise question (a VFAQ).

        For now, think of monads as bathroom fixtures and monad combinators as plumbing.

        You COULD implement variables with state by creating a purely-functional assoc-list and explicitly threading it through your code. The state monad does that for you automatically, by forcing you to write your code as higher order functions, and by linking up the plumbing for you behind the scenes.

        As a much simpler example, consider exception handling, where instead of returning a value of type t, a function that can fail returns the type (Maybe t), indicating either failure or a returned result.
        data Maybe t = Fail | Ok t
        you could write code like so:
        case funone ... of
          Fail -> Fail
          Ok r -> case funtwo .. r .. of
                    Fail -> Fail
                    Ok s  -> ...
        All that fail handling gets old pretty quick. Instead, wouldn't it be nicer to write
        runM (bindM (funone ...)
                    (\r -> bindM (funtwo .. r ..)
                                 (\s -> ...)))
        In Haskell, this can also be written:
        runM (funone ... `bindM` \r ->
              funtwo .. r .. `bindM` \s ->
              ...)
        (or also using do syntax, but I really don't like that)

        bindM is a monad combinator; the (\x -> ...) functions build the monadic values it chains together. runM turns the whole monadic computation back into a normal value.
        unitM :: a -> Maybe a
        bindM :: Maybe a -> (a -> Maybe b) -> Maybe b
        runM :: Maybe a -> a
        these are normal functions, no magic needed (written here using case rather than pattern matching to highlight similarity to above code)
        unitM r = Ok r
        runM m = case m of
                   Fail -> error "oops!"
                   Ok r -> r

        bindM m f = case m of
                      Fail -> Fail
                      Ok r -> f r
        (./'s ecode tag is Borken, I think)

        ... anyway, so what? The cool thing here is that in the case of IO (where instead of doing exception pattern matching, the monad combinators pass a representation of the whole external world), we can prove that the code inside the monads cannot ever directly access the world representation. Thus, the world is single-threaded through the computation. This is good, because then we can skip passing a representation, and just do the IO operations directly on the real world. So we can actually do side-effecting operations in a pure language.

        If this doesn't seem like magic, you are either much too smart, or have missed something.

        • A better book for functional programming is actually one about Python. It's called Text Processing with Python (http://www.amazon.com/exec/obidos/tg/detail/-/0321112547/qid=1079471944/sr=8-1/ref=sr_8_xs_ap_i1_xgl14/102-3354031-2256126?v=glance&s=books&n=507846), and while it only deals with 'text', the author explains that text tends to cover almost everything we do with computers these days, and goes on to explain that Python is kind of schizophrenic in that while it is very object oriented, it is also ver
        • Geez. Here I thought perl was hard to read.
    • by Godeke ( 32895 ) * on Tuesday March 16, 2004 @01:35PM (#8580553)
      Herein lies the rub in getting adoption of any of the more "esoteric" (read: not procedural) languages into the mainstream: libraries that require "understanding" of the functional model. After spending years interviewing programmers I can safely say that most barely remember the functional languages they were taught (if they were taught). Try to then force them to use an "alien" library that works in a functional way, and you might as well ask them to chop their arms off and thresh wheat.

      I sometimes suspect that .NET may be the only hope of getting functional programming adopted by the mainstream. Currently the CLI has limitations that hamstring functional languages, but Microsoft has actually been bothering to try to rectify those problems. If they do, I would *love* to run OCaml or Haskell with the .NET infrastructure to back up the boring routine work. For that matter, *any* major library of functionality accessible to a functional language would help.
      • I haven't really given it much thought, but I've heard claims that functional libraries are designed very differently from imperative libraries and therefore .NET's (which is, let's face it, designed for C#) will not fit perfectly with CLI functional languages. Of course a functional-style wrapper around the .NET libraries could be written, but then we might as well be using the meager native functional libraries we already have.

        Do you have any feeling for how well F# or H# will get along with the .NET API?
          • Playing with F#, the entire library is accessible. Obviously, in some ways using these libraries funnels those sections of the program into a more procedural mindset, but other sections work very well.

          C# and F# interop [microsoft.com] has a link on how to call C# from F#. However, it is interesting that F# uses some of the OCaml library [microsoft.com] *in addition* to the .NET libraries, as obviously OCaml has functionality specifically for manipulating functional structures, which is still valuable for F# programmers.
      • I thought that the CLI security model was inherently incompatible with the sort of low-level stack and closure manipulation you need to get either tail-recursion or laziness.
      • by smittyoneeach ( 243267 ) on Tuesday March 16, 2004 @02:45PM (#8581272) Homepage Journal
        One interesting point on the haskell website was that SQL is almost a functional language.
        I, for one, do all kinds of wild data gymnastics in SQL, and wonder if more could be done to 'get people in the door' with SQL DML statements as a lead-in.
        Always wanted to do more with functional programming, wondering if EMACS Lisp might prove more immediately fruitful...
        • by bringert ( 520653 ) on Tuesday March 16, 2004 @05:00PM (#8582854)

          You might want to have a look at HaskellDB [sourceforge.net] which is a Haskell library for writing statically checked queries using a relational algebra-like syntax. It lets you write things like:

          r <- table time_reports
          u <- table users
          restrict (r!userid .==. u!uid)
          project (last_name << u!last_name # activity << r!activity)
        • by Godeke ( 32895 ) * on Tuesday March 16, 2004 @05:20PM (#8583063)
          SQL is a declarative language, which is probably why it feels functional. Declarative languages allow the user to specify *what* you want, and then the underlying engine determines *how* to get it. SQL in some ways is closer to PROLOG, which is a declarative logic language (in fact, it is trivial to create SQL like queries in PROLOG).

          Functional languages can implement declarative syntaxes easily, but the real defining factor is that functions are "first order" objects, which can be applied, manipulated and passed. Frankly, if SQL had first order functions, many wild data gymnastics would be vastly simpler. I grumble at the lack of code reuse in SQL (and I say that as a big fan of the ease of use of SQL on the whole). For example, in SQL I must repeat my subquery every time I wish to apply it.(1) Calculations must be specified both in the select list and the conditional if they are used in both places, instead of being defined once and the results being available (some SQL dialects have workarounds for this). Recursion is right out (a hallmark of functional languages is heavy use of recursive functions) which makes navigating tree structures a total bear (PSQL has extensions for trees, but not very clean ones).

          Perhaps future developments will bring SQL closer to the true relational model (which has deep roots in set theory), which would make it possible to bring it closer to a true functional language as well. I think SQL would benefit wildly from the ability to define common structures (functions) and yet be able to apply the optimizer to the end result.

          (1) Footnote: T-SQL has "user defined functions", but they impose a nasty overhead because they are not part of the query optimization process.
          • by Anonymous Coward
            Most people think of a tree and implement it in SQL essentially as a one-way linked list, i.e., RecordID, ParentRecordID. These just are a PITA to navigate.

            Others (Joe Celko, for one "SQL For Smarties") have proposed that a L-R list would work better, in that instead of storing the connection points, you are storing the edges as L-R pairs. The only problem is updating the L and R keys on inserts and deletes would require either triggers or using stored procedures to maintain the key list, so this would bog
          • Actually user defined functions are pretty common. PGSQL has them as does mysql.
  • by karmaflux ( 148909 ) on Tuesday March 16, 2004 @01:18PM (#8580349)
    They've got a whole course [willamette.edu] online! FOR FREE!
  • by Anonymous Coward on Tuesday March 16, 2004 @01:19PM (#8580361)
    Haskell is perhaps best known for its use of the bottom operator. When parenthetised, its ASCII representation looks kinda like a...

    Well, I'll let you be the judge: (_|_)
  • by tmoertel ( 38456 ) on Tuesday March 16, 2004 @01:22PM (#8580407) Homepage Journal
    If you're interested, I recently gave a short talk about Haskell for the local Perl Mongers. The slides and notes are available online here: Haskell for Perl Hackers [moertel.com].

    If you want to see some Haskell code, I have some more concrete examples here:

    I have written a lot of little projects in Haskell. You can find some of them in links from my user info page [moertel.com].

    Also, one of the best resources on Haskell is the HaWiki: HaWiki [haskell.org].

    Do give Haskell a try. It is an amazing programming language.

  • I just can't understand why it's not used more. Personally, I feel I was taught Haskell too early: I was taught it in my first weeks at uni, before I even knew C. At the time I really struggled; all I knew about programming was BASIC, and Haskell seemed alien, and I hated it and dismissed it completely until recently. Now that I have much more experience of programming and functional languages, I sometimes wonder why I ever touch C++ or Java. If I'd been taught Haskell in my penultimate or final year of universi
    • It seems strange to introduce functional programming early & then abandon it for imperative programming. The general trend, when introducing FP early, is to carry it through the early part of the curriculum to help stress the importance of doing things right & somewhat separate the theory from 'real world' work (it also puts people on a more even basis and prevents the PHP/VB/HTML kiddies from being able to pretend they know everything coming into the program). Some good arguments can be made towa
    • I just can't understand why it's not used more.

      I'd guess because implementing recursive programs on a CPU isn't very efficient. The usual technique involves pushing the parameters on to the stack on every function call, leading to a stack that's full of data that's identical or almost identical. 'Destructive assignments' (i.e. i++) inside a loop are a much better match for the CPU's architecture.

      • I'd guess because implementing recursive programs on a CPU isn't very efficient. The usual technique involves pushing the parameters on to the stack on every function call, leading to a stack that's full of data that's identical or almost identical. 'Destructive assignments' (i.e. i++) inside a loop are a much better match for the CPU's architecture.

        You should probably consider looking at assembler output from ocamlopt before saying stuff like this. Recursive loops are very efficient there. Although thi
      • Most functional programming language implementations optimize tail-recursion, so it's not a problem. For example, in O'Caml, a non-tail-recursive factorial function looks like this:

        let rec fact n =
          if n <= 1 then 1
          else n * (fact (n - 1));;

        But, as you can see, the answer is kept on the stack, so that particular function can overflow. However, consider this one:

        let fact n =
          let rec fact_aux acc n =
            if n <= 1 then acc
            else fact_aux (acc * n) (n - 1)
          in
          fact_aux 1 n;;

        Since the answer is always k
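
        For comparison, a rough Haskell analogue of that accumulator version (a sketch for comparison, not from the parent post) needs one extra wrinkle: because Haskell is lazy, the accumulator should be forced with a bang pattern, or the "tail call" still piles up unevaluated thunks.

        {-# LANGUAGE BangPatterns #-}

        fact :: Integer -> Integer
        fact n = go 1 n
          where
            go !acc k
              | k <= 1    = acc
              | otherwise = go (acc * k) (k - 1)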

      • I'd guess because implementing recursive programs on a CPU isn't very efficient.

        A naive seat-of-the-pants answer. I assume you're basing this on some old saw you've heard repeated somewhere.
        • A naive seat-of-the-pants answer. I assume you're basing this on some old saw you've heard repeated somewhere

          Nope. It was my experience of implementing a heuristic tree search algorithm for permutation groups in Prolog, back in 1988. The recursive version would run out of stack. I ended up using an iterative solution, as AFAIR tail recursion reduced the stack usage but not enough to stop it from running out.

          I'd be interested to see a way of implementing recursion (purely for my own entertainment) that

    • Ah, I don't really believe in any special 'formula' for learning. You either understand it, compartmentalize it, and use that understanding when you deem it appropriate... or you're missing something. Sure, a certain order can make this process easier, but in the end it shouldn't make a big difference. You're aware of your knowledge and your habits, and that's the first step. The next is adjusting them!

      Using this 'habits' reason just seems to me like an excuse. If it really is worth it, embrace it!
    • The problem is that haskell is lazy.

      yes, this is a real problem. I've spoken w/ some of the implementors, and they really thought that strictness analysis would get them a whole lot more.

      Laziness sucks not so much for speed (though this is indeed an issue), but for interacting with foreign functions, and mostly because it breaks tail-recursion. You don't often notice, because Haskellers tend to use programming idioms which don't rely solely on tail-recursion.

      It also makes predicting the performance of your
    • Speaking for myself: I've read about Haskell several times in the past three or four years and started reading some tutorials. The problem that I have is that the syntax is atrocious; it is worse than Perl!

      Maybe I'll try again after I finish playing around with Ruby ;)

  • by stuffduff ( 681819 ) on Tuesday March 16, 2004 @01:23PM (#8580432) Journal
    When I first started learning new languages I used to rewrite pong in them. It was very easy to 'see' if the code did what it should or didn't. That kind of feedback can really speed up the learning curve. I'm glad to see that the method hasn't been entirely lost.

    Today, if you don't have enough flashy multimedia to attract the user to stay and look at what you have to say, you never even get your foot in the door. Chances are that someone who has taken the time to learn to both use the technology and apply it in a meaningful way probably has something to say.

    With a generation of multimedia oriented programmers available I expect to see a much higher degree of interactivity in many different areas, from things like mouse gestures to multi-dimensional navigation metaphors where we can simultaneously demonstrate our interests and our abilities so that we can arrive at the appropriate 'step' in whatever process we are trying to achieve.

  • Audience (Score:5, Insightful)

    by bobthemuse ( 574400 ) on Tuesday March 16, 2004 @01:25PM (#8580453)
    Since I was already comfortable with recursion when I started this book, it is hard for me to gauge how a reader who has never encountered recursion would find this book's explanation of the concept.

    Who is the target audience for this book? I would assume programmers, of at least moderate experience. It's not like there are thousands of script/VB kiddies jumping over themselves to learn functional languages. Makes me wonder, how many semi-experienced programmers are there out there who aren't comfortable with using/understanding recursive functions?
    • Re:Audience (Score:5, Informative)

      by Ben Escoto ( 446292 ) on Tuesday March 16, 2004 @01:36PM (#8580566)
      Who is the target audience for this book? I would assume programmers, of at least moderate experience.
      Actually, the book is targeted mainly at novices (although experienced programmers who have never seen Haskell will also learn). In fact, the author even mentions high school students. This is from the preface (page xvi):
      All the material in this textbook can be covered in one semester as an advanced undergraduate course. For lower-level courses, including possible use in high school, some mathematics may cause problems, but for bright students I suspect most of the material can still be covered.
      So obviously an intimate knowledge of red-black trees is not required.
  • Here We Again (Score:3, Interesting)

    by myownkidney ( 761203 ) on Tuesday March 16, 2004 @01:26PM (#8580465) Homepage
    I haven't got the exact figures, but I reckon 99% of all code written out there must be written in Imperative (sometimes pseudo OO) languages. There must be SOME reason functional languages are not so popular.

    Functional languages are only good in theory. Sure, you can easily write programs in them, but they abstract over how the program is executed. And the programs are going to be executed in an imperative manner; machine code is imperative, remember?

    Thus, there's a MASSIVE performance loss when a functional programming language is executed on any of the existing processors. Because the compilers can't think and optimise the code to best fit the imperative model. Whereas human beings can. That's why we should stick to imperative programming languages.

    The day someone actually invents a function processor, we could start promoting these fringe languages. Till then, let's keep Haskell as part of CS811

    Thank you for listening. That's the end of my rant.

    • Counter-Rant (Score:5, Informative)

      by Vagary ( 21383 ) <jawarren AT gmail DOT com> on Tuesday March 16, 2004 @01:36PM (#8580568) Journal
      OCaml, which is not purely functional but still closely related to Haskell, is nearly as fast as C. Haskell somewhat acts as a testbed for ideas in the ML language family, and future versions of OCaml are expected to include many features that were first implemented in Haskell. I'd also suggest that Haskell is a good introductory language for future OCaml programmers as it ensures they won't just try and use OCaml like a weird imperative language.

      OTOH, it is theoretically possible to automatically multithread purely functional programs, especially if they're lazy like Haskell. So it could end up being a very important language on multi-processor and distributed systems.

      Finally, Haskell has an excellent foreign function interface for when you need C-like performance and control.
      • Funny, that's exactly what teachers were saying about ADA in my time... ;)

        • In retrospect, don't most people admit that Ada was a really good language for its time? And you're right to be skeptical, but it's not impossible that I'm right. :)
          • You're right, ADA was probably one of the greatest languages of its time (and still is ;)

            Constructs like "task A requires B" are quite powerful indeed... too bad the only platforms this language is used for are missiles ;)

    • There are, I think, two main reasons procedural languages are more popular.

      a) Inertia. Till quite recently much programming was done in assembler, and procedural languages used to map much more efficiently to assembler code. I would argue that as processors become more complex this is actually becoming increasingly less true.

      b) It's much easier to hack out some program in procedural language and then hack bits of it randomly until it kind of works. Programming in functional languages requires more thought.

      Al
      • Does Haskell or OCaml have supported production-grade ports to Palm OS or other small/embedded platforms?
      • Re:Here We Again (Score:2, Informative)

        by e aubin ( 121373 )

        b) It's much easier to hack out some program in procedural language and then hack bits of it randomly until it kind of works. Programming in functional languages requires more thought.

        This is nonsense. Functional programming almost always requires less thought because:

        They are garbage collected

        Programs tend to be much shorter

        There are fewer assignments

        Type systems catch a huge number of stupid mistakes

        Functional programming only feels harder because you are forced to correct your errors

      • For example in C, imagine we have the following function:

        void foo(int& i) {
            if (i == 1) g();
            if (i == 2) h();
        }

        That's C++, not C. You cannot pass by reference in C. Plus we don't have to assume that the call to g() may have changed the value of i. The fact that you're passing it to foo() leads me to believe that it's not declared globally, outside of foo(), g() and h().

        Did you mean to pass i to g()? Also, what's your point? I don't get it.

    • Re:Here We Again (Score:2, Informative)

      by Tellarin ( 444097 )

      You forget at least two things,

      Functional languages are indeed used in production environments like Erlang from Ericsson for instance.

      And there used to be Lisp machines.

      So there are languages used in the "real" world and there "is/was" hardware available.
    • Re:Here We Again (Score:5, Interesting)

      by Waffle Iron ( 339739 ) on Tuesday March 16, 2004 @01:55PM (#8580747)
      I think that this choice of approaches to programming is similar to the choice in electrical engineering between solving problems in the time domain vs. frequency domain.

      To me, functional programming is similar to the frequency domain. There are certain problems that are almost impossible to solve in the time domain that are trivial to solve in the frequency domain. However, the frequency domain is harder to understand, and the real universe actually operates in the time domain. Moreover, some problems that are trivial in the time domain blow up when analyzed in the frequency domain.

      There are few if any EEs who would advocate discarding all time domain calculations in favor of the alternative. That also applies to tools; few people would throw away their oscilloscopes just because they have a spectrum analyzer available.

      That's what bothers me about "pure" languages of any form. You're intentionally throwing away some of the available tools to prove a point.

      Sometimes a functional approach can provide an extremely powerful way to solve a problem with a tiny amount of code. However, sometimes another part of the same program would be better done in a more mundane fashion. The functional style's tendency to make you think about every problem "inside out" and to make you solve every problem in a "clever" way can get to be grating. I like to keep the option to use each style as needed, so I prefer languages that support features from a variety of programming styles.

      • by tmoertel ( 38456 ) on Tuesday March 16, 2004 @03:31PM (#8581752) Homepage Journal
        Waffle Iron wrote:
        That's what bothers me about "pure" languages of any form. You're intentionally throwing away some of the available tools to prove a point.
        Well, Haskell isn't throwing anything out. You can do both purely functional and imperative programming with Haskell. It's just that Haskell's designers went deep enough into the theory to come up with an elegant way to bring the functional and imperative worlds together (category theory and monads). So you need not give up the benefits of one to have the other.

        Take a look at this one-page TCP port scanner that I wrote in Haskell. [moertel.com] Imperative and functional styles mixed together, with neither sacrificing for the other.

        To use your time- and frequency-domain metaphor, Haskell is the well-educated EE who can use both kinds of analysis -- and slide between the two with ease.
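
        A small illustration of that mixing (a sketch of mine, not the port scanner linked above): the loop below is imperative, driven by a mutable IORef, while the final check is an ordinary pure expression.

        import Data.IORef

        main :: IO ()
        main = do
          total <- newIORef (0 :: Int)                     -- imperative: a mutable cell
          mapM_ (\x -> modifyIORef total (+ x)) [1 .. 10]  -- imperative: a loop of updates
          n <- readIORef total
          putStrLn ("imperative sum: " ++ show n)
          putStrLn ("functional sum: " ++ show (sum [1 .. 10]))  -- pure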

      • Yes, and maybe this is a reason for looking at LISP and its dialects?
        See e.g. Scheme.

        It supports and *encourages* functional programming but also has "set!" operators, is almost "syntax-free", and by using macros you can adapt Scheme to whatever "programming paradigm" will pop up in the next years.
    • I haven't got the exact figures, but I reckon 99% of all code written out there must be written in Imperative (sometimes pseudo OO) languages.

      There are lots of applications implemented in Excel spreadsheets. The stuff some people (who consider themselves non-programmers) do with Excel (or any other spreadsheet) is just amazing.

      Excel is a bit limited (especially if you don't use VBA), but it's certainly some form of functional, purely applicative programming.
    • Re:Here We Again (Score:5, Insightful)

      by Ed Avis ( 5917 ) <ed@membled.com> on Tuesday March 16, 2004 @01:58PM (#8580789) Homepage
      Thus, there's a MASSIVE performance loss when a functional programming language is executed on any of the existing processors.
      There's also a massive performance loss with many imperative languages when compared to C. Impure functional languages like OCaml or Scheme can give programs that run about half as fast as C when using a decent compiler. This compares favourably to Java or Perl (among others).
      The day someone actually invents a function processor, we could start promoting these fringe langauges.
      There have been Lisp Machines - okay, Lisp isn't a purely functional language but it is high-level and some of the same arguments apply. However, your point of view is a bit odd. There should be a processor with features mapping directly to language constructs before you can start using that language? People have tried this in the 1980s and earlier, with processors optimized for running Modula-2 or similar, but on the whole it turned out to be better to make general-purpose CPUs and write compilers for them. It's not as if current CPUs are a particularly good match for C's abstractions; a language with built-in lightweight parallelism could work well with hyperthreading, for example. In any case, even if the language and CPU are horribly mismatched and everything runs a hundred times slower, that could easily be fast enough for the application. CPUs are getting cheaper but programmers, on the whole, are not.

      Do you also discourage the use of Perl, or shell scripts, or Tcl, or Java? Or is it just functional languages that you don't like because they do not map to existing processors?

      • There are a few possibilities: Nemerle is a functional version of C#. Felix is a functional version of C/C++. Scala is a functional version of Java, with a .Net version coming. Schlep translates Scheme to readable C. Cyclone is a safer C with some functional features.
    • Re:Here We Again (Score:5, Insightful)

      by talleyrand ( 318969 ) on Tuesday March 16, 2004 @02:05PM (#8580859) Homepage
      I haven't got the exact figures, but I reckon 99% of all code written out there must be written in Imperative (sometimes pseudo OO) languages. There must be SOME reason functional languages are not so popular.
      Hmmmmm
      SELECT *
      FROM PROGRAMMING_TYPES PT
      WHERE PT.name NOT IN ('functional', 'imperative')

      >>> DECLARATIVE
      (1 row(s) affected)
      Yeah, I guess there's only two types of programming languages out there and clearly the previous code is only used in research/academic environments.
      • Well, if you want to get into it, functional languages are a subset of declarative languages. If you take Prolog and write a program where you only define one variant of each predicate (or use lots of cuts as the first term of a predicate) you get semantics strikingly similar to something like SML (without the typing).
    • Re:Here We Again (Score:5, Informative)

      by Ben Escoto ( 446292 ) on Tuesday March 16, 2004 @02:05PM (#8580861)
      Thus, there's a MASSIVE performance loss when a functional programming language is executed on any of the existing processors. Because the compilers can't think and optimise the code to best fit the imperative model. Whereas human beings can. That's why we should stick to imperative programming languages.
      You are exaggerating the performance penalty. See for instance the old version of The Great Computer Language Shootout [bagley.org] where Haskell is ranked faster than Java. Of course those benchmarks don't tell the whole story and should be taken with a grain of salt. In my experience though, Haskell is only about 4 times slower than C, compared to, for instance, Python, which is about 20 times slower (I am a big python fan, so this isn't a criticism of Python).

      Just as plenty of people are willing to put up with Python's slowness in exchange for better debugging, faster development, dynamic typing, etc., I think plenty of people would benefit by moving from C to Haskell, which is purely functional, has a great type inferencing system, never seg faults, etc.

      One final note is that Haskell programs can often be optimized in Haskell itself and approach C speeds. This is because Haskell is compiled and statically typed and can deal with unboxed elements and so forth. This is a big difference from Python and other dynamically typed languages where optimization sometimes must be done in C for best results.
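
      As a hedged sketch of what "optimizing in Haskell itself" can look like (my example; as a reply below notes, knowing where such annotations help is the hard part), bang patterns force the accumulators so the compiler can keep them unboxed in the loop:

      {-# LANGUAGE BangPatterns #-}

      -- A strict, accumulating mean; empty input gives NaN, which is fine for a sketch.
      meanStrict :: [Double] -> Double
      meanStrict = go 0 0
        where
          go !s !n []       = s / fromIntegral (n :: Int)
          go !s !n (x : xs) = go (s + x) (n + 1) xs
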
      • Re:Here We Again (Score:4, Insightful)

        by jovlinger ( 55075 ) on Tuesday March 16, 2004 @02:55PM (#8581367) Homepage
        Optimizing Haskell by choice unboxings and strictness annotations is against the whole point of the language (*). More importantly, it is close to impossible for anyone but the compiler-writer to get right.

        Predicting how a lazy program will perform is hard, and figuring out where it hurts is even harder. This is in part due to the massive restructurings the compiler does. One small annotation may be sufficient for the compiler to infer a function's strictness. Knowing where to put the annotation, tho, is nigh guesswork. Then I refactor the function, and there goes my strictness again.

        But, this _is_ preferable to writing in C, I'll agree with you there.

        (*) However, I think the worker-wrapper transformation may be the most beautiful optimization I've ever seen.
      • Re:Here We Again (Score:5, Insightful)

        by dasmegabyte ( 267018 ) <das@OHNOWHATSTHISdasmegabyte.org> on Tuesday March 16, 2004 @03:43PM (#8581890) Homepage Journal
        Another thing to remember is that while processor speeds double fairly often, programmer speeds are a constant given a distinct set of tools. If picking a language that executes slower allows a programmer to write more software in a given period of time, then that language is a superior choice for all but the most time-sensitive applications.

        In other words, I don't want to waste time fucking around with pointers when I could be working on something more pertinent to the task at hand. I don't care HOW my matrix gets sorted. I just care that it does. If I waste some cycles, so what? This computer performs 53 MILLION in between each monitor refresh. If writing in a more abstract language permits me to get twice as much done per day of programming (and in my experience, it's more like 5-10 times), I'm willing to wait.

        Besides, while computers may not be able to "think," code optimization is not reasoning as much as it is pattern based. In fact, modern refactoring tools are better optimizers than most programmers, because they know more of these patterns. And unlike human programmers, refactoring tools aren't tempted to glaze over modern day essentials like bounds or type checking, resulting in fewer bugs and better security.
    • Re:Here We Again (Score:2, Insightful)

      by Anonymous Coward
      Sorry, but here you are completely mistaken (and I am a bit amazed how this post got ranked so high). By using a reasonably advanced compiler for a functional language, and knowing how to write good functional code (a highly nontrivial issue), one can typically get 50% - 150% of the performance of GNU GCC. Just look at the figures in the "Great Computer Language Shootout".

      Yes, it is easy to write slow code in functional languages. Especially for beginners who do not know or care about the fine print. But tho
    • Troll?

      Anyways, OCaml is quite performant; when people compare its performance to that of C, they aren't really stretching the truth.

      Compiling strict languages like scheme and ML to machine code is possible. Doing it well is harder, but several very good compilers exist.

      Perhaps you meant _lazy_ functional languages? There are some bigger performance problems in that world, true; see a post above.

      Lastly, is performance the only benchmark we use for selecting languages? Is it even the most important one?
    • Re:Here We Again (Score:3, Insightful)

      by dubious9 ( 580994 )
      "Thus, there's a MASSIVE performance loss when a functional programming language is executed on any of the existing processors. Because the compilers can't think and optimise the code to best fit the imperative model. Where as the human being s can. That's why we should stick to imperative programming languages."

      If this discussion was 25 years ago it would have been "OO languages? They are only good it theory. Sure you can easily write programs in them, but they abstract over how the program is executed.
    • I haven't got the exact figures, but I reckon 95% of all operating systems out there must be a version of Windows. There must be SOME reason alternative OSes are not so popular.

      Alternative OSes are only good in theory. Sure, you can easily get a basic install up and running, but they depend too much on arcane instructions. The OS will ultimately be used by a human; humans are visual, remember?

      Thus, there's a MASSIVE usability loss when an alternative OS is used by any normal person. Because Windows is bes
    • I haven't got the exact figures, but I reckon 99% of all code written out there must be written in Imperative (sometimes pseudo OO) languages. There must be SOME reason functional languages are not so popular.

      This was also true of procedural languages versus assembly prior to the mid '70's, true of COBOL versus other languages prior to the early '80's, true of C versus OO languages prior to the mid '90's...

      It's not just for mutual funds that the past is a poor predictor of the future.

      Functional la

    • Totally false (Score:5, Informative)

      by Tom7 ( 102298 ) on Tuesday March 16, 2004 @04:59PM (#8582837) Homepage Journal
      I guess this is a troll, but I can't resist:

      Functional languages are only good in theory.

      This is totally not true. I build real programs that do real stuff in SML every day.

      Thus, there's a MASSIVE performance loss when a functional programming language is executed on any of the existing processors.

      This is also completely false. Optimizing high-level languages is often easier, because there is more semantic information to exploit (types, higher-order code). My SML programs typically run about 20% slower than a C counterpart, while being much shorter, more frequently correct, and more secure.

    • It's easy to write fast computer programs.
      while(1){} will run extremely fast. Not a lot of use however.

      However what is difficult is to write large side effect free computer systems.

      Functional languages will do that for you efficiently. Don't get hung up on speed; it is not always that important. If it is, write the bits that need it in C and assembler, but do the bits that don't in a higher-level language.

      The day someone actually invents a function processor, we could start promoting these fringe languages
  • Way back when in college, the most interesting thing was that the program couldn't do I/O during the execution, only as an exit value. That makes useful daily programs difficult to write in a 'purely functional' language. The review talks about monads being a solution, but I can't see that putting something on the screen, or worse on a printer, is something that can be undone. Therefore, I/O must be a side-effect, so how can a real 'purely functional' language like Haskell do I/O?
    • It Works Like This: (Score:3, Informative)

      by Vagary ( 21383 )
      Short answer: IO is an exit value, just like you said.

      Long answer:
      Monads are a pattern for hiding a state that gets passed from a sequence of functions. For example, when you assign to a variable in an imperative language, the value of that variable in the implicit state is updated and all future phrases accessing that variable will get the new value. If you're using a Haskell state monad it works the same way, but you need to explicitly specify which phrases can be executed in the future (using sequencing
    • Way back when in college, the most interesting thing was that the program couldn't do I/O during the execution, only as an exit value. That makes useful daily programs difficult to write in a 'purely functional' language. The review talks about monads being a solution, but I can't see that putting something on the screen, or worse on a printer, is something that can be undone. Therefore, I/O must be a side-effect, so how can a real 'purely functional' language like Haskell do I/O?

      Good question. There have be

    • Monads are like continuations, if you've ever programmed in that style with C.

      Functions are passed their usual arguments, plus an extra parameter which is the thing that should be called after this function has done its work. This function parameter can have a lot of extra stuff inside it which represents the 'state' of the system.

      It's rather like chopping your functional program into two parts: you have something a bit like an imperative programming language (but which is still pure functional) for th

      • Monads are like continuations, if you've ever programmed in that style with C.

        Functions are passed their usual arguments, plus an extra parameter which is the thing that should be called after this function has done its work. This function parameter can have a lot of extra stuff inside it which represents the 'state' of the system.

        You are mistaken, or at least explaining things in a confusing way. Continuation passing style and monads aren't related. In Haskell you can use monads without using continu

    • > Therefore, I/O must be a side-effect, so how can a real 'purely functional' language like haskel do I/O?

      I/O is not a "side" effect, it is a specified effect, not something that happens "along with" some other value being computed. It is however not deterministic, so it's contained within the I/O monad. Nondeterminism is one of the things monads are good for.

      Basically any time you need something done in a sequence, whether it's executing "commands", reading input, producing output, or just producing
    • see earlier posts about monads.

      Apart from monads, IO can be done either by continuations: (read (\x->E) as an anonymous function taking argument x, with body E)

      (openfile "filename" (\openfileresult ->
      ... process filename contents ...
      (writefile "filename" somevalue (\writeresult ->
      if (resultOK writeresult) (print "done" (\printres ->
      ....

      This works because each IO function takes an additional argument -- what to do next. That way, there is no way to 'rewind' the computation.

      Another way

  • by Ben Escoto ( 446292 ) on Tuesday March 16, 2004 @01:30PM (#8580501)
    If you want to learn Haskell here are my suggestions in order:

    1. Why functional programming matters [chalmers.se] by John Hughes. An oldie but goodie, this can get you motivated to actually learn the language.
    2. Hal Daume's Haskell tutorial [isi.edu] is very complete, free and much better than the "Gentle Haskell Introduction" which isn't very gentle at all.
    3. The Haskell Language definition [haskell.org] is the official language description.
    4. GHC's library reference [haskell.org], which you will use constantly on anything non-trivial.
    5. The foreign language interface manual [unsw.edu.au]. Since Haskell has a small library you will probably need to call functions written in C a lot to get anything done. Luckily, Haskell's foreign function interface is quite nice.
  • New vs. Old (Score:3, Interesting)

    by moberry ( 756963 ) on Tuesday March 16, 2004 @01:30PM (#8580503)
    I think that there is a problem with newer programmers going into a language such as, say, Java or C#. When I learned programming I learned C++ in MS-DOS. We wrote our own string classes, used pointers, and learned ADTs like linked lists and binary trees. Nowadays in Java you write a program and use everyone else's stuff; there is a linked list class in Microsoft's .NET framework. Nothing is ever written from scratch anymore. IMHO you can't learn actual programming without getting into the nitty gritty yourself.
    • Re:New vs. Old (Score:5, Insightful)

      by Abcd1234 ( 188840 ) on Tuesday March 16, 2004 @01:47PM (#8580656) Homepage
      Meh. A decent computing science course will have an extensive class in data structures, usually implemented in C, in which you'll likely cover at least linked lists and trees, followed by a second, more extensive course on general algorithms, in which you'll cover heaps and other more advanced data structures. If it doesn't, it ain't worth enrolling in. *shrug*

      Of course, after you've learned *how* a linked list is implemented, you should never have to roll your own. And if you do find yourself rolling your own, you should seriously question *why* before continuing, as there are many high quality, well tested implementations already floating around (for example, glib).
      • In my program, the algorithms class was at the start of second year, so many students were still not comfortable with pointers. As a result, most people spent so much time struggling with the implementations that they didn't get a chance to sit back and appreciate the structures.

        In Haskell, structures like linked lists and binary trees are implemented with a single line of code. Yes, you don't see the actual memory addresses being used, but these are also barriers to understanding.

        I'd advocate algorithm c
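
        For what it's worth, the one-liners mentioned above look roughly like this (a generic sketch, not actual course code):

        -- A singly linked list and a binary tree, each defined in one line.
        data List a = Nil | Cons a (List a)
        data Tree a = Leaf | Node (Tree a) a (Tree a)
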
    • Re:New vs. Old (Score:3, Insightful)

      by ThosLives ( 686517 )
      Ah, I would say that "writing a program" by putting together other people's building blocks is not programming but code assembly. I would actually say that most "programmers" out there really don't know how to program, and that's why we have lots of the issues present today. It does take a clarification on how literal you want to be with the verb "to program" because people who stack up components are programming in the strictest sense. However, I fear that few and far between are the folks who can code a f
      • Re:New vs. Old (Score:4, Insightful)

        by pHDNgell ( 410691 ) on Tuesday March 16, 2004 @02:01PM (#8580820)
        Ah, I would say that "writing a program" by putting together other people's building blocks is not programming but code assembly. I would actually say that most "programmers" out there really don't know how to program, and that's why we have lots of the issues present today.

        Um, no. Writing a program is always assembling building blocks unless you always start by writing an assembler for your target hardware.

        The good programmers are the ones who assemble the correct building blocks the right way. The people who reinvent the linked list for every project are the ones who cause us the most problems (and yes, I've reinvented many linked lists in the past).

        Once you break free from the mentality that you must always make your own malloc(), printf(), hashtables, trees, linked lists, etc... you can move on to higher level issues like the actual application you're working on.
    • This is crap.

      First off, what do you consider "real programming?" I always felt it was the ability to take an input, process it, and create an output. You don't need to understand ANYTHING about data structures to do that.

      Or is it not "real programming" unless you avoid all of the convenient shortcuts which permit programmers to do things that are more complex than the addition and transformation of bytes?

      You don't need to understand the cellular biology behind tree growth to be a carpenter. I'm sure i
  • Languages (Score:3, Interesting)

    by benjiboo ( 640195 ) on Tuesday March 16, 2004 @01:30PM (#8580504)
    Do people think it's a good thing for a C++/Java/.NET programmer to go back to the drawing board for a few months and learn stuff like functional programming? I thought about coming up with a syllabus for myself of C, Haskell, LISP and Perl (which just evades me....)
    • Re:Languages (Score:5, Insightful)

      by pHDNgell ( 410691 ) on Tuesday March 16, 2004 @02:31PM (#8581104)
      Do people think it's a good thing for a C++/Java/.NET programmer to go back to the drawing board for a few months and learn stuff like functional programming?

      Absolutely. It's a tragedy to go through life as a programmer without knowing FP. The more you learn about programming in general, the better you will be at programming.

      I thought about coming up with a syllabus for myself of C, Haskell, LISP and Perl

      I always recommend against perl. Very few people understand perl (no matter what they tell you), and I've yet to see a significant perl program without several significant bugs because of lack of proper exception handling or ambiguity (stuff like if($var) where $var can be false on what would otherwise be considered valid input).

      I'd definitely recommend python instead if you want to learn some scripting (maybe ruby if you like stuff that looks like perl, but has a more reasonable philosophy).

      C, like any other assembler, is OK to learn, but shouldn't be used for much of anything except to write extensions for high-level languages.

      Haskell is good. OCaml is good. Scheme is a good lisp derivative that's small enough to learn pretty easily.

      You might want to add smalltalk and/or objective C in there. Smalltalk is pure OO (the OO version of Haskell, if you will). Objective C is C with smalltalk-style OO. When combined with the NeXTSTEP frameworks, you can learn a lot of very useful patterns.

      A big part of functional programming is programming without side effects. Learning to program without side effects can greatly help you create more stable applications.

    • You might want to check out Scala [scala.epfl.ch]. It is a combination of OO and functional that compiles to Java byte code (and supposedly will support .Net compilation in a few weeks). It also lets you interact with existing Java/.Net code.
    • Book recommendations (Score:3, Informative)

      by dsplat ( 73054 )
      I thought about coming up with a syllabus for myself of C, Haskell, LISP and Perl (which just evades me....)

      I'd like to strongly recommend some books. The first is Modern C++ Design [moderncppdesign.com] by Andrei Alexandrescu. The second is On Lisp [paulgraham.com] by Paul Graham. In conjunction with that, you will need an introductory text on Lisp if you don't already know it and a good book on C++ templates. While I don't know what the best Lisp text currently in print is, I'd be willing to give Graham's ANSI Common Lisp [paulgraham.com] a try on the st
  • by Anonymous Coward on Tuesday March 16, 2004 @01:48PM (#8580674)
    If you liked

    "Learning Functional Programming through Multimedia"

    be sure to check out our new title:

    "Learning Esperanto through Yoga"
  • by cheezit ( 133765 ) on Tuesday March 16, 2004 @02:34PM (#8581129) Homepage
    This statement, from the 'more about Haskell' link:

    "Furthermore, malloc is fairly expensive, so programmers often malloc a single large chunk of store, and then allocate "by hand" out of this."

    I've seen this type of statement elsewhere in defense of non-C languages. And yet I've very rarely seen this done in code that wasn't either in 1) an embedded system or 2) a device driver or kernel module.

    In those cases where I have seen this in application code, it has been accompanied by lots of other newbie gaffes. I'd question the sanity of anyone who thinks that a user-level app will benefit from a hand-coded heap manager.

    But perhaps there are exceptions...does anyone actually do this routinely?
    • I wonder if partially filled arrays don't fall into this category.

      Bit of a stretch, tho, I agree.
    • I think this practice goes beyond embedded systems, device drivers, and kernel modules. You'll find this sort of thing quite routinely in data structures (or classes, in the C++ case) implemented in libraries. I.e. some data structure, say a hash table, will heap allocate a big chunk, and use it up bit by bit - even when the structure itself could easily support incremental allocation. There are good reasons to do this sort of thing, having to do with allocation cost, heap fragmentation, and locality of
  • by Lulu of the Lotus-Ea ( 3441 ) <mertz@gnosis.cx> on Tuesday March 16, 2004 @04:31PM (#8582503) Homepage

    Lots of other comments, as well as the story, have pointed to a number of good tutorials and introductions. I'd like to recommend also the one I wrote for IBM developerWorks. I believe my tutorial is a bit better for real beginners to FP than are most of the others out there.

    Anyway, you can find it at IBM dW(free registration required) [ibm.com] or at my own Gnosis Software site [gnosis.cx].

  • by Paul Crowley ( 837 ) on Tuesday March 16, 2004 @05:13PM (#8582990) Homepage Journal
    Here's a simple C function which finds the inverse of a permutation, in linear time.

    void invert_permutation(int in[], int out[], int len)
    {
        int i;

        for (i = 0; i < len; i++)
            out[in[i]] = i;
    }

    Write the same function in Haskell, in linear time. Use whatever representation of a list seems natural for you.

    I set this very simple problem to every pure functional programming enthusiast I meet, including several professors at a University known for that sort of thing. I've yet to hear a good answer...
    • by Anonymous Coward
      Trivial, even for a Haskell beginner like me. We'll use arrays, just as in the C example:

      invert :: (Ix i, Enum i) => Array i i -> Array i i
      invert ins = array (a, b) (zip (elems ins) [a .. b])
        where (a, b) = bounds ins

      Builds a new array with the same dimensions (indices from a..b), initializes it with a list of tuples that gets created using "zip" which takes the elements of the input array as the indices of the output array. Note: the signature in the first line is optional.
      Nice, isn't it?
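
      As a quick sanity check (assuming the snippet above plus an import of Data.Array):

      invert (listArray (0, 2) [2, 0, 1]) == listArray (0, 2) [1, 2, 0]   -- True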

      • That you prefer this code to the C alternative, in terms of readability, maintainability, error-proneness, etc?

        Sure, I don't like using C for general-purpose programming as much as the next HLL programmer, but FP languages are simply mind-boggling, and no - that's not a good thing.
        • I don't think the OP's solution was the most transparent, but that's not Haskell's fault.

          It's simply wrong to assume that because you see code you don't understand when you're looking at an advanced language that you don't know, that those languages "are simply mind-boggling, and no - that's not a good thing".

          What those languages are, are different than what you're used to. There's a reason for that - what you're used to is largely an ad-hoc collection of crap that evolved before people understood what

    • by Anonymous Coward
      You're kidding, right? It depends, of course, on your representation of a permutation; let's take [(Int, Int)] as our permutation type. So the permutation on three elements that reverses it is [(1,3),(2,2),(3,1)].

      invert :: [(Int,Int)] -> [(Int,Int)]
      invert = map (\(a,b) -> (b,a))

      Now, if you want to use [Int] as a representation, so that the example permutation is [3,2,1], then you need to first add the indices (linear time), reducing it to the previous problem, and then extract the corresponding lis

    • > Write the same function in Haskell, in linear time

      Using unboxed types I could probably transliterate that code, but I'm kind of rusty in Haskell. I just find it more interesting that I could probably crash your process by passing -1 for len.
