Failed revolution? Renaissance of FP, 3 yrs later

Roughly 2-3 years ago, we all experienced the outburst of popularity of the so-called New Wave of Functional Programming languages. As everyone was rediscovering the FP paradigm ("re-" is important here, as there were actually no new concepts, just a revisit of something well known from the past), the hype was skyrocketing, new languages were being created almost on a weekly basis & the overall enthusiasm about how FP would change the industry was astonishing.

Today, the peak of the hype seems to be over - the initial excitement has passed, the new products have had some time to mature & prove the difference they were supposed to make, so we should be able to draw at least a short summary:

  • what has really changed (thanks to the FP renaissance)?
  • did FP truly make it (for good) into the mainstream? If so, what kind of imprint did it leave?
  • have the initial promises been kept?

Here's what I think about it. I'd appreciate it if you let me know (in the comments) whether you agree or not.

New languages didn't take the industry by storm, but ...

... current big players have benefitted from the lesson

None of the new FP languages truly conquered the mainstream. The one that seemed closest to achieving that was Scala, but this was mainly due to the stasis in the development of Java (as a language) -> Scala seemed a good fit for people who ... wanted to write Java outside of Java. Things changed after Java 8 introduced some elements of FP (lambdas) & (what is even more important) a credible & promising roadmap for the future. Java is not the only case of an OO language adopting FP features - actually, many languages have gained immutable data types (even JavaScript), futures, promises, folds, flatMaps & other selected elements of FP - usually in the form of libraries, not language-level constructs.
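To illustrate what those "elements of FP" look like in a mainstream OO language, here's a minimal sketch of Java 8 lambdas & stream folds - just `java.util` & `java.util.stream`, nothing exotic:

```java
import java.util.Arrays;
import java.util.List;

public class Java8Fp {
    public static void main(String[] args) {
        List<Integer> nums = Arrays.asList(1, 2, 3, 4, 5);

        // lambdas passed to higher-order methods - no anonymous class boilerplate
        int sumOfOddSquares = nums.stream()
                .map(n -> n * n)          // transform each element
                .filter(n -> n % 2 == 1)  // keep the odd squares: 1, 9, 25
                .reduce(0, Integer::sum); // a fold, in FP terms

        System.out.println(sumOfOddSquares); // prints 35
    }
}
```

Note how this reads almost like the equivalent Scala pipeline, yet it shipped as a regular feature of a thoroughly mainstream OO language.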

Apart from Scala, the other popular FP languages were either too eccentric (Clojure), too far behind their direct OO competitors (F#), or just didn't have much to offer except the language itself - e.g. lacking a platform ecosystem (Haskell, OCaml).

It appears that the languages that seem the most promising today (of the new contenders, ofc) either are not FP at all (the safest bet for the future IMHO - Go) or don't bet on their FP descent (like Rust or Elixir). Being FP is not something worth bragging about anymore? Who would have thought ...

People appreciate FP paradigm, but ...

... don't care about theory (monads, monoids, lambda calculus in general)

Communities have quickly embraced the FP goodness - the increased testability was appreciated, developers really liked the fact that it's easier to scale FP code out, etc. BUT, based on what I've experienced & heard from people around me, devs are far more eager to utilize the FP goodness when ... they don't have to get into lambda calculus at all.

Yes, even the FP aficionados I've spoken to were not able to tell the difference between a monad & a monoid; barely anyone understood the value of algebraic type systems or was able to explain what referential transparency is. The devs' approach is very practical - they are able to recognize a pure function or compose higher-order functions using fluent syntax once they've gone through some examples. And they don't need advanced maths to put these to use - during the last 24 months I've spent a lot of time with Elixir & its dev community - these people do shitloads of FP on a daily basis without even mentioning stuff like endofunctors, isomorphisms or category theory.
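That practical, theory-free flavour of FP really is just a few lines of code. A minimal sketch (in Java here, with illustrative names) of what devs actually do day to day - pure functions & fluent higher-order composition, no category theory attached:

```java
import java.util.function.Function;

public class Compose {
    public static void main(String[] args) {
        // two pure functions: same input always yields the same output, no side effects
        Function<Integer, Integer> addTen = x -> x + 10;
        Function<Integer, Integer> doubleIt = x -> x * 2;

        // higher-order composition via fluent syntax - what devs pick up from examples
        Function<Integer, Integer> pipeline = addTen.andThen(doubleIt);

        System.out.println(pipeline.apply(5)); // (5 + 10) * 2 = 30
    }
}
```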

Language itself is not enough

Even the most carefully crafted, mathematically perfect language with the most sugared syntax is useless by itself. Peeps are not stupid - they've learned (from their own mistakes, I presume) the importance of mature tooling, wide community support & a strong platform with a large ecosystem of existing libraries. Being forced to write even the most mundane stuff (HTTP connectivity, JSON serialization - that kind of stuff) from scratch is very far from cool.

Hence, e.g., all the Haskell trolling ("the best programming language never used in production") or the popularity of Elixir (due to the BEAM/OTP platform & its thrilling capabilities).

Simplicity is winning (again)

At some point I was totally delighted & 100% in love with Scala. It looked "almost too good" - the whole JVM ecosystem available at my fingertips, OO + FP mixed together, an unprecedented dose of very appealing syntactic sugar, Akka as a "platform seller" - what more could I ask for?

The unconditional love lasted for about a year - the more I used Scala, the more I saw how broken it is (& gradually becomes) - wicked sick syntax elements (like implicits), overdosed metaprogramming (try scalaz ...), too many different ways to do one simple thing - for all these reasons Scala became the Perl of the 2010s - it's barely readable, hard to troubleshoot & ... over-concise ("assumed" boilerplate, countless incoherent conventions, etc.). These qualities may be cool for challenging logic puzzles, but not for a programming language.

Want examples? These are all valid pieces of Scala code ...

(0/:l)(_+_)  // a fold-left over the list l, summing its elements
// ...
// a Category instance for Kleisli arrows (from scalaz):
implicit def KleisliCategory[M[_]: Monad]: Category[({type λ[α, β]=Kleisli[M, α, β]})#λ] = new Category[({type λ[α, β]=Kleisli[M, α, β]})#λ] {
  def id[A] = ☆(_ η)
  def compose[X, Y, Z](f: Kleisli[M, Y, Z], g: Kleisli[M, X, Y]) = f <=< g
}
// ...
// orElse on scalaz's disjunction type \/ (an Either analogue):
def orElse[AA >: A, BB >: B](x: => AA \/ BB): AA \/ BB =
  this match {
    case -\/(_) => x
    case \/-(_) => this
  }
// ...
def |||[AA >: A, BB >: B](x: => AA \/ BB): AA \/ BB = orElse(x)  // a symbolic alias for orElse

Now try to confront the twisted monster Scala has become with the simplicity of Go. Go is ... boring. I mean - really boring (hence I prefer Elixir), but on the other hand it's EXTREMELY effective & potent while staying very clear, readable & ... natural in expressing the programmer's intent.

Some concepts made the difference ...

... & will stay for longer.

The legacy of the "2010s FP revolution" will prevail - some of the lessons we've all learned have had a significant impact on the way we produce software (& will quite likely keep shaping it in the future):

  • immutability - devs have learned that it doesn't have to be as memory-exhausting (when done smartly), while at the same time it makes a huge difference for scalability
  • data/behavior split & function execution pipelining - especially for the sake of increased business logic testability
  • lambdas & higher-order functions - the new way of composing business logic, after the disillusionment with OO inheritance

The funny thing is that the challenges that were here 3 years ago are still valid. We have to live with the consequences of the discontinuity of Moore's Law, our infrastructure is far more multi-core than it ever was & the services we're building are getting even more global as borders have become truly virtual (at least in IT). One can only wonder what the next "silver bullet" to beat these challenges will be ...

Pic: © rosinka79 - Fotolia.com