Ask HN:

"Reasons behind renaissance of functional programming languages"

I've been curious about the renaissance of functional programming languages like Scala and Clojure in the software community. Is this a byproduct of our evolving hardware, where instead of clock speeds the focus has been on pushing out more cores, so to reap the benefits software developers need better abstractions to make their code easily parallelizable? Is there some Erlang-esque promise of pervasive immutability that is vaulting these languages into popularity? Or is it simply that these newer languages have learned from other languages' pain points and amplified the good parts by bundling them together neatly? I don't have anything against these languages; I'm just trying to get a deeper understanding of what appears to be a trend in the programming language space.
๐Ÿ‘คfizwhiz๐Ÿ•‘11y๐Ÿ”ผ101๐Ÿ—จ๏ธ89

(Replying to PARENT post)

There has always been an interest in these languages. What's changed is that using a functional language no longer cripples you. That is, the rise of service-oriented architectures, horizontal scaling, javascript, and REST APIs mean that it is no longer suicide to build your product using a functional language.

20 years ago you needed to write desktop apps, so you were limited to C/C++. Then 15 years ago you needed access to Java libraries, and the only real JVM language was Java.

Now, you just need to build a service that speaks HTTP, so you just need an HTTP server, an HTTP client library, and a database adapter. That means you can build your product in any language you like and not lack the ability to succeed.

CircleCI is written in Clojure. We communicate with all our vendors over simple REST APIs - it only took a few minutes to write our own client bindings for Stripe and Intercom.
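To make "a few minutes to write client bindings" concrete, here is a rough sketch of what such a thin binding can look like. This is not CircleCI's code (theirs is Clojure); it's an illustrative Scala version using the JDK's built-in HTTP client, and the vendor endpoint and field names are made up:

    import java.net.URI
    import java.net.http.{HttpClient, HttpRequest, HttpResponse}

    object VendorClient {
      // One shared client; JDK 11+ ships this, so no third-party HTTP library
      // is needed just to talk to a vendor's REST API.
      private val http = HttpClient.newHttpClient()

      // Hypothetical endpoint and field names: not Stripe's or Intercom's
      // real API, just the shape of a "thin" binding.
      def createCustomer(apiKey: String, email: String): String = {
        val req = HttpRequest.newBuilder(URI.create("https://api.example-vendor.com/v1/customers"))
          .header("Authorization", s"Bearer $apiKey")
          .header("Content-Type", "application/x-www-form-urlencoded")
          .POST(HttpRequest.BodyPublishers.ofString(s"email=$email"))
          .build()
        http.send(req, HttpResponse.BodyHandlers.ofString()).body()
      }
    }

The point is that once the integration surface is "HTTPS plus JSON/form data", the binding is a handful of lines in whatever language you picked.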

๐Ÿ‘คpbiggar๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

I think Haskell has been growing a lot, fueled by a really great base language finally getting a pretty comprehensive set of standard libraries. This was a highly intentional group effort that began probably 10 years ago and pushed the Haskell libraries to real maturity.

I can't truly claim this has driven the adoption of other languages, but Haskell's (or pick some language from the ML roulette) influence on Clojure and Scala is undeniable. Clojure's commitment to immutability pales in comparison to Haskell's, while the latter demonstrates the advantages of such programming in spades. Scala's type system is a novel extension of the HM system which forms the basis of Haskell's.

So am I claiming that the increased availability and "practicality" of Haskell have driven a renaissance in functional programming?

Sort of, yes, but I don't feel it's the core underlying cause; it's more a symptom in its own right. These ideas are the leading edge of programming language research, and they're smart to learn and to integrate into our next generation of tools. That has long been evident from finding them at the center of the "Holy Trinity" of category theory, logic, and type theory. Their "practical" significance is still growing, however.

๐Ÿ‘คtel๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

As a counterpoint I'll explain why I don't use Haskell.

I like Haskell. I like it in theory and I like playing around with it. I like the discipline it requires: having to re-think "everyday" things in order to solve them in a new paradigm. I started with Python and then learned half a dozen other imperative languages, so Haskell is utterly alien to me apart from the most basic concepts like recursion. I liken it somewhat to learning Vim -- you make the simple complicated and gain something in the process. Haskell is the number one language I wish I was working in at any given time.

So why am I not using Haskell? Most of my time is spent on OS X and iOS. While I could shoehorn Haskell into a project, it doesn't make business sense to do so. I couldn't justify the extra time and complexity, and I certainly couldn't justify it to a client. There simply is no good answer to "what about the next developer?" and other related questions. (I think this is why shoehorning a paradigm into an existing language is a much better approach when you have a tight coupling of language and platform -- I'm thinking of RAC here.)

Give me something to work on where Haskell is the obvious choice and I'd be all over it. Until then it has to stay in the toy box.

๐Ÿ‘คmbenjaminsmith๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

Here are some reasons I can think of:

* Parallelization and concurrency do seem to be easier in functional languages with immutable data. Not easier to write per se, but easier to reason about (see the sketch after this list).

* JavaScript was, as far as I can tell, arguably the first truly mainstream language to embrace first-class functions and closures. The ubiquity of JavaScript has exposed many to the benefits of such an approach. Similarly, Rails and other Ruby-based software make lots of use of functional programming through Ruby's do-blocks. These languages have made people more comfortable with first-class functions.

* Languages like Haskell, Scala and Clojure have shown that functional programming can be high-performance and one need not necessarily trade speed for expressiveness. Similarly, many scripting languages (and/or the hardware they run on) have become fast enough to allow for them to be used for high-performance applications.

* Functional programming is fun, and allows for beautiful and succinct code.
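As a rough sketch of the first point above (my example, not the commenter's): with immutable data and pure functions, the parallel version is as easy to reason about as the sequential one, because there is no shared mutable state to race on.

    import scala.concurrent.{Await, Future}
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.concurrent.duration._

    object ImmutableParallel {
      // A pure function over an immutable Vector: no shared mutable state,
      // so it is safe to run from any thread without locks.
      def score(chunk: Vector[Int]): Long = chunk.map(x => x.toLong * x).sum

      def main(args: Array[String]): Unit = {
        val chunks = (1 to 100000).toVector.grouped(10000).toVector
        // The only coordination point is combining the results; correctness
        // reasoning is the same as for the sequential version.
        val total = Future.traverse(chunks)(c => Future(score(c))).map(_.sum)
        println(Await.result(total, 10.seconds))
      }
    }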

๐Ÿ‘คthinkpad20๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

I'm going to argue against the idea that functional languages are flourishing as a result of the availability of parallelism. There are two reasons for this:

* Performance is not that big an issue for many applications. And if performance is not an issue, then neither is parallelism: the only reason to do parallelism is for performance (as opposed to concurrency, which makes sense even on sequential processors). So parallelism only really matters for performance-critical applications, because in most non-performance critical applications, you can squeeze out 10x by just running a profiler and writing better serial code.

* If you do want performance, parallelism alone is not sufficient. On modern processors, data movement is significantly more expensive than computation. And in that regard, most functional languages perform poorly. Imperative languages, for all their faults, give you much more fine grained control over memory usage patterns. So for performance-critical applications (which are again, the only applications where you really care about parallelism), it is not uncommon to see the core of the application written in C/C++, even if the rest of the application is written in something else.

By the way, if you aren't measuring your performance by comparing it against the peak compute/memory bandwidth of the machine, then you don't really care about performance, because you don't really have any idea what you're leaving on the table. This is why it's possible, for many applications, to give the code to an experienced performance programmer and see speedups in excess of 10x.
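To make "comparing against peak" concrete, here is a back-of-envelope sketch; the ~50 GB/s bandwidth figure is an assumed ballpark for a 2014-era machine, not something taken from the comment above.

    object Roofline {
      def main(args: Array[String]): Unit = {
        val bytes      = 1e9             // 1 GB of data, touched once
        val peakBw     = 50e9            // assumed ~50 GB/s peak DRAM bandwidth
        val lowerBound = bytes / peakBw  // ~0.02 s: no code can stream it faster
        println(f"lower bound: $lowerBound%.3f s")
        // If your pass over that 1 GB takes 2 s, you are ~100x off the roofline,
        // before parallelism even enters the picture.
      }
    }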

๐Ÿ‘คeslaught๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

IMO, it's something of a backlash against the rise of class-focused, object-oriented languages (e.g. Java, C++, Ruby) and the complexity they led to over the last decade.

The pendulum is now swinging back the other way. Functional, data-structure-oriented programming has many benefits over old-school procedural programming (e.g. C, Fortran), but it also doesn't have the baggage of mainstream OO languages. Thus, this feels like an appealing -- and new -- ecosystem.

This -- along with a rediscovery of some functional concepts that seem to have more importance in a distributed setting, like immutable data, purity, and higher-order functions -- has led to some renewed interest.

๐Ÿ‘คpixelmonkey๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

People have covered a lot of the rational reasons, so here's an emotional one.

I've been coding a bunch of Rust lately, and I'm happy to discover it has my favourite thing from Haskell: that wonderful "if it compiles, it's usually correct" feeling. My software feels rock solid to me, even while it's being heavily changed.

It almost certainly translates into more reliable software, but even just the feeling is enough to hook me.
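A small sketch of where that feeling comes from, in Scala rather than Rust or Haskell (my example, not the commenter's): once the data model is a closed set of cases, the compiler tells you about every place you forgot to handle.

    // Add a new case to Payment and every non-exhaustive match becomes a
    // compile-time warning (or an error with -Xfatal-warnings) instead of a
    // runtime surprise.
    sealed trait Payment
    final case class Card(number: String) extends Payment
    final case class Invoice(days: Int)   extends Payment

    object Checkout {
      def describe(p: Payment): String = p match {
        case Card(n)    => s"card ending in ${n.takeRight(4)}"
        case Invoice(d) => s"invoice due in $d days"
        // Adding `case object Cash extends Payment` above makes this match
        // non-exhaustive, and the compiler points straight at it.
      }

      def main(args: Array[String]): Unit =
        println(describe(Card("4242424242424242")))
    }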

๐Ÿ‘คbadsock๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

These come to mind (there are probably even more):

1) Complexity: Managing complexity is just easier with compact syntax, immutability, etc. Applications are becoming larger and more complex, and languages that help reduce coupling and code size will become more popular than those that easily let you get the job done at the expense of high complexity (e.g. OO/imperative).

2) Concurrency: With concurrent/multi-threaded applications being par for the course rather than something exotic, functional thinking is now required for many applications regardless of language.

3) Modularity/Decoupling/Interoperability: SOA and APIs mean you can develop independent bits of software that communicate in some language-agnostic way (e.g. HTTP). The CLR and JVM now both support functional languages that work well with their imperative siblings. This means you don't even have to be loosely coupled over HTTP: you can write an application where the logical/computational bits are F# and the UI is C#, all within the same application.

Apart from these we must not forget that almost all the large imperative languages (C++/C#/Java) have gained numerous functional constructs in recent years. Few of us can imagine working in an imperative language without lambdas these days.

So functional thinking has slowly been sneaking into the imperative programmer's day job too. This has made more of us interested in these languages, which consist almost entirely of the bits we find most elegant in our OO languages. It's simply less scary. In 1998 the difference between Haskell and Java 1.1 was quite large. In 2014 the difference between C# 5 and F# 3.1, or between Java 8 and one of the functional JVM languages, is no longer big enough to be scary.
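To illustrate the "lambdas sneaking in" point, here is the same computation written both ways. Both versions are Scala here for brevity, but the second is the same higher-order style that Java 8 streams and C# LINQ adopted from functional languages.

    object LambdaCreep {
      def main(args: Array[String]): Unit = {
        val orders = Vector(12.5, 99.0, 3.0, 250.0, 18.0)

        // The loop most of us wrote before lambdas reached our main language:
        var total = 0.0
        var i = 0
        while (i < orders.length) {
          if (orders(i) > 10.0) total += orders(i)
          i += 1
        }

        // The same thing with first-class functions:
        val total2 = orders.filter(_ > 10.0).sum

        println((total, total2)) // (379.5, 379.5)
      }
    }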

๐Ÿ‘คalkonaut๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

I've read a lot of interesting things here, but no post really covered my perspective, so here it is:

Reason 1: Object-oriented programming became less popular. After the 90s people started to see that OOP is not really the solution that scales to unlimited complexity. It was a step forward, but nothing more. Many different things got tried, like aspect-oriented programming, but none of them really stuck.

Reason 2: The moment we had processors with more than one core, we suddenly had a need for general-purpose parallel processing. While the solution to this now seems to be tools/frameworks like ZeroMQ, another reasonable option was languages with different paradigms, like functional ones.

I think that both these things happening at nearly the same time got functional languages a lot of traction. I have been following this since 2008, though, and the fact that functional languages still haven't made it to the top of the popularity rankings leaves me no longer believing that they ever will. Functional languages have always had this kind of swingy popularity, where a small group of people were strong-hearted followers while the general opinion swung between "this is ridiculous" and "maybe we could use it for that use case".

All in all I would say the question is a great one, now that the current renaissance of functional programming is pretty much over.

๐Ÿ‘คerikb๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

Cynical answer: The consultants have decided that they need new snake oil to sell since the Agile bandwagon is running out of steam.

Hopeful answer: FP languages have gotten enough better than the current mainstream that they can't be ignored.

I believe it is a bit of the Hopeful answer but mostly the Cynical answer. If you compare Java and Clojure (or Scala, for that matter) on the programming language shootout, Clojure is neither more concise nor faster than Java. So why all the hype? As far as I can tell, it is book sales and the fashion cycle.

๐Ÿ‘คstonemetal๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

I suspect there are three main reasons.

Limitations of hardware mean we're increasingly looking at computing in parallel, which is a natural fit for languages that emphasise immutability. Distributed computation also benefits hugely from immutable values, as it sidesteps much of the problem with cache expiry.
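A rough sketch of the cache-expiry point (my illustration in Scala, not the commenter's): when values are immutable and keyed by their content, a cached copy can never go stale, so there is nothing to invalidate.

    import java.security.MessageDigest
    import scala.collection.concurrent.TrieMap

    // If a value can never change after it is written, any node may cache it
    // forever under a content-derived key.
    object ContentCache {
      private val cache = TrieMap.empty[String, Array[Byte]]

      def keyFor(value: Array[Byte]): String =
        MessageDigest.getInstance("SHA-256").digest(value).map("%02x".format(_)).mkString

      def put(value: Array[Byte]): String = {
        val k = keyFor(value)
        cache.putIfAbsent(k, value) // never overwritten, so never stale
        k
      }

      def get(key: String): Option[Array[Byte]] = cache.get(key)
    }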

We're also seeing a demand for more reliable systems. Taking mutability out of the equation eliminates a significant area of possible bugs, and allows for more sophisticated type-checking.

The third reason is simply that it's only recently that we've gotten good functional languages with a comprehensive and mature set of libraries. Haskell has been around for decades, but when I first started learning it seven or eight years ago, the set of libraries on offer was very thin compared to Haskell today. Scala only popped onto the scene 11 years ago, and Clojure only 7 years ago.

๐Ÿ‘คweavejester๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

I would conjecture that your perception of FP languages making progress is an echo of your own development or that of the people immediately surrounding you, and that, possibly, numbers would show there is no increase in FP language adoption.

I'm not taking a stand here, just posing a possible alternative explanation. At least, when I thought about the OP's question, I guessed it was a false generalization of my own progress. Data in (dis)favor of either would be welcome.

๐Ÿ‘คAYBABTME๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

Where would this "renaissance" be taking place? The most popular languages are Java and C...
๐Ÿ‘คjesusmichael๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

It is a side-effect of larger memories.
๐Ÿ‘คdrudru11๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

RAM is no longer a bottleneck. So you can "spend" some RAM in the search for increased programmer productivity.

CPU is not really a bottleneck either, so you can spend a few cycles there as well.

๐Ÿ‘คpatrickg_zill๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

How about a boring theory, Internets.

It's just a subculture that couldn't gain critical mass without the internet. Like furries, it has a wide but sparse appeal.

It's not a simple subculture, though; it takes a while to get into. And unlike being a furry, you really need to do it in your job to be good at it, hence it's only starting to hit its peak.

๐Ÿ‘คaaron695๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

There is a lot more demand for the tasks that FP is good at, such as data processing, where state is a means to an end and hence the program can be stateless.

Contrast this with the 90s, when we needed to write user interfaces where state was the point, or do C++ gamedev.

๐Ÿ‘คfrozenport๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

A bit off-topic:

It would be great if there were a 10 GHz (single-core) CPU for some special domains.

A hardware startup that designs a new CPU and disrupts the status quo would be great. (3 GHz in 2004, 3.8 GHz in 2006, ~4 GHz in 2014.)

Like 3dfx (GPUs) and Cyrix (CPUs) in the 1990s:

http://en.wikipedia.org/wiki/3dfx

http://en.wikipedia.org/wiki/Cyrix

๐Ÿ‘คfrik๐Ÿ•‘11y๐Ÿ”ผ0๐Ÿ—จ๏ธ0