(Replying to PARENT post)
Because JavaScript was, and basically still is, the only option for client-side web development - that is to say, the only option for distributing sandboxed, instantly-updating executable code to users at any scale. Any other language requires me to convince users to download an application that runs with full privileges on their account. (For a while we also had Flash and Java, but then we realized that Flash and Java also imply full privileges in practice, because their sandboxes don't work well.)
Therefore, JS got popular; therefore, people used it. This has very little to do with the quality of the language or its libraries. Note that I am not criticizing the quality any more than I'm praising it, just saying that it's irrelevant as long as it's good enough.
Basically the same thing happened with UNIX and C at a much smaller scale several decades ago, except it wasn't about client-side app development; it was about getting things to run on servers at all. UNIX isn't a fantastic OS. C isn't a fantastic language. But, in the words of Richard Gabriel's "The Rise of Worse is Better", "UNIX and C are the ultimate computer viruses."
https://www.dreamsongs.com/RiseOfWorseIsBetter.html
JavaScript and the dynamic web are the new ultimate computer viruses. But they grew much, much more quickly, and haven't had the benefit of the last forty-ish years to get decently good. UNIX and C in the middle of the UNIX wars were at least as much of a disaster.
(Replying to PARENT post)
(I don't really do PHP, but from talking to colleagues and reading about it, I have a feeling I would find the experience closer to JavaScript.)
(Replying to PARENT post)
You're implying this happened to JS because of some merit. But it didn't; it was just sheer coincidence and bad luck (for most of us at least, including users).
(Replying to PARENT post)
I believe that part of the reason is the ease of access to JS development. It allowed an inordinate number of poorly trained devs to enter the job market. Because you can quickly whip up fancy UIs and show them to unsuspecting non-techie types, you can quickly earn a reputation as a "computer whiz".
I think there were many de facto ways of doing things before this, none of which were perfect, but some of which followed better design principles.
If you look at software industries where security and stability are of utmost importance, you will not find these newer frameworks in use - nor the languages they use. I'm thinking of life-critical systems here.
(Replying to PARENT post)
But something I almost never hear: critics offering a candid explanation for why their platform / language / framework - which maybe wasn't perfect but CERTAINLY was awesome compared to JavaScript - did not become the de facto tool across as diverse a spectrum of uses for delivering features / products / tools to users.
It's always fair to criticize so that we ask ourselves the hard questions. So, it's also worth asking ourselves if we are staring too close to the wall when we try to paint other languages as backwards and therefore bad (or at least a bad choice). Just maybe, the language feature bullet list in your head (and corresponding subtleties of implementation) isn't the thing that matters most. Maybe the thing that really matters is something not as easy to codify as a language tool chain, like human consensus.
Also, it is a false association to suggest that quantity != quality is somehow particular to JavaScript. Infinite monkeys will produce infinite crap no matter what brand of typewriter you give them.