(Replying to PARENT post)
It shows that Python can be fast enough if you leave the heavy computational parts to libraries written in other languages. That's interesting, but it doesn't say much about the speed of Python itself.
(Replying to PARENT post)
However, even "real world" anecdotes in this area can be a minefield.
Take, for example, an existing Python application that's slow and requires a rewrite to fix fundamental architectural problems.
Because you feel you don't necessarily need the flexibility of Python the second time around (as you've moved out of the experimental or exploratory phase of development), you decide to rewrite it in, say, Go, or D, or $whatever.
The finished result turns out to be 100X faster (which is great!) but the danger is always there that you internalise or condense that as "lamby rewrote Python system X in Go and it was 100X faster!"
(Replying to PARENT post)
If my C is 1000x faster and saves me 60 seconds every time I run the program, but takes an extra 2 days to write initially, and the program is seeing lots of edits (so on average I have to wait 2 minutes for it to compile), then I am MUCH better off with the slower MATLAB until I am running the same thing a few thousand times.
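To make that arithmetic concrete, here's a back-of-the-envelope sketch in Python using the numbers above; the count of edit cycles is my own assumption:

    # Break-even point for the C rewrite (100 edit cycles is an assumption)
    seconds_saved_per_run = 60            # C saves 60 s per invocation
    extra_dev_seconds = 2 * 24 * 3600     # 2 extra days to write the C version
    compile_wait_seconds = 120            # 2-minute compile after each edit
    num_edits = 100                       # assumed number of edit cycles

    overhead = extra_dev_seconds + num_edits * compile_wait_seconds
    print(overhead / seconds_saved_per_run)  # ~3080 runs before C pays off

Which lines up with "a few thousand times".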
Plus there is the fact that I can look at HN while a slightly slower program is running, so I win both ways.
(Replying to PARENT post)
Why would you use Numpy for arrays that small? Oh, looks like someone actually just wrote it in CPython, no Numpy, and it clocked in at 0.283s. Which is fine. It's Python.
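For anyone curious why NumPy loses here, a quick illustrative microbenchmark (the array size is an assumption, not the one from the article): per-call dispatch overhead swamps the vectorised math when arrays are tiny.

    # Tiny arrays: NumPy's per-call overhead vs. a plain Python list
    import timeit

    np_time = timeit.timeit(
        "a.sum()",
        setup="import numpy as np; a = np.array([1.0, 2.0, 3.0])",
        number=1_000_000)
    py_time = timeit.timeit(
        "sum(a)",
        setup="a = [1.0, 2.0, 3.0]",
        number=1_000_000)
    print(np_time, py_time)  # the plain list typically wins at this size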
This thread reminds me of the scene in RoboCop where Peter Weller gets shot to pieces. Peter Weller is Python and the criminals are the other languages.
(Replying to PARENT post)
Not that Python is fast; it isn't. And using numpy seems a bit disingenuous anyway: "Oh, my Python program is faster because I use a library that's 95% C."
(Replying to PARENT post)
If you enjoyed this Python optimization, you may also enjoy: http://stackoverflow.com/questions/17529342/
This sort of thing comes up a lot: people write mathematical code that is gratuitously inefficient, very often simply because they use a lot of loops, repeated computations, and improper data structures. So it's pretty much the same as any other language, plus the extra subtlety of knowing how and why to use NumPy (as it turned out, this was not a good time for it, though that was not obvious).
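As a generic sketch of that pattern (not the code from the linked question), here's a repeated computation hoisted out of a loop; the function names are illustrative:

    import math

    def slow_normalise(values):
        # recomputes the norm for every element: O(n^2)
        return [v / math.sqrt(sum(x * x for x in values)) for v in values]

    def fast_normalise(values):
        # computes the norm once: O(n)
        norm = math.sqrt(sum(x * x for x in values))
        return [v / norm for v in values]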
(Replying to PARENT post)
"How fast is the code produced by your compiler."
I keep seeing this misconception about languages vs implementations.
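A minimal sketch of the point, assuming you have both CPython and PyPy installed:

    # fib.py -- identical source, multiple implementations
    def fib(n):
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    if __name__ == "__main__":
        print(fib(100_000))

    # $ python3 fib.py   # CPython, the reference implementation
    # $ pypy3 fib.py     # PyPy: same language, JIT-compiled, often much faster on loops like this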
EDIT: Clarified what my original remark meant.
(Replying to PARENT post)
Different things use different types of randomness. Some are fast. Some are slow. If your comparison is not using the same type of randomness, that comparison is comparatively useless.
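For instance, in Python's own stdlib the default Mersenne Twister is much cheaper per call than the OS-backed cryptographic source; a benchmark mixing the two mostly measures that gap:

    import timeit

    fast = timeit.timeit(
        "r.random()",
        setup="import random; r = random.Random()",        # Mersenne Twister
        number=100_000)
    slow = timeit.timeit(
        "r.random()",
        setup="import random; r = random.SystemRandom()",  # OS entropy (os.urandom)
        number=100_000)
    print(fast, slow)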
(Replying to PARENT post)
If your problem is numerical in nature, you can call popular C modules (numpy, etc.) or write your own.
If your functions and data are pickleable, you can use multiprocessing (see the sketch below), but you run into Amdahl's Law.
Maybe you try Celery / Gearman, introducing IO bottlenecks as data is transferred to workers.
Otherwise you might end up with PyPy (poor CPython extension module support) and still be restricted by the GIL. Or you'll try Cython, a bastard of C and Python.
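A minimal multiprocessing sketch of that pattern; the worker function is illustrative, and it only helps to the extent the work is parallelisable (Amdahl's Law):

    from multiprocessing import Pool

    def cpu_bound(n):
        # must be a module-level (pickleable) function
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        with Pool(processes=4) as pool:                 # sidesteps the GIL via processes
            results = pool.map(cpu_bound, [10**6] * 8)  # arguments must pickle too
        print(sum(results))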
Python has been my primary language for the past few years and it's great for exploratory coding, prototypes, and smaller projects. However, it's starting to lose some of its charm. Julia is filling in as a great substitute for scientific coding in a single-language stack, and Go / Rust / Haskell for the other stuff. I've switched back to the static-language camp after working in a multi-MLOC Python codebase.