(Replying to PARENT post)
If we're talking about a longer format, such as a book, then we might consider digging deeper and implementing as much as possible using the barest of Python requirements. Indeed, Joel Grus does implement everything from scratch in his great (although a bit dated) book https://www.amazon.com/Data-Science-Scratch-Principles-Pytho....
EDIT: This is still a work in progress (and relies on numpy and matplotlib), but here is my version: https://github.com/DataForScience/DeepLearning. The notebooks are meant as support for a webinar, so they may not be the clearest on their own, but you also have the slides there.
(Replying to PARENT post)
The problem is that it's extremely hard to make it efficient. Dozens of person-years have gone into optimizing linear algebra libraries, and only a handful of linalg libraries have competitive performance. It was my college project to make a fast linalg library, and boy, is it fast. For something like matrix multiplication, the trivial algorithm implemented in C can take >2 minutes, while with some tricks you can get the same result in <1 second (vectorization, OpenMP, handwritten assembly, automatically tuned code, a better algorithm, ...).
So, if you want to implement linalg in some language and compile it, go ahead, more power to you. But it's basically impossible to do it efficiently. My opinion is: this is fine and we should do it anyway. There should be linalg libraries written in pure Python (even if they are 1000x slower than LAPACK); just understand that it's impossible to cover all the use cases of numpy this way (at least currently).
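To make the scale of that gap concrete, here is a minimal sketch comparing a pure-Python triple-loop matmul against numpy's BLAS-backed @ operator. The exact numbers depend on your machine, the matrix size, and which BLAS your numpy build links against, but the ratio is routinely in the hundreds to thousands:

    import time
    import numpy as np

    def matmul_naive(a, b):
        # Textbook triple loop on lists of lists: out[i][j] = sum over t of a[i][t] * b[t][j]
        n, k, m = len(a), len(b), len(b[0])
        out = [[0.0] * m for _ in range(n)]
        for i in range(n):
            for j in range(m):
                s = 0.0
                for t in range(k):
                    s += a[i][t] * b[t][j]
                out[i][j] = s
        return out

    n = 300
    a, b = np.random.rand(n, n), np.random.rand(n, n)

    t0 = time.perf_counter()
    matmul_naive(a.tolist(), b.tolist())
    t_naive = time.perf_counter() - t0

    t0 = time.perf_counter()
    a @ b  # dispatches to whatever optimized BLAS numpy was built against
    t_blas = time.perf_counter() - t0

    print(f"naive: {t_naive:.3f}s  numpy: {t_blas:.5f}s  ratio ~{t_naive / t_blas:.0f}x")

And that is before the cache blocking, SIMD, and threading tricks mentioned above, which is exactly why it's fine to learn from the slow version and use the optimized libraries for real work.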
(Replying to PARENT post)
If you're not used to working with matrices, simply reading the Wikipedia article might tell you enough to implement them yourself.
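For what it's worth, the definitions in that article really are enough to get started. A minimal sketch using plain lists of lists (no error handling, and the names are my own):

    # Straight from the definitions:
    # (A + B)[i][j] = A[i][j] + B[i][j]
    # (A B)[i][j]  = sum over k of A[i][k] * B[k][j]

    def mat_add(a, b):
        return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

    def mat_mul(a, b):
        cols_b = list(zip(*b))  # transpose b so its columns are easy to walk
        return [[sum(x * y for x, y in zip(row, col)) for col in cols_b]
                for row in a]

    A = [[1, 2], [3, 4]]
    B = [[5, 6], [7, 8]]
    print(mat_add(A, B))  # [[6, 8], [10, 12]]
    print(mat_mul(A, B))  # [[19, 22], [43, 50]]

Whether that is fast or numerically careful enough is a different question (see the sibling comment about LAPACK), but it's certainly enough to learn from.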
(Replying to PARENT post)
I'm a C# developer, and I'm sure it would take me all of about 30 seconds to install a matrix multiplication package through NuGet. I'm sure it would be immediately obvious how to add items to matrices or do a dot product.
(Replying to PARENT post)
So much learning that we're missing by not going through this step.
(Replying to PARENT post)
It's a legitimately valid part of machine learning, and it's not easy for novices to do.
And I need help putting it on my badger, damn it!