(Replying to PARENT post)

I'm curious whether those with more professional exposure to Rust are concerned about this, or think it's just noise?

I'd also be curious if anyone has any good explanations as to why a choice like this would be made?

👤Icathian🕑2y🔼0🗨️0

(Replying to PARENT post)

We mainly use high-level languages like TypeScript or Python because they work well for us. We do use C/C++ from time to time though, both for performance-critical bottlenecks and for embedded work, since we deal with solar plant technology. A couple of years ago we looked into Rust and found it a mix of nice and wanting: it's sort of great for low-level programming and not really worth your time for anything else. Or at least that's what we decided. We did adopt it and built a few solar inverter applications with Rust where we might've used C/C++ before, and it's a decent language and nice to work with.

But then a few months back, or maybe it was longer, some drama hit the ecosystem. It was so unimportant to us that I can't even begin to remember what it was about, but like this Serde "drama" it speaks to an immature ecosystem and community. From a purely risk-management standpoint we decided to stop using Rust for the time being. That probably sounds more dramatic than it is, because we don't really care about most of these issues and we're probably never even going to be affected by them. But why would we want to spend time and energy on a technology which might still change a lot? The quick answer is that we won't.

Take this Serde issue as an example: it's a very widely used package. I'm not sure exactly what people use it for - I assume it's for things we would likely build in Python - but even with 170 million users its main maintainer seems to be an issue. I even agree with the maintainer, but if the Rust ecosystem doesn't have someone who can fork it and maintain a non-binary version to meet the wants and needs of all these noise makers, well, then that's not exactly a mature ecosystem.

So I think the answer is that it’s mostly just noise, but it’s noise you probably don’t want to have to deal with unless you’re invested in Rust itself.

👤devjab🕑2y🔼0🗨️0

(Replying to PARENT post)

The tinfoil hat explanation is that the developer is breaking all projects that depend on the default package manager (Cargo), forcing them to move to Buck or Bazel - projects that the developer of serde_derive has invested work into.

This is not just noise - the expectation is that all dependencies are compiled from source on your machine, and here the developer forces the use of a precompiled binary, with no alternative.

👤dcan🕑2y🔼0🗨️0

(Replying to PARENT post)

Not a Rust developer by trade, but from the offending pull request (https://github.com/serde-rs/serde/pull/2514#issue-1810688422) it seems to offer a 10x improvement on first compiles, with a small 3ms overhead on incremental compiles.

Is it worth it? Maybe. Downloading blobs isn't ideal but could be made easier in the future. Ideally a hack like this wouldn't be needed and we'd have comptime serde.

👤Ygg2🕑2y🔼0🗨️0

(Replying to PARENT post)

> I'd also be curious if anyone has any good explanations as to why a choice like this would be made?

It has to do with how Rust builds are structured.

The compilation unit is the crate, and the result of compiling it depends on the source code, the version of the compiler, the compiler options, the target triple, and the set of enabled features (think #ifdef FEATURE_FLAG).
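
To make the feature part concrete, here's a rough sketch of the usual #[cfg(feature = "...")] pattern - the "fast-math" feature name is made up, and it would be declared under [features] in Cargo.toml:

```rust
// A rough Rust analogue of #ifdef FEATURE_FLAG: which code gets compiled
// (and therefore what the compiled artifact contains) changes with the
// enabled features. Assumes a hypothetical `fast-math` feature declared
// under [features] in this crate's Cargo.toml.

#[cfg(feature = "fast-math")]
pub fn dot(a: &[f32], b: &[f32]) -> f32 {
    // Only compiled when the feature is on, e.g. `cargo build --features fast-math`.
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

#[cfg(not(feature = "fast-math"))]
pub fn dot(a: &[f32], b: &[f32]) -> f32 {
    // Default implementation compiled when the feature is off.
    let mut acc = 0.0;
    for (x, y) in a.iter().zip(b) {
        acc += x * y;
    }
    acc
}
```

Same source, different features, different artifact - and the same goes for compiler version, options, and target.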

Since there's an exponential blowup in the possible build configurations of a single crate, the largest package registry (crates.io) does not host compiled crates. It only stores their source and metadata, and when Cargo (the package manager) does dependency resolution, it uses that metadata alone - your build system (also Cargo, most of the time) then has to actually build all of those artifacts.

Seems pretty reasonable, right?

There's a special kind of crate called a proc_macro crate (a procedural macro). When you use one of these crates you're not adding its code to your compiled output - you're adding a compiler extension. The compiled proc_macro crate is loaded by the compiler and used to generate code in your crate, which the compiler then compiles as usual.
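
To give a feel for what that looks like, here's a minimal toy sketch of a derive-style proc_macro crate. The Hello derive and the string-based "parsing" are made up for illustration - real derives like serde's parse the input properly with the syn crate:

```rust
// lib.rs of a hypothetical proc-macro crate (its Cargo.toml would set
// `proc-macro = true` under [lib]). This crate is compiled for the *host*
// and loaded by the compiler; it is not linked into your program.

use proc_macro::TokenStream;

/// A toy derive: `#[derive(Hello)]` on `struct Foo` generates
/// `impl Foo { pub fn hello() -> &'static str { "hello from Foo" } }`.
#[proc_macro_derive(Hello)]
pub fn derive_hello(input: TokenStream) -> TokenStream {
    // Grab the identifier right after `struct`/`enum` to keep the sketch
    // short; real derives use the `syn` crate for proper parsing.
    let name = input
        .into_iter()
        .map(|tt| tt.to_string())
        .skip_while(|t| t != "struct" && t != "enum")
        .nth(1)
        .unwrap_or_else(|| "Unknown".to_string());

    // Emit new Rust code back to the compiler as a token stream.
    format!(
        "impl {name} {{ pub fn hello() -> &'static str {{ \"hello from {name}\" }} }}"
    )
    .parse()
    .unwrap()
}
```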

One of the reasons Rust build times are long (particularly for clean builds) is that the compiler first has to build all these compiler extensions and then use them to generate the code it actually compiles. These crates are super useful, however - serde, for example, generates serializers/deserializers that can be used with a wide variety of encodings (XML, JSON, TOML, various binary formats, and so on).
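
Typical serde usage looks something like this (assuming serde with the "derive" feature plus serde_json as dependencies) - the derive runs at compile time and generates the impls that serde_json then drives at runtime:

```rust
// The proc-macro derive generates the Serialize / Deserialize impls at
// compile time; a format crate (serde_json here) then uses those impls.
// Assumes `serde = { version = "1", features = ["derive"] }` and
// `serde_json = "1"` in Cargo.toml.

use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Debug)]
struct Inverter {
    id: u32,
    output_kw: f64,
}

fn main() -> Result<(), serde_json::Error> {
    let inv = Inverter { id: 7, output_kw: 3.2 };

    // Uses the generated Serialize impl.
    let json = serde_json::to_string(&inv)?;
    println!("{json}");

    // Uses the generated Deserialize impl.
    let back: Inverter = serde_json::from_str(&json)?;
    println!("{back:?}");

    Ok(())
}
```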

It would seem that the same rules apply to proc_macro crates as to normal ones - exponential blowup across compiler version, target triple, and features - but that's not really true in practice. proc_macro crates usually don't have features, the compiler ABI is stable enough that they probably won't break across versions, and almost everyone and their mother is compiling on a handful of host systems (x86_64-linux being the most important one).

So while it's true that crates.io probably shouldn't cache every crate for every permutation of compiler/target/features/etc., a few crates are so pervasive and built in such a narrow set of configurations that it should be possible for the registry to host prebuilt versions of those.

In fact, some build systems can do this automatically, but Cargo can't. Cargo does, however, have a pre-build hook (build.rs) that works much like a proc macro - it's compiled and run as part of your build - and it can be used to pull in a pre-compiled proc macro binary, so widely used crates can use this hack to get around Cargo's limitations.
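
As a sketch of that hack - not serde_derive's actual implementation; the precompiled/ directory and the cfg name below are made up - a build.rs can check the host triple Cargo hands it and flip a cfg flag when a bundled binary is available:

```rust
// build.rs: a minimal sketch of the "precompiled proc macro" pattern
// described above, NOT serde_derive's actual implementation. It checks
// whether a bundled binary exists for the host triple and, if so, sets a
// cfg flag that the rest of the crate can use to shell out to that binary
// instead of expanding macros in pure Rust. The `precompiled/` directory
// and the `use_precompiled` cfg name are hypothetical.

use std::env;
use std::path::Path;

fn main() {
    // HOST is the triple of the machine running the build; proc macros run
    // on the host, so that's the binary we'd need. Cargo sets this variable
    // for build scripts.
    let host = env::var("HOST").unwrap_or_default();

    let candidate = Path::new("precompiled").join(format!("macro-expander-{host}"));

    if candidate.exists() {
        // Gate the binary-backed code path in lib.rs behind #[cfg(use_precompiled)].
        println!("cargo:rustc-cfg=use_precompiled");
    }
    // Otherwise nothing is emitted and the crate falls back to compiling the
    // macro implementation from source.

    // Re-run the build script only if the bundled binaries change.
    println!("cargo:rerun-if-changed=precompiled");
}
```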

My personal take is that the reason crates.io and Cargo haven't been upgraded to do this - despite years of people complaining and known solutions existing - is mostly personal/political. There aren't many people willing to put in the work, and the teams haven't been open to contributions of this kind for a long time. For example, they don't consider Cargo to be a build system, so a lot of this is "out of scope" for Cargo.

👤duped🕑2y🔼0🗨️0

(Replying to PARENT post)

It’s all in the article.
👤adastra22🕑2y🔼0🗨️0