https://github.com/FFmpeg/FFmpeg/blob/master/libavutil/x86/x...
https://github.com/FFmpeg/FFmpeg/blob/master/libavutil/x86/x...
It looks like the isolated-vm package is the go-to, but understandably it prevents things like fetch or importing packages.
I'm thinking of using Docker with a single base image that exposes an API: take an arbitrary string, check for and install its imports, then eval (eesh) the code. But before going down the road of implementing that myself and going crazy over properly securing the containers, I figure there has got to be some prior art. How are Codesandbox et al. doing it?
Second off, I didn't realize how deep the dep tree would be for this type of program -- 141 total! So much of it is the url crate, itself a dep of the git crate, but there's a bunch of others too. I'm just getting into learning Rust -- is this typical of Rust projects or perhaps typical of TUI projects in general?
(EDIT to strikeout) ~~The binary is also 53M as a result whereas /usr/sbin/tree is 80K on my machine -- not really a problem on today's storage, but very roughly 500-1000x different in size isn't nothing.~~
Maybe it's linking-related? I don't know how to check really.
(EDIT: many have pointed out that you can run `cargo build --release` with other options to get a much smaller binary. Thanks for teaching me!)
cargo build --release
du -sh ./target/release/lstr -> 4.4M
Building with other release options brings it down to 2.3M:

[profile.release]
codegen-units = 1
opt-level = "s"
lto = true
panic = "abort"
strip = "symbols"
The same parties voted in 2011 to introduce mass data storage, where all international internet traffic can be stored and kept for 6 months by the state.
I see no reason to believe that either party would protect the right to private communication or internet use.
Actually, the intensional model doesn't improve matters at all here, I was only pointing it out to demonstrate that the fact that binary components are addressed by their inputs doesn't really have anything to do with reproducibility. Of course the intensional model would mean that if you made the same build twice and it wasn't reproducible, then you'd get a different hash; however, that's not really an improvement over the current approach, which is to just build it twice and compare the output results. If anything, it just makes things more convoluted for reproducibility, due to the fact that you have to factor out self-references to check the hash.
The main advantage of the intensional model, as far as I know, is that it simplifies the trust model a bit. In the extensional model, you have to trust the substitutor, otherwise it could poison your Nix store. In the intensional model, derivations are addressed by their contents, so it's impossible to really poison the Nix store per-se, since you can definitely validate that the store path is correct for a content-addressed path.
Really though, it doesn't have a lot to do with reproducibility, and even in the work done in recent years I've not seen it mentioned at all in relation to reproducible builds, though I fully admit that it's very possible it's somehow useful and I just missed it.
> If the entire process of building an artifact is pure, then the artifact would be entirely reproducible, given that you have access to the same inputs.
That is true. Nix, though, explicitly makes only certain parts of the process pure, and the parts it makes pure are specifically driven by the motivations outlined above. It is true that if you made the entire process completely pure, the build would be reproducible, and it is also true that Nix very intentionally does not try to do this, because it simply wasn't in the list of problems Nix was solving at the time.
Likewise, though, you can still make a build reproducible without functional purity, which is exactly what various other reproducible-build projects have done. They just avoid the impurities that would impact the result, without any systematic guarantees, which happens to be exactly what you have to do to make a build reproducible in Nix, too.
> Yes, there are many ways to introduce impurity, however claiming that Nix, as a purely functional software distribution model where the central point is to achieve purity, is fully orthogonal to reproducible builds seems incorrect.
I don't know what "fully orthogonal" means relative to just "orthogonal". I am using "orthogonal" to mean "independent", i.e. what Nix solves is fully independent of reproducible builds. This follows because:
- It is possible to do reproducible builds without "purity" guarantees or sandboxing.
- It is possible to have builds that are not reproducible provided the purity guarantees of Nix.
The closest that Nix's purity comes to being related to reproducible builds is that it does, in fact, prevent some causes of unreproducible builds by pinning the exact inputs, but that's why I consider it to be adjacent yet ultimately orthogonal. If it's the word "orthogonal" in particular that's too strong, then maybe a better word would be "independent".
However, since the point of Nix is to achieve reproducible software deployments through a pure, functional description of the deployment, and since it also provides mechanisms that systematically improve build reproducibility, I feel that "orthogonal" is misleading (which was my original disagreement): reproducible builds correlate with Nix, and reproducible software deployment is a clear original goal of Nix, e.g. in its efforts to remove implicit dependencies through mechanisms such as sandboxing of builds.
Sorry, but you pretty much quoted what I would've quoted to refute your claim: it's exactly right that binary components are uniquely defined by their declared inputs, but note the subtle consequence that has: they are not defined at all by their outputs, only by their declared inputs. (Making them be defined by their outputs is entirely possible, FWIW; that's pretty much what content-addressed Nix paths are for.) This also applies recursively: the bit-exactness of external inputs is guaranteed by cryptographic hashes, but if any of your inputs are themselves derivations, you can trivially add impurities, because the system is simply not designed to stop you from doing this. An example:
stdenv.mkDerivation {
  name = "trivialimpurity";
  # the output depends on the current time: an impurity
  # that input-addressing does nothing to detect
  unpackPhase = "date > $out";
}
It would be quite possible to make a system that is specifically designed to be resistant to this; consider that Nix goes to great lengths to accomplish what it does, literally ripping apart ELF binaries to force dynamic symbols to be resolved in a static manner. There's no reason why you couldn't go further and force deeper reproducibility in a similar vein; it's just that Nix doesn't.

I think it's actually OK that Nix doesn't attempt to solve this problem, because it is a pretty hairy one. Obviously you can't practically make the build environment 100% reproducible, as it would be painfully slow and weird, so the best you could probably do is intercept syscalls and try to make threading behavior deterministic (which would also make builds slow as balls, but less slow than running every build in unaccelerated qemu or something like that). What Nix does do is solve the living shit out of the problem it was designed to solve, something you can feel very viscerally when you compare the state of Nixpkgs to the state of the AUR (a package repo I consider to be very good, but one that can give some perspective on the problems Nix is designed to solve, based on some of the problems I've run into with it).
However, Nix specifically attempts to facilitate a pure software distribution model, which is why it does everything it does, e.g. forcing all input files to be copied to the store and providing mechanisms such as pure evaluation mode, which restricts access to the current system and time information. Yes, there are other ways to introduce impurities, but Nix tries in many ways to systematically remove these sources to increase the purity of the deployment process.
If the entire process of building an artifact is pure, then the artifact would be entirely reproducible, given that you have access to the same inputs. Yes, there are many ways to introduce impurity; however, claiming that Nix, as a purely functional software distribution model where the central point is to achieve purity, is fully orthogonal to reproducible builds seems incorrect.
Hackers screenshot + file system context = Ideal navigation
https://i.imgur.com/dBXdcd9.png