Jeremy Stanley wrote:
[...] For artifacts we upload to third-party services like PyPI and Docker Hub, on the other hand, assuming I've digested (pun intended) the relevant literature correctly, it might make more sense for the maintainers of those services to do something similar: they tend to perform a fair amount of URL indirection, so trying to maintain historical data for those URLs ourselves could be tricky. If those third-party services were to integrate rget updating into their own infrastructure, it would be a lot more seamless (especially if they similarly integrated CT checks into the corresponding client-side tooling).
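For context, the client-side check such tooling performs ultimately reduces to comparing a SHA2-256 digest of the downloaded artifact against a digest recorded in a transparency log. Here is a minimal sketch of just that comparison step in Python; the log lookup itself is elided, and the URL and digest are placeholders, not real values:

    import hashlib
    import urllib.request

    # Hypothetical artifact URL and placeholder digest: in rget's model
    # the expected digest would be looked up in (and proven present in)
    # a transparency log rather than hard-coded like this.
    URL = "https://releases.example.org/project-1.0.0.tar.gz"
    EXPECTED_SHA256 = "0" * 64  # placeholder

    def verify(url: str, expected: str) -> bool:
        digest = hashlib.sha256()
        with urllib.request.urlopen(url) as resp:
            # Stream in chunks so large artifacts are not held in memory.
            for chunk in iter(lambda: resp.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest() == expected

    if not verify(URL, EXPECTED_SHA256):
        raise SystemExit("digest mismatch: refusing to trust the artifact")

The URL indirection problem mentioned above is exactly why this is hard to do from the outside: the mapping from a stable URL to the bytes actually served can change under the publisher's control, so the publisher is best placed to keep the log honest.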
Another challenge I see: since most of what we host is source code, and most consumers of our source code obtain it via Git rather than via release artifacts, rget wouldn't really do much for them as far as I can see... though once Git completes its planned transition to SHA2-256 in the coming years, I could see a call for some solution that publishes known object hashes to a CT log in a similar fashion. I suppose it could be done now by re-checksumming all the content of a Git object with SHA2-256 and submitting a certificate for that, but it seems a bit heavyweight, and I'll admit I haven't thought it through completely, so there are likely hidden gotchas with that idea.
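As a rough illustration of that "re-checksumming" idea (a sketch only, assuming one acceptable interpretation: hash all content reachable from a ref, independent of Git's internal SHA-1 object names; the tag name below is hypothetical):

    import hashlib
    import subprocess

    def sha256_of_tree(ref: str) -> str:
        # Re-checksum all content reachable from a Git ref by hashing a
        # tar archive of its tree with SHA2-256, sidestepping the SHA-1
        # object names Git uses internally today.
        proc = subprocess.run(
            ["git", "archive", "--format=tar", ref],
            check=True, capture_output=True,
        )
        return hashlib.sha256(proc.stdout).hexdigest()

    # Hypothetical release tag; the resulting digest is what would be
    # submitted to the CT log.
    print(sha256_of_tree("refs/tags/1.0.0"))

One of the hidden gotchas shows up immediately: whether the archive byte stream stays stable across Git versions is not formally guaranteed, so the published digest could drift without any change to the source.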
I agree with Jeremy: it seems to cover a limited number of use cases (people who download tarball source releases from releases.o.o). But covering only a few use cases is not a reason not to do it: we should support it for the same reason we provide signatures for released artifacts today. Furthermore, it is an initiative I'm fine with us being early adopters of, if only so that one day we may find it covering other ways to retrieve our deliverables (PyPI, Git). -- Thierry Carrez (ttx)