#469 closed task (fixed)
build pycryptopp+zfec debs for hardy
| Reported by: | warner | Owned by: | zooko |
|---|---|---|---|
| Priority: | critical | Milestone: | 1.3.0 |
| Component: | packaging | Version: | 1.1.0 |
| Keywords: | | Cc: | |
| Launchpad Bug: | | | |
Description (last modified by warner)
We've got a hardy buildslave set up, and it's producing tahoe debs, but these are uninstallable because they depend upon python-pycryptopp and python-zfec, which have not been packaged for hardy yet. It looks like python-pyutil and python-argparse might be required too.
This ticket will be resolved when suitable debs are installed in our APT repository (dev:~buildslave/tahoe-debs). deharo1 is available for building the packages, deharo2 is reserved for testing their installation (deharo2 pulls from the tahoe-debs APT repo).
Change History (6)
comment:1 Changed at 2008-06-18T21:22:27Z by zooko
- Status changed from new to assigned
comment:2 Changed at 2008-07-15T23:08:07Z by zandr
comment:3 Changed at 2008-08-06T19:11:52Z by warner
- Priority changed from major to critical
We hit this again today, trying to set up prodweb3 as an incident-gatherer server.
Zooko, what's your ETA on this one?
comment:4 Changed at 2008-08-06T19:19:44Z by warner
- Description modified (diff)
- Milestone changed from undecided to 1.2.1
It looks like python-pyutil and python-argparse are required too; at least, the gutsy debs I found depend on them, so we need those packaged as well.
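For reference, the dependency relationship described above would show up in the tahoe package's debian/control. This is a sketch only; the package names come from this ticket, but the exact stanza and any version constraints in the real packaging may differ:

```
Package: tahoe
Depends: python-pycryptopp, python-zfec, python-pyutil, python-argparse
```

An `apt-get install tahoe` on hardy fails until every package named in that Depends line is available from the configured repositories, which is why the missing pycryptopp/zfec/pyutil/argparse debs made the tahoe deb uninstallable.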
comment:5 Changed at 2008-08-06T19:55:41Z by zooko
- Resolution set to fixed
- Status changed from assigned to closed
This has been fixed by cp'ing all the gutsy debs into the hardy repository.
See also the better, long-term fix: #498 (automate the production of .deb's of dependent libraries).
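The fix applied here amounts to reusing the existing gutsy packages for hardy. A minimal sketch of the idea follows; the directory layout, filenames, and versions are made up for illustration (the real repo lives at dev:~buildslave/tahoe-debs), and a real APT repository would also need its Packages index regenerated afterwards (e.g. with dpkg-scanpackages):

```shell
#!/bin/sh
set -e
# Hypothetical stand-in for the tahoe-debs APT repo layout.
REPO=/tmp/tahoe-debs-demo
mkdir -p "$REPO/gutsy" "$REPO/hardy"
# Stand-in .deb files so the demo is self-contained.
touch "$REPO/gutsy/python-pycryptopp_0.5.1_i386.deb" \
      "$REPO/gutsy/python-zfec_1.4.0_i386.deb"
# The fix from this comment: copy the gutsy debs into the hardy repo.
cp "$REPO"/gutsy/*.deb "$REPO/hardy/"
ls "$REPO/hardy"
```

This works only as long as the gutsy binaries stay ABI-compatible with hardy's python, which is why #498 (automated per-release builds) is the better long-term fix.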
comment:6 Changed at 2008-09-03T01:16:59Z by warner
- Milestone changed from 1.3.1 to 1.3.0
We hit this again today trying to test our tahoe-server debs on deharo2.
As it stands now, Tahoe is not deployable by ops on Hardy.
This would be blocking deployment of new monitoring if I hadn't already worked around it once.