That's pretty interesting. I wanted to avoid standing up a dedicated yum
server for our handful of packages, so I stuck them in a yum-like structure
in S3 and they get synced down to the clients with s3cmd sync as part of our
recipes (as well as by a daily cronjob):
https://gist.github.com/866206
This installs the base repo files and does some cleanup on repos that
were added before I got here.
So basically each server has a copy of our packages on the EC2 /mnt
volume. I just build them locally in a VM for each arch.
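
For illustration, here is a minimal sketch of a recipe along these lines.
The bucket name, local path, cron schedule, and repo file are placeholders,
not the values from the gist above, and it assumes s3cmd is installable from
an available repo (e.g. EPEL) with its credentials already configured:

    # Sketch of the "yum-like tree in S3, synced locally" approach.
    # Paths and names are placeholders; the gist linked above is the
    # real recipe and may differ.

    package "s3cmd"

    directory "/mnt/yum-local" do
      recursive true
    end

    sync_cmd = "s3cmd sync s3://example-packages-bucket/repo/ /mnt/yum-local/"

    # Pull the package tree down during the Chef run...
    execute "sync-local-yum-tree" do
      command sync_cmd
    end

    # ...and keep it fresh with a daily cron job.
    cron "sync-local-yum-tree" do
      minute "0"
      hour "4"
      command sync_cmd
    end

    # Ship a .repo file (templates/default/local.repo.erb in this sketch)
    # pointing yum at the synced tree, e.g. baseurl=file:///mnt/yum-local.
    template "/etc/yum.repos.d/local.repo" do
      source "local.repo.erb"
      mode "0644"
    end
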
On Fri, Mar 11, 2011 at 11:57 AM, Charles Duffy wrote:
> I have a "cookbook_rpms" recipe I use for this purpose. Note that, as it uses
> yum localinstall, it requires that your packages be signed (but this is
> trivial -- you can just run find . -name '*.rpm' -exec rpm --addsign {} +
> to sign everything in your repo once you've set up a key pair).
> See https://github.com/Tippr/tippr-public-cookbooks/tree/master/cookbook_rpms
>
> On Fri, Mar 11, 2011 at 7:36 AM, H wrote:
>>
>> Hello,
>>
>> I've just started evaluating Chef - so this may be something that's already
>> covered in the documentation. However, I've not been able to find it - so
>> I'm asking here.
>>
>> I've a set of custom software packages (let's say they can be downloaded
>> from a certain location on Amazon S3) which I want to install as part of my
>> automated machine setup. These are not available in any Linux distro's
>> standard repository, so apt-get/yum won't work.
>>
>> What is the fastest way to get this working? As I see it, it boils down to
>> running a custom script. How easy or how hard is it to do in Chef?
>>
>> - H
>
>
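
For comparison with the quoted approach, here is a rough sketch of driving
yum localinstall from a recipe. This is not the actual cookbook_rpms recipe
from the GitHub link above; the RPM directory and key file names are
illustrative, and it assumes the signed .rpm files plus the public half of
the signing key have already been placed on the node:

    # Illustrative sketch only -- not the cookbook_rpms recipe linked above.
    # The directory and key file paths are placeholders.

    rpm_dir = "/opt/local-rpms"

    # Import the public key matching the one used with `rpm --addsign`,
    # so yum can verify the signatures (with gpgcheck enabled, yum will
    # refuse unsigned packages). Re-importing an already-imported key is
    # harmless.
    execute "import-package-signing-key" do
      command "rpm --import #{rpm_dir}/RPM-GPG-KEY-local"
    end

    # yum localinstall installs the local .rpm files while resolving any
    # dependencies from the configured repositories; when the packages are
    # already installed it simply reports nothing to do.
    execute "localinstall-custom-rpms" do
      command "yum -y localinstall #{rpm_dir}/*.rpm"
    end
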