[chef] Re: Re: Re: Re: Using Chef offline


  • From: Ranjib Dey < >
  • To: < >
  • Subject: [chef] Re: Re: Re: Re: Using Chef offline
  • Date: Tue, 19 May 2015 21:38:58 -0700

Since most recipes install packages etc., which internally invoke network calls (apt, yum, and so on), they will not work offline.
I use server-less bootstrap frequently, in the same manner Noah explained: use berks to vendor the cookbooks, then scp them over and run chef-client in local mode. In some cases I also use fpm to build a Debian package out of the vendored cookbooks, which significantly reduces bootstrapping time.
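
To make that concrete, the local-mode side boils down to a small client.rb on the target node. This is only a sketch, and the paths are placeholders:

# /etc/chef/client.rb on the offline node (paths are just examples)
local_mode true                               # same effect as passing -z / --local-mode
chef_repo_path '/var/chef/repo'               # directory the vendored cookbooks were scp'd into
cookbook_path ['/var/chef/repo/cookbooks']    # e.g. the output of `berks vendor cookbooks`
json_attribs '/var/chef/repo/dna.json'        # optional run_list and node attributes

With that in place, chef-client -c /etc/chef/client.rb runs entirely from the files on disk, with no Chef server involved.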

You should consider either setting up local package repos (yum, Debian, or gem repos, etc.) and pointing your nodes to them, or testing individual cookbooks one by one and reducing their network dependency, i.e. restricting Chef resources to a handful of non-network-bound resources such as cookbook_file, template, file, and service.
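
A rough sketch of what such a network-free recipe can look like (the cookbook, file, and service names here are made up):

# everything below is served from the cookbook itself, so no network access is needed
cookbook_file '/etc/myapp/myapp.conf' do
  source 'myapp.conf'          # shipped in the cookbook's files/ directory
  mode '0644'
end

template '/etc/myapp/settings.yml' do
  source 'settings.yml.erb'    # rendered from the cookbook's templates/ directory
  variables(port: node['myapp']['port'])
  mode '0644'
end

service 'myapp' do             # assumes the package is already present on the image
  action [:enable, :start]
end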

You can use containers to ease some of this testing as well. On Ubuntu, for example, you can stand up an LXC container with no network (network type 'empty') and test your cookbooks inside it.
 

On Tue, May 19, 2015 at 8:21 PM, Noah Kantrowitz < > wrote:
No, that is not something you could do in an automated way. If you change your recipe code to use cookbook_file resources instead then you wouldn't have this issue, but you would have to take care of keeping the file in your cookbook up to date somehow. In general you won't find many community cookbooks designed for this, so you'll probably have to write most of your own. Not being able to do things like install packages makes Chef somewhat less useful, so I wouldn't expect most maintainers to accept patches to this end. A better option is probably a local squid caching proxy or similar that can sit between your offline servers and the internet. Chef supports all the standard HTTP proxy settings, so it's easy to plug in to something like that.
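
For reference, those proxy settings usually just go in client.rb, something like the following, where the squid host and port are placeholders for whatever proxy you run:

# /etc/chef/client.rb -- proxy settings (host and port are examples only)
http_proxy  'http://squid.internal:3128'
https_proxy 'http://squid.internal:3128'
no_proxy    'localhost,127.0.0.1'

remote_file and Chef's other HTTP calls will then go through the proxy; note that apt/yum need their own proxy configuration separately.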

--Noah

On May 19, 2015, at 8:16 PM, Tim Leicy < > wrote:

> This works fine for getting the cookbooks and their dependencies themselves (the recipes, etc.), but I am looking for a way to get the files that are specified inside the cookbooks.
> For example, in the Opscode boost cookbook (https://github.com/opscode-cookbooks/boost/blob/master/recipes/source.rb) there is a call to fetch a remote file:
> remote_file "#{Chef::Config[:file_cache_path]}/#{node['boost']['file']}" do
>   source node['boost']['source'] + node['boost']['file']
>   mode "0644"
>   action :create_if_missing
> end
>
> This makes an HTTP request to SourceForge.
>
> I am looking for a way that these remote files can be 'cached' or downloaded on my internet-connected machines so that I can then transfer the data to my non-internet-connected machines without having to modify anything in the cookbooks themselves. I don't think there is any way for Berkshelf to do this, but I may be wrong.
>
>
> On Tue, May 19, 2015 at 7:21 PM, Noah Kantrowitz < > wrote:
> Use chef-solo or chef-client local mode. Berkshelf can handle downloading the cookbooks and putting them in a format either of those can use (berks vendor).
>
> --Noah
>
> On May 19, 2015, at 7:18 PM, Tim Leicy < > wrote:
>
> > Does anyone know if there is a way to have Chef pre-download cookbook dependencies from remote URLs into a "cache"?
> > I have a lot of computers that have no internet access but would like to use Chef and community cookbooks to manage them. It is a real pain to manually download the remote files and override the file locations in the cookbooks. I have looked around but haven't found a good way to manage this, though maybe I missed something. Any suggestions? If there is currently no capability for this, any suggestions on how I should modify Chef to incorporate this functionality?
> >
> > Thanks!
> >  Tim
>
>




