I often work on my laptop while disconnected from the internet. While I do have access, I use “wget -x” to mirror the files I need. For example, mirroring the Chef RPM creates the directory structure:

    +- el
       +- 6
          +- x86_64
             +- chef-12.3.0-1.el6.x86_64.rpm

Apache on my laptop has the following configuration:
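(A sketch rather than the verbatim directives: the Listen port matches the http_proxy setting below, the mirror root is /var/mirror, and mod_rewrite plus Apache 2.4 access-control syntax are assumed.)

    Listen 1080
    <VirtualHost *:1080>
        DocumentRoot /var/mirror
        <Directory /var/mirror>
            Require all granted
        </Directory>

        RewriteEngine On
        # The guests send the full origin URL through the proxy, so %{HTTP_HOST}
        # is the host they asked for, e.g. opscode-omnibus-packages.s3.amazonaws.com.
        # No mod_proxy is needed; the request is answered from this vhost.

        # A host that has not been mirrored at all -> 502 (Bad Gateway).
        RewriteCond /var/mirror/%{HTTP_HOST} !-d
        RewriteRule .* - [R=502,L]

        # Otherwise serve the file from the per-host mirror tree; a path that
        # has not been mirrored falls through to an ordinary 404.
        RewriteRule ^/?(.*)$ /var/mirror/%{HTTP_HOST}/$1 [L]
    </VirtualHost>

The mirror is populated with wget; a command along these lines, run from /var/mirror, produces the tree shown above (the -x/--force-directories flag also creates the top-level directory named after the origin host, which is what the rewrite rules look for):

    cd /var/mirror
    wget -x http://opscode-omnibus-packages.s3.amazonaws.com/el/6/x86_64/chef-12.3.0-1.el6.x86_64.rpm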
On my vagrant boxes, in /etc/chef/client.rb I set:

    http_proxy 'http://10.0.2.2:1080'

When chef runs, all “external” http requests are routed to and serviced by apache running on my laptop. When the request for http://opscode-omnibus-packages.s3.amazonaws.com/el/6/x86_64/chef-12.3.0-1.el6.x86_64.rpm comes in, apache checks for the existence of /var/mirror/opscode-omnibus-packages.s3.amazonaws.com. If that does not exist, a 502 (Bad Gateway) is returned. Otherwise apache tries to serve /var/mirror/opscode-omnibus-packages.s3.amazonaws.com/el/6/x86_64/chef-12.3.0-1.el6.x86_64.rpm. If that path does not exist, a 404 is returned. For my own benefit, I like to differentiate between a host I have not mirrored (502) and a path that has not been mirrored (404).

As long as the requests are http, this scheme works nicely. For https endpoints, I mirror them with wget as well, but I use chef-rewind to change the protocol to http (a sketch follows below).

I’ve used this method for RPMs, jenkins plugins, gems, and github release tarballs. This method should work within an isolated environment as well.
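For the chef-rewind step, a wrapper recipe looks something like this (a sketch; the cookbook, resource name, and URLs are placeholders):

    # Wrapper recipe -- placeholder names throughout.
    chef_gem 'chef-rewind' do
      compile_time true
    end
    require 'chef/rewind'

    include_recipe 'widget::default'

    # widget::default declared this remote_file with an https source;
    # rewind it to plain http so the request is served from the local mirror.
    rewind 'remote_file[/usr/local/src/widget-1.2.3.tar.gz]' do
      source 'http://github.com/example/widget/archive/v1.2.3.tar.gz'
    end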
">
> wrote:
|