I'm in the process of migrating from a Perl Template Toolkit and MySQL
based config management system to Chef.
We have too much to migrate to be able to do it all in one go, so I'd
like a way to push data from one system into the other.
One way I've come up with is a quick recipe that does this:
# Run the legacy templater and parse the JSON it prints to stdout.
data_bag_raw = Chef::JSONCompat.from_json(`perl ~syncmaster/synchronizer/templater.pl -r -t chefdatabag.json`)

# Load the existing item for this domain, swap in the fresh data, and save it
# back to the Chef server (the generated JSON has to keep the item's "id").
sam_config_item = data_bag_item("sam-config", node["domain"])
sam_config_item.raw_data = data_bag_raw
sam_config_item.save
The Template Toolkit script generates chefdatabag.json on the fly from
the MySQL database; that works out to about 15 stored procedure calls
for my test stack, but close to 2000 calls for the production stack.
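For context, the parsed result is just a hash whose "id" is the node's
domain; hypothetically it looks something like this (field names
invented for illustration):

data_bag_raw = {
  "id"          => "example.com",
  "smtp_relay"  => "mail.example.com",
  "ntp_servers" => ["ntp1.example.com", "ntp2.example.com"]
}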
I'm okay with this, but I'm wondering...
Is there a better way?
I don't want every node in the stacks pulling from the MySQL database
directly; I want to push the data into a data bag.
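One variation I've considered (just a sketch, reusing the bag and file
names from above) is doing the push with knife from the one host that
already talks to MySQL, rather than from inside a recipe:

knife data bag from file sam-config chefdatabag.json

That keeps the MySQL access in one place, and the nodes only ever read
the bag.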
Should I be using an execute resource instead of backticks to call out
to my template library?
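As far as I can tell an execute resource wouldn't hand the command's
stdout back to the recipe, so what I'm really weighing against the
backticks is something like Mixlib::ShellOut (which ships with Chef).
Rough sketch, same command as above:

require "mixlib/shellout"

# Run the templater, and fail loudly if the Perl script exits non-zero
# instead of blindly parsing whatever the backticks would have returned.
templater = Mixlib::ShellOut.new("perl ~syncmaster/synchronizer/templater.pl -r -t chefdatabag.json")
templater.run_command
templater.error!

data_bag_raw = Chef::JSONCompat.from_json(templater.stdout)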
Just looking for random ideas here...
Thanks!
-Jesse