Chef MongoDB Replication with sc-mongodb

I am new to Chef and I'm using sc-mongodb, but I can't get replication to work. Is there a better way of doing MongoDB replication with Chef?
I was able to get the default recipe working:
include_recipe "sc-mongodb::default"
But when I tried to set up replication for Mongo, I started getting errors.
include_recipe "sc-mongodb::replicaset"
Error:
================================================================================
Recipe Compile Error in /tmp/kitchen/cache/cookbooks/c_mongo/recipes/default.rb
================================================================================
Net::HTTPServerException
------------------------
400 "Bad Request"
Cookbook Trace:
---------------
/tmp/kitchen/cache/cookbooks/sc-mongodb/definitions/mongodb.rb:236:in `block in from_file'
/tmp/kitchen/cache/cookbooks/sc-mongodb/recipes/replicaset.rb:36:in `from_file'
/tmp/kitchen/cache/cookbooks/c_mongo/recipes/default.rb:54:in `from_file'
Relevant File Content:
----------------------
/tmp/kitchen/cache/cookbooks/sc-mongodb/definitions/mongodb.rb:
229: notifies :run, 'ruby_block[config_sharding]', :immediately if new_resource.is_mongos && new_resource.auto_configure_sharding
230: # we don't care about a running mongodb service in these cases, all we need is stopping it
231: ignore_failure true if new_resource.name == 'mongodb'
232: end
233:
234: # replicaset
235: if new_resource.is_replicaset && new_resource.auto_configure_replicaset
236>> rs_nodes = search(
237: :node,
238: "mongodb_cluster_name:#{new_resource.cluster_name} AND "\
239: 'mongodb_is_replicaset:true AND '\
240: "mongodb_config_mongod_replication_replSetName:#{new_resource.replicaset_name} AND "\
241: "chef_environment:#{node.chef_environment}"
242: )
243:
244: ruby_block 'config_replicaset' do
245: block do
System Info:
------------
chef_version=13.8.5
platform=centos
platform_version=7.4.1708
ruby=ruby 2.4.3p205 (2017-12-14 revision 61247) [x86_64-linux]
program_name=chef-client worker: ppid=28997;start=00:31:33;
executable=/opt/chef/bin/chef-client
Running handlers:
[2018-03-27T00:31:35+00:00] ERROR: Running exception handlers
Running handlers complete
[2018-03-27T00:31:35+00:00] ERROR: Exception handlers complete
Chef Client failed. 0 resources updated in 01 seconds
[2018-03-27T00:31:35+00:00] FATAL: Stacktrace dumped to /tmp/kitchen/cache/chef-stacktrace.out
[2018-03-27T00:31:35+00:00] FATAL: Please provide the contents of the stacktrace.out file if you file a bug report
[2018-03-27T00:31:35+00:00] ERROR: 400 "Bad Request"
[2018-03-27T00:31:35+00:00] FATAL: Chef::Exceptions::ChildConvergeError: Chef run process exited unsuccessfully (exit code 1)
I have tried many ways to resolve this, working through the issues on the GitHub repository. From the errors, it looks like the attributes aren't getting set, so people set them manually:
# attempt1------------
#node.default['mongodb']['config']['replSet'] = true
#node.default[:mongodb][:cluster_name] = "repl-name"
#include_recipe "sc-mongodb::replicaset"
# attempt2----------
#node.normal['mongodb']['install_method'] = 'mongodb-org'
#node.normal['mongodb']['config']['bind_ip'] = '0.0.0.0'
#node.normal['mongodb']['dbconfig_file'] = '/etc/mongod.conf'
#node.normal['mongodb']['config']['replSet'] = true
#node.normal['mongodb']['is_replicaset'] = true
#node.normal['mongodb']['cluster_name'] = 'scribe'
#node.normal['mongodb']['replSet'] = 'scribe'
#node.normal['mongodb']['is_shard'] = false
#include_recipe "sc-mongodb::replicaset"
#attempt3------------
#node.default[:mongodb][:cluster_name] = "cluster_name"
#include_recipe "sc-mongodb::replicaset"
#attempt4------------
#if node['mongodb']['config']['replSet'].nil?
# node.default['mongodb']['config']['replSet'] = "repl-name"
#end
#include_recipe "sc-mongodb::replicaset"
#attempt5-------------
#https://github.com/sous-chefs/mongodb/issues/167
#node.default['mongodb']['config']['mongod']['replication']['replSetName'] = "rs-name"
#include_recipe "sc-mongodb::replicaset"
This one gives me a different error:
#attempt6-----------
node.default['mongodb']['config']['mongod']['replication']['replSetName']= 'rs_default'
node.default['mongodb']['cluster_name'] = 'cluster'
node.default['mongodb']['auto_configure']['replicaset'] = true
include_recipe "sc-mongodb::replicaset"
Stacktrace:
================================================================================
Error executing action `run` on resource 'ruby_block[config_replicaset]'
================================================================================
NoMethodError
-------------
undefined method `[]' for nil:NilClass
Cookbook Trace:
---------------
/tmp/kitchen/cache/cookbooks/sc-mongodb/libraries/mongodb.rb:74:in `configure_replicaset'
/tmp/kitchen/cache/cookbooks/sc-mongodb/definitions/mongodb.rb:246:in `block (3 levels) in from_file'
Resource Declaration:
---------------------
# In /tmp/kitchen/cache/cookbooks/sc-mongodb/definitions/mongodb.rb
244: ruby_block 'config_replicaset' do
245: block do
246: MongoDB.configure_replicaset(node, replicaset_name, rs_nodes) unless new_resource.replicaset.nil?
247: end
248: action :nothing
249: end
250:
Compiled Resource:
------------------
# Declared in /tmp/kitchen/cache/cookbooks/sc-mongodb/definitions/mongodb.rb:244:in `block in from_file'
ruby_block("config_replicaset") do
params {:mongodb_type=>"mongod", :action=>[:enable, :start], :logpath=>"/var/log/mongodb/mongod.log", :configservers=>[], :replicaset=>true, :notifies=>[], :not_if=>[], :name=>"mongod"}
action [:nothing]
retries 0
retry_delay 2
default_guard_interpreter :default
block_name "config_replicaset"
declared_type :ruby_block
cookbook_name "sc-mongodb"
recipe_name "replicaset"
block #<Proc:0x00000003ebdec8#/tmp/kitchen/cache/cookbooks/sc-mongodb/definitions/mongodb.rb:245>
end
Platform:
---------
x86_64-linux

I've had a lot of trouble with this cookbook; you're not alone.
From what I've gathered, you need to run this cookbook multiple times and/or in different configurations, depending on what you are trying to achieve and what state your node is in. For example, I believe the auto_configure attribute should only be set on the last node in the set, after the other members have been converged with it set to false. Similarly for the user recipe: MongoDB only allows admin collection operations on the primary, so you should make sure that recipe is executed on the designated primary node.
Unfortunately the documentation is not clear, and for someone like me who is new to Chef and Ruby, the source and the errors are tricky to interpret. I am still figuring this cookbook out and can report back once I have something concrete. Have you been able to get this working since your post? Sorry I can't be of more help; you will have to try configurations out with test-kitchen VMs.
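For what it's worth, here is a minimal sketch of the wrapper-recipe setup described above. The attribute names come from the question's own attempt 6 and the cookbook's attribute files, but the idea that the 400 stems from the search query being built before the cluster/replica-set attributes are set is my assumption, and the is_last_rs_member flag is purely hypothetical, standing in for however you mark the final node:
# Wrapper recipe sketch (verify the attribute names against your version of sc-mongodb).
node.default['mongodb']['cluster_name'] = 'my_cluster'
node.default['mongodb']['config']['mongod']['replication']['replSetName'] = 'rs0'

# Keep auto-configuration off on every member except the last one you converge;
# 'is_last_rs_member' is a made-up attribute, not part of the cookbook.
node.default['mongodb']['auto_configure']['replicaset'] = node['is_last_rs_member'] ? true : false

include_recipe 'sc-mongodb::replicaset'
Note also that the search() call needs other registered nodes to find; under Test Kitchen that usually means faking node data (for example with the kitchen-nodes plugin), otherwise the query has nothing to match.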

Related

Error executing action `create` on resource 'cookbook_file[/etc/motd]'

I am getting an error while executing action 'create' on a resource. I am running the recipe in --local-mode; not sure if that is the problem. I just want to run it locally first instead of running it on a node.
Pasting my recipe and the output here:
[2019-04-09T19:25:51+00:00] WARN: Node rheaj has an empty run list.
Converging 4 resources
Recipe: #recipe_files::/home/rheaj/chef/cookbooks/oc_jumpbox/recipes/default.rb
* cookbook_file[/etc/motd] action create[2019-04-09T19:25:51+00:00] INFO: Processing cookbook_file[/etc/motd] action create (#recipe_files::/home/rheaj/chef/cookbooks/oc_jumpbox/recipes/default.rb line 9)
================================================================================
Error executing action `create` on resource 'cookbook_file[/etc/motd]'
================================================================================
Chef::Exceptions::CookbookNotFound
----------------------------------
Cookbook #recipe_files not found. If you're loading #recipe_files from another cookbook, make sure you configure the dependency in your metadata
Resource Declaration:
---------------------
# In /home/rheaj/chef/cookbooks/oc_jumpbox/recipes/default.rb
9: cookbook_file '/etc/motd' do
10: group 'root'
11: user 'root'
12: mode '0644'
13: source 'motd'
14: end
15:
The default location for cookbook_file sources is files/default inside the cookbook. The error means that your source file is not there. Create files/default/motd in your cookbook and re-run chef-client.
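For reference, the expected layout (assuming the cookbook is named oc_jumpbox, as in the log) would look roughly like this:
oc_jumpbox/
├── metadata.rb
├── recipes/
│   └── default.rb
└── files/
    └── default/
        └── motd
Part of the problem may also be how the recipe is being run: chef-client --local-mode path/to/default.rb loads it as the pseudo-cookbook #recipe_files (which is what the CookbookNotFound error is complaining about), so cookbook_file has no cookbook to look in. Running it from the chef repo as chef-client -z -o 'recipe[oc_jumpbox]' keeps the cookbook context intact.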

Trying to create a sample Linux image with the Yocto Project but getting a build error

I tried to create a Linux image by following the Yocto Project Mega Manual, but I got an error at the image build step.
I'm using Ubuntu 18.04.1 LTS.
Error:
aju#aju-HP-15-Notebook-PC:~/poky/build$ bitbake core-image-sato
WARNING: Host distribution "Ubuntu-18.04" has not been validated with this version of the build system; you may possibly experience
unexpected failures. It is recommended that you use a tested
distribution.
Parsing recipes: 100% |#########################################| Time: 00:00:49
Parsing of 899 .bb files complete (0 cached, 899 parsed). 1330 targets, 38 skipped, 0 masked, 0 errors.
NOTE: Resolving any missing task queue dependencies
Build Configuration:
BB_VERSION        = "1.28.0"
BUILD_SYS         = "x86_64-linux"
NATIVELSBSTRING   = "Ubuntu-18.04"
TARGET_SYS        = "i586-poky-linux"
MACHINE           = "qemux86"
DISTRO            = "poky"
DISTRO_VERSION    = "2.0.3"
TUNE_FEATURES     = "m32 i586"
TARGET_FPU        = ""
meta
meta-yocto
meta-yocto-bsp    = "jethro:331275422b2c3f326f605c23ae89eedb4e222eb5"
NOTE: Preparing RunQueue
NOTE: Executing SetScene Tasks
NOTE: Executing RunQueue Tasks
ERROR: oe_runmake failed
ERROR: Function failed: do_compile (log file is located at /home/aju/poky/build/tmp/work/x86_64-linux/automake-native/1.15-r0/temp/log.do_compile.301)
ERROR: Logfile of failure stored in: /home/aju/poky/build/tmp/work/x86_64-linux/automake-native/1.15-r0/temp/log.do_compile.301
Log data follows:
| DEBUG: Executing shell function do_compile
| NOTE: make -j 4
| : && /bin/mkdir -p doc && { PATH='/home/aju/poky/build/tmp/work/x86_64-linux/automake-native/1.15-r0/build/t/wrap:'$PATH && export PATH; } && /usr/bin/perl /home/aju/poky/build/tmp/work/x86_64-linux/automake-native/1.15-r0/automake-1.15/doc/help2man --output=doc/automake-1.15.1 automake-1.15
| help2man: can't get `--help' info from automake-1.15
| Try `--no-discard-stderr' if option outputs to stderr
| Makefile:3687: recipe for target 'doc/automake-1.15.1' failed
| make: *** [doc/automake-1.15.1] Error 255
| WARNING: exit code 1 from a shell command.
| ERROR: oe_runmake failed
| ERROR: Function failed: do_compile (log file is located at /home/aju/poky/build/tmp/work/x86_64-linux/automake-native/1.15-r0/temp/log.do_compile.301)
ERROR: Task 403 (virtual:native:/home/aju/poky/meta/recipes-devtools/automake/automake_1.15.bb, do_compile) failed with exit code '1'
NOTE: Tasks Summary: Attempted 73 tasks of which 53 didn't need to be rerun and 1 failed.
Waiting for 0 running tasks to finish:
Summary: 1 task failed:
  virtual:native:/home/aju/poky/meta/recipes-devtools/automake/automake_1.15.bb, do_compile
Summary: There was 1 WARNING message shown.
Summary: There were 2 ERROR messages shown, returning a non-zero exit code.
Is the problem with this version, or is it something else?
Why are you using such an old release of Yocto? 2.0.x was first released in 2015 and isn't supported on modern distributions. If you need to stay on 2.0.x you can backport a patch from a recent release to fix the automake/help2man failure, but I really do recommend using 2.5 (or 2.6, due for release any day now) instead.
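If switching releases is an option, one way to move a plain poky checkout to 2.5 (sumo) looks roughly like this; any custom layers and local.conf changes would of course need to be carried over and re-validated:
git clone -b sumo git://git.yoctoproject.org/poky
cd poky
source oe-init-build-env
bitbake core-image-sato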

TAP results do not show up during execution time, if a TestClassSetup is present

I have a problem. We use the MATLAB testing framework to analyze our codebase, and to track the results in our CI system (TeamCity) we use the TAP format. Here we run into the following problem:
If a test includes a TestClassSetup section, the TAP results show up only at the end, not during execution. This causes a few issues for us:
Timestamps created by the CI system might not be correct
If informative output is produced within a test case, it is not shown together with the assertion statement.
We use the following (simplified) snippet to build our TestSuite and execute it:
testSuite = matlab.unittest.TestSuite.fromFolder('.');
runner = matlab.unittest.TestRunner.withNoPlugins();
runner.addPlugin(matlab.unittest.plugins.TAPPlugin.producingOriginalFormat());
results = runner.run(testSuite);
With the following two classes the issue is reproducible (the content is of course made up & meaningless...):
classdef SomeTest < matlab.unittest.TestCase
properties (TestParameter)
param = {1, 2};
param2 = {1, 2};
end
methods (TestClassSetup)
function someSetup(testCase)
pause(0.1);
end
end
methods (Test)
function testMethod(self, param, param2)
fprintf('I''m here, with the params: %f/%f\n', param, param2);
pause(0.1);
self.assertGreaterThan(param, param2);
end
end
end
classdef SomeOtherTest < matlab.unittest.TestCase
properties (TestParameter)
param = {1, 2};
param2 = {1, 2};
end
methods (Test)
function testMethod(self, param, param2)
fprintf('I''m here, with the params: %f/%f\n', param, param2);
pause(0.1);
self.assertGreaterThan(param, param2);
end
end
end
If you copy the two test classes and the runner snippet into one folder and execute the runner, you'll see the following output (assertion messages are simplified):
1..8
I'm here, with the params: 1.000000/1.000000
not ok 1 - SomeOtherTest/testMethod(param=1,param2=1)
# ================================================================================
# Assertion failed in SomeOtherTest/testMethod(param=1,param2=1) and it did not run to completion.
# ================================================================================
#
I'm here, with the params: 1.000000/2.000000
not ok 2 - SomeOtherTest/testMethod(param=1,param2=2)
# ================================================================================
# Assertion failed in SomeOtherTest/testMethod(param=1,param2=2) and it did not run to completion.
# ================================================================================
#
I'm here, with the params: 2.000000/1.000000
ok 3 - SomeOtherTest/testMethod(param=2,param2=1)
I'm here, with the params: 2.000000/2.000000
not ok 4 - SomeOtherTest/testMethod(param=2,param2=2)
# ================================================================================
# Assertion failed in SomeOtherTest/testMethod(param=2,param2=2) and it did not run to completion.
# ================================================================================
#
I'm here, with the params: 1.000000/1.000000
I'm here, with the params: 1.000000/2.000000
I'm here, with the params: 2.000000/1.000000
I'm here, with the params: 2.000000/2.000000
not ok 5 - SomeTest/testMethod(param=1,param2=1)
# ================================================================================
# Assertion failed in SomeTest/testMethod(param=1,param2=1) and it did not run to completion.
# ================================================================================
#
not ok 6 - SomeTest/testMethod(param=1,param2=2)
# ================================================================================
# Assertion failed in SomeTest/testMethod(param=1,param2=2) and it did not run to completion.
# ================================================================================
#
ok 7 - SomeTest/testMethod(param=2,param2=1)
not ok 8 - SomeTest/testMethod(param=2,param2=2)
# ================================================================================
# Assertion failed in SomeTest/testMethod(param=2,param2=2) and it did not run to completion.
# ================================================================================
What I would expect is that in the second case, too, the assertion output (and the ok / not ok TAP lines) is interleaved with the fprintf output.
Does anyone have an idea?
The reason the presence of TestClassSetup "defers" the printing of the TAP output is that TAP is a streaming format, and as long as there is any TestClassSetup code the framework does not yet know whether the tests will pass. For example, if you have a failure in TestClassTeardown (or via an addTeardown call registered in TestClassSetup), the end result is that all of the tests that shared that TestClassSetup code fail.
However, since it is a streaming format, the TAPPlugin wants to print each result as soon as it is known. There is actually a TestRunnerPlugin method designed specifically for this case, the reportFinalizedResult method.
The more fundamental issue is the printing itself: I would recommend avoiding disp or fprintf for this kind of output. It is less than ideal because plugins have no insight into anything printed with fprintf, and you can't redirect that information anywhere other than the MATLAB command line.
If you instead use the testCase.log method, the diagnostics end up in the right place and the mechanism is more flexible. You can log at different levels, so you can turn the output on or off as you please, and it goes not only to the command line but also, much more cleanly, into the TAP stream, the JUnit XML, and the PDF/HTML test reports. For your case it looks like the following:
runner = matlab.unittest.TestRunner.withNoPlugins();
runner.addPlugin(matlab.unittest.plugins.TAPPlugin.producingOriginalFormat());
results = runner.run(testSuite);
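Inside the test methods, the fprintf calls are then replaced with log calls; a minimal sketch (using level 3, i.e. Detailed, to match the verbosity discussed next) could look like this:
function testMethod(self, param, param2)
    % log at verbosity level 3 (Detailed) instead of printing to the command window
    self.log(3, sprintf('I''m here, with the params: %f/%f', param, param2));
    pause(0.1);
    self.assertGreaterThan(param, param2);
end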
At first you don't see any of the log calls, because they were logged at verbosity 3 and the default level is lower (level 1, I believe):
1..8
not ok 1 - SomeOtherTest/testMethod(param=value1,param2=value1)
# ================================================================================
# Assertion failed in SomeOtherTest/testMethod(param=value1,param2=value1) and it did not run to completion.
# ================================================================================
not ok 2 - SomeOtherTest/testMethod(param=value1,param2=value2)
# ================================================================================
# Assertion failed in SomeOtherTest/testMethod(param=value1,param2=value2) and it did not run to completion.
# ================================================================================
ok 3 - SomeOtherTest/testMethod(param=value2,param2=value1)
not ok 4 - SomeOtherTest/testMethod(param=value2,param2=value2)
# ================================================================================
# Assertion failed in SomeOtherTest/testMethod(param=value2,param2=value2) and it did not run to completion.
# ================================================================================
not ok 5 - SomeTest/testMethod(param=value1,param2=value1)
# ================================================================================
# Assertion failed in SomeTest/testMethod(param=value1,param2=value1) and it did not run to completion.
# ================================================================================
not ok 6 - SomeTest/testMethod(param=value1,param2=value2)
# ================================================================================
# Assertion failed in SomeTest/testMethod(param=value1,param2=value2) and it did not run to completion.
# ================================================================================
ok 7 - SomeTest/testMethod(param=value2,param2=value1)
not ok 8 - SomeTest/testMethod(param=value2,param2=value2)
# ================================================================================
# Assertion failed in SomeTest/testMethod(param=value2,param2=value2) and it did not run to completion.
# ================================================================================
However, if you configure the TAP plugin (or the version 13 TAP plugin, the report plugin, etc.) to log at level three, then you see these diagnostics, and they appear at the expected location as well:
runner = matlab.unittest.TestRunner.withNoPlugins();
runner.addPlugin(matlab.unittest.plugins.TAPPlugin.producingOriginalFormat('Verbosity', 3));
results = runner.run(testSuite);
Now you see the output. Also try TAP version 13; the structured output it provides might give an even better result.
1..8
not ok 1 - SomeOtherTest/testMethod(param=value1,param2=value1)
# ================================================================================
# [Detailed] Diagnostic logged (2018-08-09 16:47:18): I'm here, with the params: 1.000000/1.000000
# ================================================================================
# ================================================================================
# Assertion failed in SomeOtherTest/testMethod(param=value1,param2=value1) and it did not run to completion.
# ================================================================================
not ok 2 - SomeOtherTest/testMethod(param=value1,param2=value2)
# ================================================================================
# [Detailed] Diagnostic logged (2018-08-09 16:47:19): I'm here, with the params: 1.000000/2.000000
# ================================================================================
# ================================================================================
# Assertion failed in SomeOtherTest/testMethod(param=value1,param2=value2) and it did not run to completion.
# ================================================================================
ok 3 - SomeOtherTest/testMethod(param=value2,param2=value1)
# ================================================================================
# [Detailed] Diagnostic logged (2018-08-09 16:47:19): I'm here, with the params: 2.000000/1.000000
# ================================================================================
not ok 4 - SomeOtherTest/testMethod(param=value2,param2=value2)
# ================================================================================
# [Detailed] Diagnostic logged (2018-08-09 16:47:19): I'm here, with the params: 2.000000/2.000000
# ================================================================================
# ================================================================================
# Assertion failed in SomeOtherTest/testMethod(param=value2,param2=value2) and it did not run to completion.
# ================================================================================
not ok 5 - SomeTest/testMethod(param=value1,param2=value1)
# ================================================================================
# [Detailed] Diagnostic logged (2018-08-09 16:47:19): I'm here, with the params: 1.000000/1.000000
# ================================================================================
# ================================================================================
# Assertion failed in SomeTest/testMethod(param=value1,param2=value1) and it did not run to completion.
# ================================================================================
not ok 6 - SomeTest/testMethod(param=value1,param2=value2)
# ================================================================================
# [Detailed] Diagnostic logged (2018-08-09 16:47:19): I'm here, with the params: 1.000000/2.000000
# ================================================================================
# ================================================================================
# Assertion failed in SomeTest/testMethod(param=value1,param2=value2) and it did not run to completion.
# ================================================================================
ok 7 - SomeTest/testMethod(param=value2,param2=value1)
# ================================================================================
# [Detailed] Diagnostic logged (2018-08-09 16:47:20): I'm here, with the params: 2.000000/1.000000
# ================================================================================
not ok 8 - SomeTest/testMethod(param=value2,param2=value2)
# ================================================================================
# [Detailed] Diagnostic logged (2018-08-09 16:47:20): I'm here, with the params: 2.000000/2.000000
# ================================================================================
# ================================================================================
# Assertion failed in SomeTest/testMethod(param=value2,param2=value2) and it did not run to completion.
# ================================================================================
Hope that helps!

http 403 forbidden on deployment

We are getting HTTP 403 Forbidden when trying to create a file on the VM. We are using the following Chef code to do the job.
X_FILES = %w{x-log4j.properties x-override.properties x-core.properties x.conf}
X_FILES.each do |file|
  template "/etc/b2b/x/#{file}" do
    mode 0644
    source "#{file}.erb"
    owner 'root'
    action :create
    notifies :restart, "service[b2b-x]"
    only_if { File.exist?("/etc/b2b/x") }
    helpers MongoUtil
    helpers TenantIdUtil
  end
end
We have multiple VMs and the error is intermittent; on re-running the deployment job, the process completes fine.
The error log is as follows:
[2015-10-01T01:40:03+05:30] INFO: Processing template[/etc/b2b/x/x-log4j.properties] action create (b2b::install_configure_x line 89)
[2015-10-01T01:40:04+05:30] INFO: HTTP Request Returned 403 Forbidden:
================================================================================
Error executing action `create` on resource 'template[/etc/b2b/x/x-log4j.properties]'
================================================================================
Net::HTTPServerException
------------------------
403 "Forbidden"
Resource Declaration:
---------------------
# In /var/chef/cache/cookbooks/b2b/recipes/install_configure_x.rb
89: template "/etc/b2b/x/#{file}" do
90: mode 0644
91: source "#{file}.erb"
92: owner 'root'
93: action :create
94: notifies :restart, "service[b2b-x]"
95: only_if { File.exist?("/etc/b2b/x") }
96: helpers MongoUtil
97: helpers TenantIdUtil
98: end
99: end
Compiled Resource:
------------------
# Declared in /var/chef/cache/cookbooks/b2b/recipes/install_configure_x.rb:89:in `block in from_file'
template("/etc/b2b/x/x-log4j.properties") do
provider Chef::Provider::Template
action [:create]
retries 0
retry_delay 2
guard_interpreter :default
path "/etc/b2b/x/x-log4j.properties"
backup 5
atomic_update true
source "x-log4j.properties.erb"
helper_modules [MongoUtil, TenantIdUtil]
cookbook_name "b2b"
recipe_name "install_configure_x"
mode 420
owner "root"
only_if { #code block }
end
Any help in this regard is greatly appreciated. Thanks in advance.
In case someone else comes across this, as I did: this is most likely caused by the authentication token expiring during a very long Chef run.
Here's a page that details solutions to that problem:
https://discourse.chef.io/t/seeing-a-weird-403-error-while-running-recipe/6149/2
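One mitigation that often comes up for very long runs is to disable lazy loading of cookbook files, so everything is fetched at the start of the run before any signed URLs or tokens can expire. Treat this as a sketch to verify against the thread above rather than a guaranteed fix:
# /etc/chef/client.rb
no_lazy_load true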

Chef install of readline package fails on Ubuntu 14.04

I am trying to get PostgreSQL (server) installed on an Ubuntu node using Chef:
Role definition (roles/base_server.rb):
run_list(
  "recipe[apt]",
  "recipe[postgres::server]"
)
default_attributes(
  postgresql: {
    version: "9.3.4",
    config: {
      shared_buffers_mb: "12000"
    }
  }
)
Setup
System: Ubuntu 14.04 LTS (GNU/Linux 3.13.0-24-generic x86_64)
Chef-Version: 11.14.6
Postgres Cookbook: 3.4.1 (https://github.com/hw-cookbooks/postgresql)
Running the bootstrap command
knife bootstrap IPADDRESS -x USER -r 'role[base_server]' --sudo
Results in the following error:
* package[readline] action install
* No version specified, and no candidate version available for readline
================================================================================
Error executing action `install` on resource 'package[readline]'
================================================================================
Chef::Exceptions::Package
-------------------------
No version specified, and no candidate version available for readline
Resource Declaration:
---------------------
# In /var/chef/cache/cookbooks/postgres/recipes/build.rb
29: package package_name do
30: action :install
31: end
32: end
Compiled Resource:
------------------
# Declared in /var/chef/cache/cookbooks/postgres/recipes/build.rb:29:in `block in from_file'
package("readline") do
action [:install]
retries 0
retry_delay 2
guard_interpreter :default
package_name "readline"
timeout 900
cookbook_name "postgres"
recipe_name "build"
end
I already tried to fix this by installing the readline libraries manually, but with no success. Does anyone have an idea how to solve this?
As mentioned by StephenKing, the issue isn't with Chef; it is that a package named 'readline' really doesn't exist on Ubuntu. You'll need to fix your recipe code to use the correct package name.
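As a sketch of what that might look like: the development package name differs by platform family, and the names below are the stock distro packages, not anything defined by the postgresql cookbook itself:
# Pick the readline development package for the current platform family.
readline_pkg = value_for_platform_family(
  'debian' => 'libreadline6-dev', # 'libreadline-dev' on newer Debian/Ubuntu releases
  'rhel'   => 'readline-devel'
)

package readline_pkg do
  action :install
end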