I have created the parameters below, which are supposed to be passed to the CloudFormation client when calling the create-stack command for an SNS stack.
pubSNSCFParameters = []
pubSNSCFParameters.append("{'ParameterKey': 'Environment','ParameterValue':'" + Constants.Env + "'}")
pubSNSCFParameters.append("{'ParameterKey':'pDisplayName','ParameterValue':'" + SNSStackName + "'}")
pubSNSCFParameters.append("{'ParameterKey':'pTopicName','ParameterValue':'" + SNSStackName + "'}")
which gives output like the following:
["{'ParameterKey': 'Environment', 'ParameterValue': 'dev'}", u"{'ParameterKey': 'pDisplayName', 'ParameterValue': 'some-big-value'}", u"{'ParameterKey': 'pTopicName', 'ParameterValue': 'asome-big-value'}"]
Now when I run my boto3 client to create the SNS stack, I get:
botocore.exceptions.ParamValidationError: Parameter validation failed:
Invalid type for parameter Parameters[0], value: {'ParameterKey': 'Environment', 'ParameterValue': 'dev'}, type: <type 'str'>, valid types: <type 'dict'>
code snippet:
with open(templatelocation + 'CFT_SNS.json', 'r') as f:
    client.create_stack(
        StackName=stackName,
        TemplateBody=f.read(),
        Parameters=pubSNSCFParameters,
        Capabilities=['CAPABILITY_NAMED_IAM'],
        Tags=[
            {
                'Key': 'CreatorName',
                'Value': 'some#email.com'
            },
        ]
    )
I imagine this has to do with the data types of the parameters, so how can I fix it?
Your parameters are strings:
"{'ParameterKey': 'Environment', 'ParameterValue': 'dev'}" <-- note the quotation marks at the beginning and end.
This is because you are appending strings to pubSNSCFParameters:
pubSNSCFParameters.append("{'ParameterKey': 'Environment','ParameterValue':'" + Constants.Env + "'}")
It should be a dict:
pubSNSCFParameters.append({'ParameterKey': 'Environment','ParameterValue': Constants.Env})
Assuming Constants.Env is a string.
This is my API Blueprint file:
FORMAT: 1A

# system API

Example

## Clients [/clients]

### Add Client [POST]

+ Request
    + Attributes
        + Include (ClientDetails)

+ Response 200 (application/json)

# Data Structures

## Client (object)
+ name (string, required)
+ phone (string, required) - `E.164 format validated with the regex ^\+[1-9]\d{1,14}$`
+ tax_id (number)
## ClientDetails (object)
+ One Of
    + client (object)
        + Include (Client)
    + save_client: true (boolean)
It renders this Body as an example, which is what I expect:
"client": {
"name": "John Smith",
"phone": "+972504537442",
"tax_id": 515975597,
"save_client": true
}
However, it renders the attributes section like this:
I want to use a function to read input file paths from a dataframe and send them to my snakemake rule. I also have a helper function to select the remote from which to pull the files.
from snakemake.remote.GS import RemoteProvider as GSRemoteProvider
from snakemake.remote.SFTP import RemoteProvider as SFTPRemoteProvider
from os.path import join
import pandas as pd

configfile: "config.yaml"

units = pd.read_csv(config["units"]).set_index(["library", "unit"], drop=False)
TMP = join('data', 'tmp')


def access_remote(local_path):
    """Connects to remote as defined in config file"""
    provider = config['provider']
    if provider == 'GS':
        GS = GSRemoteProvider()
        remote_path = GS.remote(join("gs://" + config['bucket'], local_path))
    elif provider == 'SFTP':
        SFTP = SFTPRemoteProvider(
            username=config['user'],
            private_key=config['ssh_key']
        )
        remote_path = SFTP.remote(
            config['host'] + ":22" + join(base_path, local_path)
        )
    else:
        remote_path = local_path
    return remote_path


def get_fastqs(wc):
    """
    Get fastq files (units) of a particular library - sample
    combination from the unit sheet.
    """
    fqs = units.loc[
        (units.library == wc.library) &
        (units.libtype == wc.libtype),
        "fq1"
    ]
    return {
        "r1": list(map(access_remote, fqs.values)),
    }


# Combine all fastq files from the same sample / library type combination
rule combine_units:
    input: unpack(get_fastqs)
    output:
        r1 = join(TMP, "reads", "{library}_{libtype}.end1.fq.gz")
    threads: 12
    run:
        shell("cat {i1} > {o1}".format(i1=input['r1'], o1=output['r1']))
My config file contains the bucket name and provider, which are passed to the function. This works as expected when simply running snakemake.
However, I would like to use the kubernetes integration, which requires passing the provider and bucket name in the command line. But when I run:
snakemake -n --kubernetes --default-remote-provider GS --default-remote-prefix bucket-name
I get this error:
ERROR :: MissingInputException in line 19 of Snakefile:
Missing input files for rule combine_units:
bucket-name/['bucket-name/lib1-unit1.end1.fastq.gz', 'bucket-name/lib1-unit2.end1.fastq.gz', 'bucket-name/lib1-unit3.end1.fastq.gz']
The bucket prefix is applied twice: once mapped correctly onto each element, and once in front of the whole list, which gets converted to a string. Did I miss something? Is there a good way to work around this?
In Apiary, the cURL call to production by default is:
https://example.com/v1/findBrandCat?matchstring=&interestType=
I have to make a call with the following structure:
https://example.com/v1/findBrandCat/matchstringVALUE/interestTypeVALUE
How can I do that?
A URI template for the API resource can be defined as follows:
# GET /v1/findBrandCat/{matchstringValue}/{interestTypeValue}

+ Parameters
    + matchstringValue: (required, string)
    + interestTypeValue: (required, string)

+ Response 200 (application/json)
Given:
jstrMap = """
function() {
    print("isPointInside = " + isPointInside);
    print("polygon = " + polygon);
    emit(this._id, this);
}
"""

jstrReduce = """
function(key, values) {
    return values[0];
}
"""
from os import path
from bson.code import Code

def readJSCodeFromFile(filePath):
    # Wrap the file contents in a bson Code object
    with open(filePath) as f:
        return Code(f.read())

jsIsPointInside = readJSCodeFromFile(path.join(path.dirname(__file__), 'IsPointInside.js'))
IsPointInside.js:
function(pt, poly) {
}
And I invoke map_reduce like this:
mycoll.map_reduce(jstrMap, jstrReduce, 'results',
                  scope={'isPointInside': jsIsPointInside,
                         'polygon': [[-77, 39], [-77, 38], [-78, 38], [-78, 39]]})
Here is what I get on the client console:
db assertion failure, assertion: 'map invoke failed: JS Error: TypeError: isPointInside is not a function nofile_b:3', assertionCode: 9014
And the server output is:
isPointInside = null
polygon = -77,39,-77,38,-78,38,-78,39
Sun Apr 01 16:29:14 [conn11] JS Error: TypeError: isPointInside is not a function nofile_b:3
Sun Apr 01 16:29:14 [conn11] mr failed, removing collection :: caused by :: 9014 map invoke failed: JS Error: TypeError: isPointInside is not a function nofile_b:3
Debugging the Python code reveals that jsIsPointInside is of type Code, as expected, and str(jsIsPointInside) returns the function text, i.e. 'function(pt, poly) {\n}\n'.
I do not want to populate the system.js collection; I'd like to pass the function in the scope. Is that possible at all?
Thanks.
Scope is an object whose fields are placed in the MapReduce scope as variables named after the fields.
If you'd like to put a function in scope, you need to make it the value of a field, e.g.
scope = {
    myFunc: function() { return "Foo"; }
}
I'm trying to use PowerShell for web deployment, based on this article. This is what my script looks like:
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Web.Deployment")

function Sync-Provider($provider, $sourceLocation, $destLocation)
{
    $destBaseOptions = new-object Microsoft.Web.Deployment.DeploymentBaseOptions
    $syncOptions = new-object Microsoft.Web.Deployment.DeploymentSyncOptions

    Try
    {
        $deploymentObject = [Microsoft.Web.Deployment.DeploymentManager]::CreateObject($provider, $sourceLocation)
        $deploymentObject.SyncTo($provider, $destLocation, $destBaseOptions, $syncOptions)
    }
    Catch
    {
        echo "EXCEPTION THROWN::[ $_ ] "
        #throw $_
    }
}

Sync-Provider ("apphostConfig","D:\NerdDinner_2.0\NerdDinner","c:\inetpub\wwwroot")
Running this gives the following exception:
EXCEPTION THROWN::[ Cannot convert argument "0", with value: "System.Object[]", for "CreateObject" to type
"Microsoft.Web.Deployment.DeploymentWellKnownProvider": "Cannot convert value
"apphostConfig,D:\NerdDinner_2.0\NerdDinner,c:\inetpub\wwwroot" to type
"Microsoft.Web.Deployment.DeploymentWellKnownProvider" due to invalid enumeration values. Specify one of the
following enumeration values and try again. The possible enumeration values are "Unknown, AppHostConfig,
AppHostSchema, AppPoolConfig, ArchiveDir, Auto, Cert, ComObject32, ComObject64, ContentPath, CreateApp,
DirPath, DBFullSql, DBMySql, FilePath, GacAssembly, IisApp, MachineConfig32, MachineConfig64, Manifest,
MetaKey, Package, RecycleApp, RegKey, RegValue, RootWebConfig32, RootWebConfig64, RunCommand, SetAcl,
WebServer, WebServer60"." ]
Could you give me some hints on this, please?
Try enclosing the first parameter, [Microsoft.Web.Deployment.DeploymentWellKnownProvider]::AppHostConfig, in a pair of extra parentheses: ([Microsoft.Web.Deployment.DeploymentWellKnownProvider]::AppHostConfig).
In my case I had the same problem; I just opened the PowerShell console as Administrator and it worked.