How to create an object of MinioAdmin using minio-py to call admin functionality - kubernetes

I am using MinIO and want to call admin functions from my client.
We can use the mc admin CLI for admin operations, but I want to do the same from my Python client.
I have installed minio-py and found that it provides a MinioAdmin class for this.
Ref: https://github.com/minio/minio-py/blob/master/minio/minioadmin.py
But I can't find any example of how to create an object of the MinioAdmin class and call its functionality.
Please help.
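For reference, here is a minimal sketch of constructing the client, based on the MinioAdmin class linked above. The constructor signature has changed across minio-py releases, so the endpoint, credentials provider, and method names below are assumptions to verify against your installed version:

from minio.credentials import StaticProvider
from minio.minioadmin import MinioAdmin

# Endpoint and keys are placeholders; point them at your MinIO deployment.
admin = MinioAdmin(
    "minio.example.com:9000",
    credentials=StaticProvider("ACCESS_KEY", "SECRET_KEY"),
    secure=True,  # set to False for plain HTTP
)

# Admin calls mirror `mc admin` subcommands and return the raw JSON text:
print(admin.user_list())                     # like `mc admin user list`
admin.user_add("newuser", "newuser-secret")  # like `mc admin user add`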

Related

Securing NodeRED dashboard from unwanted access

I'm trying to create some kind of user authentication to prevent unwanted access to my Node-RED user interface. I've searched online and found two solutions that, for some reason, didn't work out. Here they are:
I tried adding the httpNodeAuth: {user:"user", pass:"password"} key to bluemix-settings.js, but after that my dashboard kept prompting me for a username and password, even after I typed the password defined in the pass:"password" field.
I added the user-defined environment variables NODE_RED_USERNAME : username and NODE_RED_PASSWORD : password, but nothing changed.
Those solutions were suggested here: How could I prohibit anonymous access to my NodeRed UI Dashboard on IBM Cloud(Bluemix)?
Thanks for the help, guys!
Here is a snippet of bluemix-settings.js:
autoInstallModules: true,
// Move the admin UI
httpAdminRoot: '/red',
// Serve up the welcome page
httpStatic: path.join(__dirname,"public"),
//GUI password authentication (ALEX)
httpNodeAuth: {user:"admin",pass:"$2y$12$W2VkVHvBTwRyGCEV0oDw7OkajzG3mdV3vKRDkbXMgIjDHw0mcotLC"},
functionGlobalContext: { },
// Configure the logging output
logging: {
As described in the Node-RED docs here, you need to add a section as follows to settings.js (or, in the case of Bluemix/IBM Cloud, the bluemix-settings.js file):
...
httpNodeAuth: {user:"user",pass:"$2a$08$zZWtXTja0fB1pzD4sHCMyOCMYz2Z6dNbM6tl8sJogENOMcxWV9DN."},
...
The pass field is a bcrypt hash of the password. The docs list two ways to generate the hash correctly.
If you have a local copy of Node-RED installed, you can use the following command:
node-red admin hash-pw
Alternatively, as long as you have a local Node.js install, you can use the following:
node -e "console.log(require('bcryptjs').hashSync(process.argv[1], 8));" your-password-here
You may need to install bcryptjs first with npm install bcryptjs.
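The value is plain bcrypt, so any standard bcrypt implementation should produce a compatible hash. For instance, with Python's bcrypt package (an assumption on my part, not one of the methods listed in the Node-RED docs):

import bcrypt  # pip install bcrypt

# Hash a password with cost factor 8, matching the bcryptjs example above.
print(bcrypt.hashpw(b"your-password-here", bcrypt.gensalt(rounds=8)).decode())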

Import test-users into Meteor app with use of npm script?

We can easily create a user from the Meteor shell like this:
Accounts.createUser({username: 'john', password: '12345'})
Similarly, I just want to add multiple users via an npm script. Any ideas?
In other words, I want to use fixtures functionality via an npm command, not only on the initial run.
Thank you.
For normal collections (i.e. other than Meteor.users), you can tap directly into your MongoDB collection: open a Meteor Mongo shell while your project is running in development mode, then type Mongo shell commands directly.
For the Meteor.users collection, you want to leverage the automatic management provided by the accounts-base and accounts-password packages, so instead of fiddling with MongoDB directly, you should insert documents/users through your Meteor app.
Unfortunately, your app source files (like your UsersFixtures.js file) are not at all suitable for CLI usage.
The usual solution is to embed a dedicated method within your app server:
// On your server.
// Make sure this Method is not available on production.
// When started with `meteor run`, NODE_ENV will be `development`
// unless set otherwise previously in your environment variables.
if (process.env.NODE_ENV !== 'production') {
  Meteor.methods({
    addTestUser(username, password) {
      Accounts.createUser({
        username,
        // If you do not want to transmit the clear password even in a dev
        // environment, you can call the method with a 2nd arg:
        // {algorithm: "sha-256", digest: sha256function(password)}
        password
      });
    }
  });
}
Then start your Meteor project in development mode (meteor run), access your app in your browser, open your browser console, and directly call the method from there:
Meteor.call('addTestUser', myUsername, myPassword)
You could also use Accounts.createUser directly in your browser console, but it will automatically log you in as the new user.

How to write e2e test automation for an application containing Kafka, Postgres, and REST API docker containers

I have an app which is set up with docker-compose. The app comprises docker containers for Kafka, Postgres, and REST API endpoints.
One test case is to post data to the endpoints. The data contains a field called callback URL; the app parses the data and sends it to the callback URL.
I am curious whether there is any test framework for similar test cases, and how to verify that the callback URL is hit with the data.
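One framework-agnostic way to verify the callback is to have the test itself stand up a throwaway HTTP listener and point the callback URL at it. A rough Python sketch follows; the app endpoint, port, payload shape, and the host.docker.internal hostname are all illustrative assumptions:

import json
import queue
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

import requests  # pip install requests

received = queue.Queue()

class CallbackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Capture the callback body so the test can assert on it.
        body = self.rfile.read(int(self.headers["Content-Length"]))
        received.put(json.loads(body))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep test output quiet

server = HTTPServer(("0.0.0.0", 9999), CallbackHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Post to the app under test; from inside a container, host.docker.internal
# (or the docker-compose gateway IP on Linux) reaches this listener.
requests.post(
    "http://localhost:8080/api/data",  # assumed REST endpoint of the app
    json={"callback_url": "http://host.docker.internal:9999/cb", "value": 42},
)

# Fail the test if the callback does not arrive within 30 seconds.
payload = received.get(timeout=30)
assert payload["value"] == 42
server.shutdown()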
Docker Compose support has been added to endly. In the pipeline workflow for the app (app.yaml), you can add a "deploy" task and start the docker services by invoking docker-compose up.
Once the test task has completed and your callback URL has been invoked, your validation task can check that it was invoked with the expected data. For this you can utilize endly's recording feature and replay it to validate the callback request.
Below is an example of an ETL application app.yaml using docker-compose with endly to start the docker services. Hope it helps.
tasks: $tasks
defaults:
  app: $app
  version: $version
  sdk: $sdk
  useRegistry: false
pipeline:
  build:
    name: Build GBQ ETL
    description: Using a endly shared workflow to build
    workflow: app/docker/build
    origin:
      URL: ./../../
      credentials: localhost
    buildPath: /tmp/go/src/etl/app
    secrets:
      github: git
    commands:
      - apt-get -y install git
      - export GOPATH=/tmp/go
      - export GIT_TERMINAL_PROMPT=1
      - cd $buildPath
      - go get -u .
      - $output:/Username/? ${github.username}
      - $output:/Password/? ${github.password}
      - export CGO_ENABLED=0
      - go build -o $app
      - chmod +x $app
    download:
      /$buildPath/${app}: $releasePath
      /$buildPath/startup.sh: $releasePath
      /$buildPath/docker-entrypoint.sh: $releasePath
      /$buildPath/VERSION: $releasePath
      /$buildPath/docker-compose.yaml: $releasePath
  deploy:
    start:
      action: docker:composeUp
      target: $target
      source:
        URL: ${releasePath}docker-compose.yaml
In your question below, where is Kafka involved? Both steps sound like HTTP calls:
1) Post data to an endpoint
2) The app sends the data to the callback URL
"One test case is to post data to endpoints. In the data, there is a field called callback URL. The app will parse the data and send the data to the callback URL."
Assuming the callback URL is an HTTP endpoint (e.g. REST or SOAP) with a POST/PUT API, it is better to also expose a GET endpoint on the same resource. In that case, when the callback POST/PUT is invoked, the server-side state/data changes, and you can then use the GET API to verify the data is correct. The output of the GET API is the Kafka data that was sent to the callback URL (this assumes your first POST message went to a Kafka topic).
You can achieve this the traditional JUnit way with a bit of code, or declaratively, completely bypassing coding.
The example has dockerized Kafka containers that you bring up locally to run the tests.
The section "Kafka with REST APIs" explains an automated way of testing REST APIs in combination with Kafka data streams.
e.g.
---
scenarioName: Kafka and REST api validation example
steps:
  - name: produce_to_kafka
    url: kafka-topic:people-address
    operation: PRODUCE
    request:
      recordType: JSON
      records:
        - key: id-lon-123
          value:
            id: id-lon-123
            postCode: UK-BA9
    verify:
      status: Ok
      recordMetadata: "$NOT.NULL"
  - name: verify_updated_address
    url: "/api/v1/addresses/${$.produce_to_kafka.request.records[0].value.id}"
    operation: GET
    request:
      headers:
        X-GOVT-API-KEY: top-key-only-known-to-secu-cleared
    verify:
      status: 200
      value:
        id: "${$.produce_to_kafka.request.records[0].value.id}"
        postCode: "${$.produce_to_kafka.request.records[0].value.postCode}"
Idaithalam is a low-code test automation framework, developed using Java and Cucumber, that leverages Behavior-Driven Development (BDD). Testers can create test cases/scripts in simple Excel with an API spec. Excel is a simplified way to create JSON-based test scripts in Idaithalam. Test cases can be created quickly and tested in minutes.
As a tester, you need to create the Excel sheet and pass it to the Idaithalam framework.
First, it generates the JSON-based test scripts (a Virtualan collection) from the Excel sheet. During test execution, this test script collection can be utilized directly.
Then it generates feature files from the Virtualan collection and executes them.
Lastly, it generates a test report in BDD/Cucumber style.
This provides complete testing support for REST APIs, GraphQL, RDBMS databases, and Kafka event messages.
Refer to the following link for more information on setup and execution (how to create test scripts using Excel):
https://tutorials.virtualan.io/#/Excel

vSphere Build Version via API

Is there a way to get the vSphere build version using any API/SDK/REST?
I know it's possible using PowerShell against vCenter, but it'd be great if there were another option.
Like the one described here: https://www.virtuallyghetto.com/2017/08/powercli-script-to-help-correlate-vcenter-esxi-vsan-buildversions-wo-manual-vmware-kb-lookup.html
It looks like you should be able to, using the VMware vSphere API Python bindings, as you can just simulate going through the Managed Object Browser:
Parent Managed Object ID: ServiceInstance
Property Path: content.about
There is a build string there, which is what you are looking for.
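A short pyVmomi sketch of that lookup (hostname and credentials are placeholders, and the relaxed SSL context is for lab use only):

import ssl
from pyVim.connect import Disconnect, SmartConnect

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # lab only: skip certificate validation

si = SmartConnect(host="vcenter.example.com", user="user",
                  pwd="password", sslContext=ctx)
try:
    about = si.RetrieveContent().about  # ServiceInstance -> content.about
    print(about.fullName)               # full product name
    print(about.version, about.build)   # version and build number
finally:
    Disconnect(si)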
I figured out how to do this. My need is to have as much information about vSphere as possible, so getting datacenter, cluster, and host details is mandatory.
For that I used the official Ruby API, rbvmomi, but I believe it's exactly the same for the Python and Golang ones.
You need to iterate through the host folder under the root/children object, which is not that clear in the VMware API docs. To make it easier, here is a piece of code:
vim = RbVmomi::VIM.connect host: host, user: 'user', password: 'pass', insecure: true, debug: false
vim.root.children.each do |root_child|
  root_child.hostFolder.children.each do |child|
    child.host.each do |host|
      prod = host.config.product
      puts host.name,
           prod.apiType,
           prod.apiVersion,
           prod.build,
           prod.fullName,
           prod.instanceUuid,
           prod.licenseProductName,
           prod.localeBuild,
           prod.localeVersion,
           prod.name,
           prod.osType,
           prod.productLineId,
           prod.vendor,
           prod.version
    end
  end
end

Create a Symfony2 service

I am new to Symfony2 and am trying to set up my first service, a cURL service.
I have followed the directions in the documentation but haven't been able to get anything to load. I am using version 2.0.1.
In my app/config/config.yml I have added:
parameters:
    curl_service.class: FTW\GuildBundle\Services\Curl

services:
    curl_service:
        class: %curl_service.class%
The class file is located at src\FTW\GuildBundle\Services\Curl.php, its namespace is FTW\GuildBundle\Services, and the class name is Curl.
When I try to load my service with $curl_service = $this->get('curl_service');, the error I get is:
You have requested a non-existent service "curl_service".
I think I am missing something very simple... any help would be appreciated!
Make sure you clear your cache at app/cache/, even in development mode.
It works well on my fresh Symfony 2.0.4 install, so it looks like your config is not being taken into account. Are you working in the dev environment?