I am trying to create a service account via the Kubernetes Python client, but although the POST returns a dict with the manifest, the service account is not actually created.
My code is :
import kubernetes.client
from kubernetes import client, config
from kubernetes.client.rest import ApiException
from pprint import pprint

config.load_kube_config()
client.configuration.debug = True

# create an instance of the API class
v1 = client.CoreV1Api()

namespace = 'users'  # str | object name and auth scope, such as for teams and projects
body = {'metadata': {'name': 'test.david'}}
pretty = 'true'
# str | When present, indicates that modifications should not be persisted.
# An invalid or unrecognized dryRun directive will result in an error response
# and no further processing of the request. Valid values are:
# - All: all dry run stages will be processed (optional)
dry_run = 'All'

try:
    api_response = v1.create_namespaced_service_account(namespace, body,
                                                        dry_run=dry_run,
                                                        pretty=pretty)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling CoreV1Api->create_namespaced_service_account: %s\n" % e)
The response:
{'api_version': 'v1',
'automount_service_account_token': None,
'image_pull_secrets': None,
'kind': 'ServiceAccount',
'metadata': {'annotations': None,
'cluster_name': None,
'creation_timestamp': datetime.datetime(2020, 5, 25, 23, 30, 26, tzinfo=tzutc()),
'deletion_grace_period_seconds': None,
'deletion_timestamp': None,
'finalizers': None,
'generate_name': None,
'generation': None,
'initializers': None,
'labels': None,
'managed_fields': None,
'name': 'test.david',
'namespace': 'users',
'owner_references': None,
'resource_version': None,
'self_link': '/api/v1/namespaces/users/serviceaccounts/test.david',
'uid': 'b64cff7c-9edf-11ea-8b22-0a714f906f03'},
'secrets': None}
What am I doing wrong?
You need to set
dry_run = ''
because when dry_run is present it indicates that modifications should not be persisted, so with dry_run='All' the service account is never actually created.
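A minimal sketch of the corrected call, simply dropping the dry_run argument and reusing the names from the question's code:

v1 = client.CoreV1Api()
namespace = 'users'
body = {'metadata': {'name': 'test.david'}}

# Without dry_run the request is persisted and the ServiceAccount is created
api_response = v1.create_namespaced_service_account(namespace, body, pretty='true')
pprint(api_response)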
I am new to AWS Glue and have created a job through a crawler that points to a source CSV file in an S3 bucket.
The CSV file contains the following columns:
userId jobTitleName firstName lastName preferredFullName employeeCode region
During the job execution it throws the following error:
Key error: 'userid' not exist
As noted, this looks like a case-sensitivity issue, so as per the Glue documentation I created a mapping for the schema:
mappingsSchema=[('userid', 'integer', 'userId', 'integer'),
('jobtitlename', 'string', 'jobTitleName', 'string'),
('firstname', 'string', 'firstName', 'string'),
('lastname', 'string', 'lastName', 'string'),
('preferredfullName', 'string', 'preferredFullname', 'string'),
('employeecode', 'string', 'employeeCode', 'string'),
('region', 'string','region', 'string')]
mapped_dynamic_frame_read=dynamic_frame_read.apply_mapping(mappings = mappingsSchema, case_sensitive = True, transformation_ctx = "tfx")
# And converting to the Spark DataFrame
df = mapped_dynamic_frame_read.toDF()
I'm still getting the same error.
How can I resolve this kind of issue?
Hi #Emerson, the issue was with the mappings, in which the column names were wrongly specified against the schema definition. It's now fixed and working fine. Thanks.
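The answer does not include the corrected code, but a plausible fix, assuming the crawler lower-cased every source column and the target names must exactly match the CSV header casing, would be to correct the preferredFullName entry (both its source and target spelling here are reconstructions, not taken from the original post):

mappingsSchema = [('userid', 'integer', 'userId', 'integer'),
                  ('jobtitlename', 'string', 'jobTitleName', 'string'),
                  ('firstname', 'string', 'firstName', 'string'),
                  ('lastname', 'string', 'lastName', 'string'),
                  # source name all lower-case, target name matching the CSV header
                  ('preferredfullname', 'string', 'preferredFullName', 'string'),
                  ('employeecode', 'string', 'employeeCode', 'string'),
                  ('region', 'string', 'region', 'string')]

mapped_dynamic_frame_read = dynamic_frame_read.apply_mapping(
    mappings=mappingsSchema, case_sensitive=True, transformation_ctx="tfx")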
I have a custom action plugin and I need to write out returned variable data on the controller to a file. I'm trying this locally right now.
copy_module_args = dict()
copy_module_args["content"] = 'test'
copy_module_args["dest"] = dest
copy_module_args["owner"] = owner
copy_module_args["group"] = group
copy_module_args["mode"] = mode

try:
    result = merge_hash(result, self._execute_module(
        module_name="copy",
        module_args=copy_module_args,
        task_vars=task_vars))
except (AnsibleError, TypeError) as err:
    err_msg = "Failed to do stuff"
    raise AnsibleActionFail(to_text(err_msg), to_text(err))
The result of ._execute_module is
fatal: [localhost]: FAILED! => {"changed": false, "msg": "Source None not found"}
The value of result is
{'msg': 'Source None not found', 'failed': True, 'invocation': {'module_args': {'content': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'dest': '/home/me/testfile', 'owner': 'me', 'group': 'me', 'mode': None, 'backup': False, 'force': True, 'follow': False, 'src': None, '_original_basename': None, 'validate': None, 'directory_mode': None, 'remote_src': None, 'local_follow': None, 'checksum': None, 'seuser': None, 'serole': None, 'selevel': None, 'setype': None, 'attributes': None, 'regexp': None, 'delimiter': None, 'unsafe_writes': None}}, '_ansible_parsed': True}
This invocation is trying to use the "src" param even though I'm only passing the "content" param. I know this because when I add "src" the failure message changes. I expected, from the docs and from reading the copy module and template module source, that at a bare minimum my implementation would be equivalent to:
- name: Copy using inline content
copy:
content: 'test'
dest: /home/me/testfile
Does anyone know what I'm missing or why "src" is being preferred over "content" even though it's not being specified?
The content: argument is just syntactic sugar for writing the content to a tempfile, so I would guess you will need to take charge of that yourself, or find a way to invoke the copy action plugin, which apparently runs before the copy module.
I was able to see that "content" was being handled in the action plugin, not the module. I've adapted what I found to fit my needs: I call the action plugin instead of the module directly.
copy_module_args = dict()
copy_module_args["content"] = 'test'
copy_module_args["dest"] = dest
copy_module_args["owner"] = owner
copy_module_args["group"] = group
copy_module_args["mode"] = mode
copy_module_args["follow"] = True
copy_module_args["force"] = False

copy_action = self._task.copy()
copy_action.args.update(copy_module_args)

# Removing args passed in via the playbook that aren't meant for
# the copy module
for remove in ("arg1", "arg2", "arg3", "arg4"):
    copy_action.args.pop(remove, None)

try:
    copy_action = self._shared_loader_obj.action_loader.get('copy',
                                                             task=copy_action,
                                                             connection=self._connection,
                                                             play_context=self._play_context,
                                                             loader=self._loader,
                                                             templar=self._templar,
                                                             shared_loader_obj=self._shared_loader_obj)
    result = merge_hash(result, copy_action.run(task_vars=task_vars))
except (AnsibleError, TypeError) as err:
    # same error handling as in the original attempt
    raise AnsibleActionFail(to_text("Failed to do stuff"), to_text(err))
This allows me to leverage copy as I originally intended, utilising its idempotency and checksumming without having to write my own.
changed: [localhost] => {"changed": true, "checksum": "00830d74b4975d59049f6e0e7ce551477a3d9425", "dest": "/home/me/testfile", "gid": 1617705057, "group": "me", "md5sum": "6f007f4188a0d35835f4bb84a2548b66", "mode": "0644", "owner": "me", "size": 9, "src": "/home/me/.ansible/tmp/ansible-tmp-1560715301.737494-249856394953357/source", "state": "file", "uid": 1300225668}
And running it again,
ok: [localhost] => {"changed": false, "dest": "/home/me/testfile", "src": "/home/me/testfile/.ansible/tmp/ansible-local-9531902t7jt3/tmp_nq34zm5"}
I want to check whether a role exists in a MongoDB database before I create a new one. I tried to do it the following way:
result = self.client[database].command("getRole", name=app_name)
Unfortunately I get the following error:
msg = msg or "%s"
raise OperationFailure(msg % errmsg, code, response)
pymongo.errors.OperationFailure: no such command: 'getRole', bad cmd: '{ getRole: 1, name: "test" }'
I am referring to this database command: https://docs.mongodb.com/manual/reference/method/db.getRole/
For createRole I can execute the command: https://docs.mongodb.com/manual/reference/method/db.createRole/#db.createRole
Shell methods db.* are different from database commands.
Using the rolesInfo command you can get information about a particular role.
db.command({
    'rolesInfo': {'role': 'noremove', 'db': 'test'},
    'showPrivileges': True, 'showBuiltinRoles': True
})
The above command returns a result in this form when there is a matching role:
{'ok': 1.0,
'roles': [{'db': 'test',
'inheritedPrivileges': [{'actions': ['find', 'insert', 'update'],
'resource': {'collection': 'test', 'db': 'test'}}],
'inheritedRoles': [],
'isBuiltin': False,
'privileges': [{'actions': ['find', 'insert', 'update'],
'resource': {'collection': 'test', 'db': 'test'}}],
'role': 'noremove',
'roles': []}]}
When there is no matching role, you get this result:
{'ok': 1.0, 'roles': []}
Checking that a role exists then comes down to checking the length of the "roles" list in the returned result, as follows:
noremove_role = db.command({
    'rolesInfo': {'role': 'noremove', 'db': 'test'},
    'showPrivileges': True, 'showBuiltinRoles': True
})

if not len(noremove_role['roles']):
    # create role
    pass
Is there a better way?
Yes. In keeping with the "ask forgiveness, not permission" philosophy, create the role and handle the resulting exception from trying to add an existing role.
from pymongo.errors import DuplicateKeyError
import logging

logger = logging.getLogger()

try:
    db.command(
        'createRole', 'noremove',
        privileges=[{
            'actions': ['insert', 'update', 'find'],
            'resource': {'db': 'test', 'collection': 'test'}
        }],
        roles=[])
except DuplicateKeyError:
    logger.error('Role already exists.')
    pass
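Depending on the server and driver version, the duplicate-role error may not be mapped to DuplicateKeyError (this is an assumption worth verifying against your deployment). Since DuplicateKeyError is a subclass of OperationFailure, a slightly more defensive sketch catches the broader exception and inspects the message:

from pymongo.errors import OperationFailure

try:
    db.command(
        'createRole', 'noremove',
        privileges=[{
            'actions': ['insert', 'update', 'find'],
            'resource': {'db': 'test', 'collection': 'test'}
        }],
        roles=[])
except OperationFailure as exc:
    # Assumption: the server reports a duplicate role with an
    # "already exists" message; anything else is re-raised.
    if 'already exists' in str(exc):
        logger.error('Role already exists.')
    else:
        raise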
I have a collection called "tweets" stored in my mongodb database called "test".
I connect to the db in the following manner:
import sys
import pymongo
from pymongo import Connection
connection = Connection()
db = connection.test
tweets = db.tweets
I get one document from my tweets collection by converting the cursor to a list:
list(tweets.find())[0]
This shows me that the structure of the document is as follows:
{u'_id': ObjectId('...'),
u'contributors': None,
u'coordinates': {u'coordinates': [-placeholder coordinate, placeholder coordinate],
u'type': u'Point'},
u'created_at': u'Mon Jun 24 17:53:47 +0000 2013',
u'entities': {u'hashtags': [],
u'symbols': [],
u'urls': [],
u'user_mentions': []},
u'favorite_count': 0,
u'favorited': False,
u'filter_level': u'medium',
u'geo': {u'coordinates': [40.81019674, -73.99020695], u'type': u'Point'},
u'id': 349223842700472320L,
u'id_str': u'349223842700472320',
u'in_reply_to_screen_name': None,
u'in_reply_to_status_id': None,
u'in_reply_to_status_id_str': None,
u'in_reply_to_user_id': None,
u'in_reply_to_user_id_str': None,
u'lang': u'en',
u'place': {u'attributes': {},
u'bounding_box': {u'coordinates': [[[placeholder coordinate, placeholder coordinate],
[-placeholder coordinate, placeholder coordinate],
[-placeholder coordinate, placeholder coordinate],
[-placeholder coordinate, placeholder coordinate]]],
u'type': u'Polygon'},
u'country': u'placeholder country',
u'country_code': u'example',
u'full_name': u'name, xx',
u'id': u'user id',
u'name': u'name',
u'place_type': u'city',
u'url': u'http://api.twitter.com/1/geo/id/1820d77fb3f65055.json'},
u'retweet_count': 0,
u'retweeted': False,
u'source': u'Twitter for iPhone',
u'text': u'example text',
u'truncated': False,
u'user': {u'contributors_enabled': False,
u'created_at': u'Sat Jan 22 13:42:59 +0000 2011',
u'default_profile': False,
u'default_profile_image': False,
u'description': u'example description',
u'favourites_count': 100,
u'follow_request_sent': None,
u'followers_count': 100,
u'following': None,
u'friends_count': 100,
u'geo_enabled': True,
u'id': placeholder_id,
u'id_str': u'placeholder_id',
u'is_translator': False,
u'lang': u'en',
u'listed_count': 0,
u'location': u'example place',
u'name': u'example name',
u'notifications': None,
u'profile_background_color': u'000000',
u'profile_background_image_url': u'http://a0.twimg.com/images/themes/theme19/bg.gif',
u'profile_background_image_url_https': u'https://si0.twimg.com/images/themes/theme19/bg.gif',
u'profile_background_tile': False,
u'profile_banner_url': u'https://pbs.twimg.com/profile_banners/241527685/1363314054',
u'profile_image_url': u'http://a0.twimg.com/profile_images/378800000038841219/8a71d0776da0c48dcc4ef6fee9f78880_normal.jpeg',
u'profile_image_url_https': u'https://si0.twimg.com/profile_images/378800000038841219/8a71d0776da0c48dcc4ef6fee9f78880_normal.jpeg',
u'profile_link_color': u'000000',
u'profile_sidebar_border_color': u'FFFFFF',
u'profile_sidebar_fill_color': u'000000',
u'profile_text_color': u'000000',
u'profile_use_background_image': False,
u'protected': False,
u'screen_name': u'placeholder screen_name',
u'statuses_count': xxxx,
u'time_zone': u'placeholder time_zone',
u'url': None,
u'utc_offset': -21600,
u'verified': False}}
I then query for all documents with hashtags in my collection:
list(tweets.find({'entities.hashtags.text': {"$ne":None}}))
So far so good. Now, here is my problem. I would like to sort the documents in my collection by screen_name. I try:
users = tweets.find({'entities.hashtags.text': {"$ne":None}}, {"user.screen_name":1})
for user in users:
print user.get["user.screen_name"]
but get the following error message:
TypeError Traceback (most recent call last)
/Users/home/<ipython-input-98-ea29cbbcfe27> in <module>()
1 for user in users:
----> 2 print user.get["screen_name"]
3
TypeError: 'builtin_function_or_method' object has no attribute '__getitem__'
Any idea what I'm doing wrong here, or how I can fix my code?
Thanks!
You're using square brackets with the get method where you should use parentheses: get is a method, so call it as user.get('screen_name'), or index the document directly with user['screen_name'].
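Note that with the projection {"user.screen_name": 1} each returned document still nests screen_name inside the user sub-document, so a minimal sketch of the loop, including the sort by screen_name the question asks about (field names taken from the question), might look like this:

# The projected documents keep their nested shape:
# {'_id': ..., 'user': {'screen_name': ...}}
users = tweets.find(
    {'entities.hashtags.text': {'$ne': None}},
    {'user.screen_name': 1}
).sort('user.screen_name', pymongo.ASCENDING)

for doc in users:
    print doc['user']['screen_name']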