Values for an attribute not assigned to the right variable - class

Sorry for the vague title, I wasn't quite sure how to word it.
Let's say we have a class and it contains dictionaries embedded in a larger dictionary.
class AnObject(object):
    grail = {'Grail' : '', 'Quest' : ''}
    spam = {'More spam' : '', 'Less spam' : ''}
    parrot = {'More parrot' : '', 'Less parrot' : '', 'Grail' : grail}
    egg = {'Spam' : spam, 'Parrot' : parrot}
Then we want to refer to these values through other attribute names:
    self.egg = egg
    self.parrot = egg['Parrot']
    self.moreparrot = self.parrot['More parrot']
This resolves to the right locations, but for some reason I'm finding:
>>> knight = AnObject()
>>> knight.moreparrot = x
>>> knight.moreparrot
x
>>> knight.egg
{'Parrot' : {'More parrot' : '', 'Less parrot' : ''}...}
But this works fine:
>>> knight.egg['Parrot']['More parrot'] = x
>>> knight.egg['Parrot']['More parrot']
x
>>> knight.egg
{'Parrot' : {'More parrot' : 'x', 'Less parrot' : ''}...}
These should point to the same object, but I am getting different results. How come?
Edit:
This might be totally accidental, but for some reason Spam is consistent:
    self.egg = egg
    self.spam = egg['Spam']
    self.morespam = self.spam['More spam']
>>> knight = AnObject()
>>> knight.morespam = x
>>> knight.morespam
x
>>> knight.egg['Spam']['More spam']
x
>>> knight.egg
{'Spam' : {'More spam' : 'x', 'Less spam' : ''}...}
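Here is a sketch of what is going on, recreating the class above: `self.moreparrot = self.parrot['More parrot']` copies the current value (an empty string) out of the dict into a new attribute. Rebinding `knight.moreparrot` later only points that attribute at a new object; it never touches the dict. Mutating the shared dict through any alias, on the other hand, is visible everywhere:

```python
class AnObject(object):
    grail = {'Grail': '', 'Quest': ''}
    spam = {'More spam': '', 'Less spam': ''}
    parrot = {'More parrot': '', 'Less parrot': '', 'Grail': grail}
    egg = {'Spam': spam, 'Parrot': parrot}

    def __init__(self):
        # self.parrot aliases the very same dict object as egg['Parrot'] ...
        self.parrot = self.egg['Parrot']
        # ... but this copies the current *value* ('') out of the dict
        self.moreparrot = self.parrot['More parrot']

knight = AnObject()
knight.moreparrot = 'x'                       # rebinds the attribute only
print(knight.egg['Parrot']['More parrot'])    # still ''

knight.egg['Parrot']['More parrot'] = 'x'     # mutates the shared dict
print(knight.parrot['More parrot'])           # 'x' -- same dict object
```

Note also that these dicts are class attributes, so mutations through one instance are shared by every instance.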

Extract first part of URL

I have a URL like so:
https://www.example.com/oauth/connect/token
I want to get only https://www.example.com and I have tried a bunch of ways, but all of them require me to hardcode multiple /s.
Example:
$Url.Split('/')[0] + "//" + $url.split('/')[2]
Is there a way to do this without the hardcoding?
You may use [System.Uri] to cast the string to a System.Uri instance.
$url = [System.Uri]"https://www.example.com/oauth/connect/token"
You can then join the Scheme and Host to get the part of the URL you want.
$result = $url.Scheme, $url.Host -join "://"
You could also remove the AbsolutePath from the entire URL.
$url = [System.Uri]"https://www.example.com/oauth/connect/token"
$result = $url.AbsoluteUri.Replace($url.AbsolutePath, "")
These are the properties the System.Uri instance will have for this URL:
AbsolutePath : /oauth/connect/token
AbsoluteUri : https://www.example.com/oauth/connect/token
LocalPath : /oauth/connect/token
Authority : www.example.com
HostNameType : Dns
IsDefaultPort : True
IsFile : False
IsLoopback : False
PathAndQuery : /oauth/connect/token
Segments : {/, oauth/, connect/, token}
IsUnc : False
Host : www.example.com
Port : 443
Query :
Fragment :
Scheme : https
OriginalString : https://www.example.com/oauth/connect/token
DnsSafeHost : www.example.com
IdnHost : www.example.com
IsAbsoluteUri : True
UserEscaped : False
UserInfo :
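For comparison, the same scheme-plus-host extraction without any hardcoded slash counting can be done in Python's standard library (a sketch using urllib.parse, in case a non-PowerShell equivalent is useful):

```python
from urllib.parse import urlsplit

url = "https://www.example.com/oauth/connect/token"
parts = urlsplit(url)

# netloc already carries the host (and port, if any), so no '/' splitting needed
base = f"{parts.scheme}://{parts.netloc}"
print(base)  # https://www.example.com
```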

Tombstone disappears when ExtractField$Key transform is added to connector config

I have the following connector declared with ksqlDB:
CREATE SOURCE CONNECTOR `myconn` WITH (
    "name" = 'myconn',
    "connector.class" = 'io.debezium.connector.mysql.MySqlConnector',
    "tasks.max" = 1,
    "database.hostname" = 'myconn-db',
    "database.port" = '${dbPort}',
    "database.user" = '${dbUsername}',
    "database.password" = '${dbPassword}',
    "database.history.kafka.topic" = 'myconn_db_history',
    "database.history.kafka.bootstrap.servers" = '${bootstrapServer}',
    "database.server.name" = 'myconn_db',
    "database.allowPublicKeyRetrieval" = '${allowPublicKeyRetrieval}',
    "table.include.list" = 'myconn.links,myconn.imports',
    "message.key.columns" = 'myconn.links:id',
    "tombstones.on.delete" = true,
    "null.handling.mode" = 'keep',
    "transforms" = 'unwrap',
    "transforms.unwrap.type" = 'io.debezium.transforms.ExtractNewRecordState',
    "transforms.unwrap.drop.tombstones" = false,
    "transforms.unwrap.delete.handling.mode" = 'none'
);
Tombstones are successfully sent, but the key in the messages is Struct(id=00000). In order to change the key to plain 00000, I've used the ExtractField$Key transform:
CREATE SOURCE CONNECTOR `myconn` WITH (
    "name" = 'myconn',
    "connector.class" = 'io.debezium.connector.mysql.MySqlConnector',
    "tasks.max...
    --- I omit all the rest for convenience ---
    "transforms" = 'unwrap,extractKey',
    --- New lines added (next 3) ---
    "transforms.extractKey.type" = 'org.apache.kafka.connect.transforms.ExtractField$Key',
    "transforms.extractKey.field" = 'id',
    "include.schema.changes" = false
);
Just by adding those last three lines, the keys are now OK, but the tombstones disappear; there are no tombstones in the topic. Do you know the reason?
As you can see, I have more than one table in the include list (table.include.list). The second one has a different id field: not 'id' but 'import_id'. It seems that, internally, that field couldn't be properly extracted, and the tombstones (for all tables) were ignored.
I'm not sure about the reason for that behavior (no errors are reported by describe connector myconn; something like 'key id not found' would have been useful), but I solved the issue by handling each topic with its proper key.
Here you have the new connector definition:
CREATE SOURCE CONNECTOR `myconn` WITH (
    "name" = 'myconn',
    "connector.class" = 'io.debezium.connector.mysql.MySqlConnector',
    "tasks.max" = 1,
    --Database config--------------------------
    "database.hostname" = 'myconn-db',
    "database.port" = '${dbPort}',
    "database.user" = '${dbUsername}',
    "database.password" = '${dbPassword}',
    "database.history.kafka.topic" = 'myconn_db_history',
    "database.history.kafka.bootstrap.servers" = '${bootstrapServer}',
    "database.server.name" = 'myconn_db',
    "database.allowPublicKeyRetrieval" = '${allowPublicKeyRetrieval}',
    "table.include.list" = 'myconn.links,myconn.imports',
    --Connector behavior------------------------
    "tombstones.on.delete" = true,
    "null.handling.mode" = 'keep',
    "include.schema.changes" = false,
    --Predicates--------------------------------
    "predicates" = 'TopicDoestHaveIdField,IsImportTopic',
    "predicates.TopicDoestHaveIdField.type" = 'org.apache.kafka.connect.transforms.predicates.TopicNameMatches',
    "predicates.TopicDoestHaveIdField.pattern" = 'myconn_db.myconn\.(imports)',
    "predicates.IsImportTopic.type" = 'org.apache.kafka.connect.transforms.predicates.TopicNameMatches',
    "predicates.IsImportTopic.pattern" = 'myconn_db.myconn.imports',
    --Transforms--------------------------------
    "transforms" = 'unwrap,extractKey,extractImportKey',
    "transforms.unwrap.type" = 'io.debezium.transforms.ExtractNewRecordState',
    "transforms.unwrap.drop.tombstones" = false,
    "transforms.unwrap.delete.handling.mode" = 'none',
    "transforms.extractKey.type" = 'org.apache.kafka.connect.transforms.ExtractField$Key',
    "transforms.extractKey.field" = 'id',
    "transforms.extractKey.predicate" = 'TopicDoestHaveIdField',
    "transforms.extractKey.negate" = true,
    "transforms.extractImportKey.type" = 'org.apache.kafka.connect.transforms.ExtractField$Key',
    "transforms.extractImportKey.field" = 'import_id',
    "transforms.extractImportKey.predicate" = 'IsImportTopic'
);
Now I have the tombstones in the topics and rows are properly removed from tables.

How to access values from error responses

Hi, I am getting an error from the API response; based on the error code, I have to show an error popup.
Error Domain=com.ca.mailina.targetAPI:ErrorDomain Code=1854 "The data
couldn’t be read because it isn’t in the correct format." UserInfo=
{mailinaInfoHeaderInfoKey=<CFBasicHash 0x600000bff000 [0x7fff8004b340]>{type =
immutable dict, count = 8,
entries =>
0 : Pragma = no-cache
1 : x-up-err = 0001854
2 : Content-Type = <CFString 0x6000011c63a0 [0x7fff8004b340]>{contents = "application/json;charset=UTF-8"}
3 : x-mail-err = 100
4 : x-uxp-err = 30102
6 : Date = <CFString 0x6000011c6370 [0xgghyd654fx40]>{contents = "Tue, 17 Aug 2021 10:37:19 GMT"}
10 : Content-Length = 907
11 : Cache-Control = no-store
}, NSLocalizedDescription=The data couldn’t be read because it isn’t in the correct format., status-code=401}
I have to check a condition based on 4 : x-uxp-err = 30102, which is at the fourth position of the error response. The problem is that I am not able to access the 4th position of the response; guide me to a solution. Example:
if x-uxp-err == 30102 {
    print("open popup 1")
} else {
    print("open popup 2")
}

Python (Flask) MongoDB Speed Issue

I have a big speed problem on my website using Flask/MongoDB as the backend. A basic request (getting one user, for example) takes about 4 seconds to respond.
Here is the Python code:
@users_apis.route('/profile/<string:user_id>', methods=['GET', 'PUT', 'DELETE'])
@auth_token_required
def profile(user_id):
    if request.method == "GET":
        avatar = ''
        if user_id == str(current_user.id):
            if current_user.birthday:
                age = (date.today().year - current_user.birthday.year)
            else:
                age = ''
            return make_response(jsonify({
                "id" : str(current_user.id),
                "username" : current_user.username,
                "email" : current_user.email,
                "first_name" : current_user.first_name,
                "last_name" : current_user.last_name,
                "age" : age,
                "birthday" : current_user.birthday,
                "gender" : current_user.gender,
                "city" : current_user.city,
                "country" : current_user.country,
                "languages" : current_user.languages,
                "description" : current_user.description,
                "phone_number" : current_user.phone_number,
                "countries_visited" : current_user.countries_visited,
                "countries_to_visit" : current_user.countries_to_visit,
                "zip_code" : str(current_user.zip_code),
                "address" : current_user.address,
                "pictures" : current_user.pictures,
                "avatar" : "",
                "interests" : current_user.interests,
                "messages" : current_user.messages,
                "invitations" : current_user.invitations,
                "events" : current_user.events
            }), 200)
And my MongoDB models are built like this (the selected user is nearly empty: no friends, no events, no pictures...):
class BaseUser(db.Document, UserMixin):
    username = db.StringField(max_length=64, unique=True, required=True)
    email = db.EmailField(unique=True, required=True)
    password = db.StringField(max_length=255, required=True)
    active = db.BooleanField(default=True)
    joined_on = db.DateTimeField(default=datetime.now())
    roles = db.ListField(db.ReferenceField(Role), default=[])

class User(BaseUser):
    # Identity
    first_name = db.StringField(max_length=255)
    last_name = db.StringField(max_length=255)
    birthday = db.DateTimeField()
    gender = db.StringField(max_length=1, choices=GENDER, default='N')
    # Coordinates
    address = db.StringField(max_length=255)
    zip_code = db.IntField()
    city = db.StringField(max_length=64)
    region = db.StringField(max_length=64)
    country = db.StringField(max_length=32)
    phone_number = db.StringField(max_length=18)
    # Community
    description = db.StringField(max_length=1000)
    activities = db.StringField(max_length=1000)
    languages = db.ListField(db.StringField(max_length=32))
    countries_visited = db.ListField(db.StringField(max_length=32))
    countries_to_visit = db.ListField(db.StringField(max_length=32))
    interests = db.ListField(db.ReferenceField('Tags'))
    friends = db.ListField(db.ReferenceField('User'))
    friend_requests = db.ListField(db.ReferenceField('User'))
    pictures = db.ListField(db.ReferenceField('Picture'))
    events = db.ListField(db.ReferenceField('Event'))
    messages = db.ListField(db.ReferenceField('PrivateMessage'))
    invitations = db.ListField(db.ReferenceField('Invitation'))
    email_validated = db.BooleanField(default=False)
    validation_date = db.DateTimeField()
I have a Debian server with 6 GB of RAM and 1 vcore at 2.4 GHz.
When I check the MongoDB log, I don't see any request that takes more than 378 ms (for a search request).
If I use top during a request on my server, I see Python at 97% CPU for about one second.
When I check the Python server output, I see 4 seconds between the OPTIONS request and the GET request.
I finally managed to "fix" my issue.
It seems the whole problem was due to @auth_token_required.
Each request made by the front end to the back end with headers.append('Authentication-Token', currentUser.token); created a huge delay.
I replaced @auth_token_required with @login_required.
I'm now using cookies.
Hope it helps someone.
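To confirm which layer is eating the time in a case like this, wrapping the suspect callable with a timer is often enough. Below is a generic stdlib sketch; slow_token_check is a hypothetical stand-in for the expensive per-request authentication step, not Flask-Security's real code:

```python
import time
from functools import wraps

def timed(func):
    """Record how long each call to func takes, in wrapper.last_elapsed."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        wrapper.last_elapsed = time.perf_counter() - start
        return result
    wrapper.last_elapsed = None
    return wrapper

@timed
def slow_token_check():
    # hypothetical stand-in for an expensive token verification
    time.sleep(0.05)
    return True

slow_token_check()
print("auth took %.3fs" % slow_token_check.last_elapsed)
```

Applying the same wrapper to the real auth decorator's inner function (or logging timestamps before and after it) would show whether the 4 seconds are spent there or in the view itself.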

How to get the document with max value for a field with map-reduce in pymongo?

How do I find the document with the maximum uid field using map-reduce in pymongo?
I have tried the following, but it prints out blanks:
from pymongo import Connection
from bson.code import Code

db = Connection().map_reduce_example
db.things.insert({
    "_id" : "50f5fe8916174763f6217994",
    "deu" : "Wie Sie sicher aus der Presse und dem Fernsehen wissen, gab es in Sri Lanka mehrere Bombenexplosionen mit zahlreichen Toten.\n",
    "uid" : 13,
    "eng" : "You will be aware from the press and television that there have been a number of bomb explosions and killings in Sri Lanka."
})
db.things.insert({
    "_id" : "50f5fe8916234y0w3fvhv234",
    "deu" : "Ich bin schwanger foo bar.\n",
    "uid" : 14,
    "eng" : "I am pregnant foobar."
})
db.things.insert({
    "_id" : "50f5fe8916234y0w3fvhv234",
    "deu" : "barbar schwarz sheep, haben sie any foo\n",
    "uid" : 14,
    "eng" : "barbar black sheep, have you any foo"
})
m = Code("function () {emit(this.uid,{'uid':this.uid,'eng':this.eng})}")
r = Code("function (key, values) {var total = 0;for (var i = 0; i < values.length; i++) {total += values[i];}return total;}")
result = db.things.inline_map_reduce(m, r)
for r in result:
    print
An example document looks like this:
{
    "_id" : ObjectId("50f5fe8916174763f6217994"),
    "deu" : "Wie Sie sicher aus der Presse und dem Fernsehen wissen, gab es mehrere Bombenexplosionen mit zahlreichen Toten.\n",
    "uid" : 13,
    "eng" : "You will be aware from the press and television that there have been a number of bomb explosions and killings."
}
You can use find_one to find the doc with the maximum uid by sorting on that field descending:
db.things.find_one(sort=[("uid", -1)])
or using the defined constant:
db.things.find_one(sort=[("uid", pymongo.DESCENDING)])
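As a sanity check of what the sorted find_one returns, here is the same "document with the largest uid" selection done on a plain list of dicts (an in-memory sketch using the sample documents above; no MongoDB required):

```python
docs = [
    {"uid": 13, "eng": "You will be aware from the press and television ..."},
    {"uid": 14, "eng": "I am pregnant foobar."},
    {"uid": 14, "eng": "barbar black sheep, have you any foo"},
]

# max() with a key function mirrors find_one(sort=[("uid", -1)]):
# one document with the largest uid comes back (ties broken arbitrarily).
top = max(docs, key=lambda d: d["uid"])
print(top["uid"])  # 14
```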