Can't insert into MongoDB due to AutoReconnect

Background:
I've got a Python script using pymongo that pulls some XML data and parses it into a list of dictionaries called 'all_orders'. I then try to insert that list into the collection "orders", and I invariably get this exception. I am reasonably certain that my list of dictionaries is correct, because when the list is small it tends to work (I think). I've also found that 8 of the ~1300 documents I tried to insert made it into the collection.
Question:
Do you know what causes this AutoReconnect(str(e)) exception? Do you know how to work around or avoid this issue?
Error Trace:
File "mongovol.py", line 152, in get_reports
orders.insert(all_orders)
File "/Users/ashutosh/hrksandbox/lumoback-garden2/venv/lib/python2.7/site-packages/pymongo/collection.py", line 359, in insert
continue_on_error, self.__uuid_subtype), safe)
File "/Users/ashutosh/hrksandbox/lumoback-garden2/venv/lib/python2.7/site-packages/pymongo/mongo_client.py", line 853, in _send_message
raise AutoReconnect(str(e))
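
A common workaround, assuming the failures are transient connection drops rather than bad documents (this is a sketch, not a confirmed root-cause fix), is to insert in smaller batches and retry each batch on AutoReconnect:

import time
from pymongo.errors import AutoReconnect

def insert_with_retry(collection, docs, batch_size=100, retries=5):
    # Insert in small batches; retry a batch when the driver loses
    # the connection and raises AutoReconnect.
    for i in range(0, len(docs), batch_size):
        batch = docs[i:i + batch_size]
        for attempt in range(retries):
            try:
                collection.insert(batch, continue_on_error=True)
                break
            except AutoReconnect:
                time.sleep(2 ** attempt)  # back off before retrying
        else:
            raise AutoReconnect('gave up after %d retries' % retries)

insert_with_retry(orders, all_orders)

Note that continue_on_error=True means a retried batch may re-insert documents that already made it, so deduplicate afterwards or use unique _id values.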

Related

MySQL: How do I remove/skip the first row (before the headers) from my stored procedure result?

I am calling a stored procedure which results in the following output
CALL `resale`.`reportProfitAndLossSummary`(3,' ',599025,TRUE);
OUTPUT:
"CONCAT('"',
CONCAT_WS('","',
"Promoter",
"Event",
"Event Description",
"Zone",
"Tickets Unsold",
"
""Promoter","Event","Event Description","Zone","Tickets Unsold","Avg. Unsold Price","Tickets Sold","Avg. Sold Price","Avg. Cost","Profit","Revenue""
""Qcue","10/2/2022 1:15 PM Pirates # Cardinals","Pirates # Cardinals","1/3B Field Box",0,,16,149.761250,42.000000,1724.18,2396.18"
I exported the result to .csv and discovered that a code chunk is created above the header, which distorts the structure of the file. Is there a way to skip this code chunk? I tried "-N" and "-ss" since the code chunk appears as the header, but neither worked in MySQL Workbench. Turning the header option to "FALSE" in the stored procedure call removes the actual headers, not the undesired code.
The stored procedure was developed by someone else, so I am not sure where to begin fixing this. The goal is to remove the undesired code from the query result itself, not from the .csv export.

How to fix data length error in postgresql and django

The project is in Django and works fine with SQLite, but after migrating to PostgreSQL, trying to register a user shows this error:
File "C:\Users\liz\developer\env\lib\site-packages\django\db\backends\utils.py", line 85, in _execute
return self.cursor.execute(sql, params)
django.db.utils.DataError: value too long for type character varying(30)
I already changed the slug fields to 255 but the error is still there.
Most of the answers say that you must change the slug fields to 255. I searched in VS Code for max_length=30 and found a couple of fields with that length; I changed all of them to 255 and it worked. If you ever face this, change all such fields in all models to 255 to get it working, then adjust the lengths as you need.
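
For illustration, a minimal sketch of the kind of change involved (the model and field names here are hypothetical):

from django.db import models

class Profile(models.Model):  # hypothetical model
    # Was max_length=30. SQLite ignores the declared length, but
    # PostgreSQL enforces varchar(30) strictly, hence the DataError.
    slug = models.SlugField(max_length=255)

# After editing the models, regenerate and apply migrations so the
# column type actually changes in PostgreSQL:
#   python manage.py makemigrations
#   python manage.py migrate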

MongoDB find operation throws OperationFailure: Cannot update value

I have an application that uses MongoDB (on AWS DocumentDB) to store documents with a large string in one of their fields, which we call field X.
A few notes to start:
I'm using pymongo, so the method names you see here are taken from there.
Because of the nature of field X, it is not indexed.
We query field X with MongoDB's find method, using a regex condition and limiting the query with both maxTimeMS and limit to a small number of results.
When we get the results, we iterate the cursor to fetch all of them into a list (an inline loop).
Most of the time the query works properly, but I'm starting to get more and more of the following error:
pymongo.errors.OperationFailure: Cannot update value (error code 14)
This is thrown after the query returns a cursor, while we are iterating the results. It occurs when the cursor tries to _refresh its connection by calling the next method, and it is raised by _check_command_response at its last line, which suggests it is a default (catch-all) exception(?).
The query:
cursor = collection.find(condition).max_time_ms(MAX_QUERY_TIME_MS) \
                   .sort(sort_order).limit(RESULT_LIMIT)
results = [document for document in cursor]  # <--- here we get the error
Stack trace:
pymongo/helpers.py in _check_command_response at line 155
pymongo/cursor.py in __send_message at line 982
pymongo/cursor.py in _refresh at line 1104
pymongo/cursor.py in next at line 1189
common/my_code.py in <listcomp> at line xxx
I'm trying to understand the origin of the exception to handle it correctly or use a different approach for handling the cursor.
What is being updated in the cursor's _refresh method that might throw the above exception?
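
In the meantime, the workaround I'm considering (only a sketch; it re-runs the whole query rather than resuming the cursor, since a failed cursor cannot be resumed) looks like this:

from pymongo.errors import OperationFailure

def fetch_with_retry(collection, condition, sort_order, retries=3):
    # Re-run the query from scratch if the cursor fails while being
    # refreshed; MAX_QUERY_TIME_MS and RESULT_LIMIT are as above.
    for attempt in range(retries):
        cursor = collection.find(condition) \
                           .max_time_ms(MAX_QUERY_TIME_MS) \
                           .sort(sort_order) \
                           .limit(RESULT_LIMIT)
        try:
            return [document for document in cursor]
        except OperationFailure:
            if attempt == retries - 1:
                raise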
Thanks in advance.

tFileList into tFileInputXML is not working as expected

I have a tFileList iterating over a set of XML files.
And for testing I have 1 XML file in the folder:
The XML schema has a field called "operationresultdesc":
In my tJavaFlex, I am outputting the value of this field, and it returns null (apparently I am outputting "null").
This "null" check should fail, since I can see values in the field in the XML.
The weird thing is that if I add more XML files to the folder, the null failure "passes" to another file, and the one that failed before is apparently not null any more. The file that previously failed with a null message now shows no null issue once there is more than one XML file in the folder (there never was a null issue in it, nor is there one in the file now being flagged as null).
For reference, the tJavaFlex code is:
UPDATE: This is based off a repository XML schema driven from the XSD, and I do get the data flowing. The issue is that I get a null when the file is on its own, but the same file works when I add more than one XML file to the folder, and the null failure passes to something else.
It feels like I am not understanding how tFileList iterates and how tJavaFlex works in this case, perhaps... but it is very strange.

convert mongodb query (unicode) to json using json_util

from bson.json_util import dumps

def json_response(response):
    return {"response": dumps(response, ensure_ascii=False).encode("utf8"),
            "headers": {"Content-type": "text/json"}}
This problem is making me crazy. It returns an error randomly, and I can't find the solution.
/core/handlers/wsgi.py", line 38, in __call__
output = lookup_view(req)
File "auth/decorator.py", line 8, in wrap
return fn(req,*args,**kwargs)
File "auth/decorator.py", line 21, in wrap
return fn(req,*args,**kwargs)
File "contrib/admin/views.py", line 67, in submit_base_premission
return json_response({"baseperm":baseperm,"Meta":{"gmsg":u"...","type":201}})
File "render/render_response.py", line 85, in json_response
return {"response":dumps(response,ensure_ascii=False).encode("utf8"),
File "/usr/local/lib/python2.7/dist-packages/bson/json_util.py", line 116, in dumps
return json.dumps(_json_convert(obj), *args, **kwargs)
File "/usr/lib/python2.7/json/__init__.py", line 238, in dumps
**kw).encode(obj)
File "/usr/lib/python2.7/json/encoder.py", line 201, in encode
chunks = self.iterencode(o, _one_shot=True)
File "/usr/lib/python2.7/json/encoder.py", line 264, in iterencode
return _iterencode(o, 0)
File "/usr/lib/python2.7/json/encoder.py", line 178, in default
raise TypeError(repr(o) + " is not JSON serializable")
TypeError: ObjectId('51f7dcee95113b7a48e974fe') is not JSON serializable
baseperm is a pymongo Cursor; the error occurs randomly, and that is where I have the problem.
It seems that json_util sometimes fails to detect the ObjectId and doesn't convert it to a string, so json raises an error on dumps.
Check the version of the pymongo driver; if it is below 2.4.2, you may need to update it. Before that version, the __str__ method of ObjectId was handled incorrectly on Python 2.x; check the repo: github, ObjectId.__str__ should return str in 2.x.
To check the pymongo driver version, type in the python shell:
import pymongo
print(pymongo.version)
UPDATE
I suppose you have tested both environments with the same dataset, so try upgrading Python 2.7.3 to 2.7.5.
Otherwise, try iterating through the cursor and constructing the list before passing it to json_response(), i.e.:
baseperm = list(baseperm)  # now baseperm is a list of the documents
...
my_response['baseperm'] = baseperm
my_response['Meta'] = ...
...
return json_response(my_response)
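
Alternatively, a defensive fallback is to serialize with plain json.dumps and an explicit default handler (a sketch only; it matches on the type name rather than isinstance, since the failure above looks like an isinstance mismatch, and unlike json_util it will not handle datetimes or other BSON types):

import json

def mongo_default(obj):
    # Match on the type name instead of isinstance(), in case the
    # ObjectId class was imported twice under mod_wsgi sub interpreters.
    if type(obj).__name__ == 'ObjectId':
        return str(obj)
    raise TypeError(repr(obj) + " is not JSON serializable")

def json_response(response):
    # response must already be plain dicts/lists (e.g. list(baseperm)).
    return {"response": json.dumps(response, ensure_ascii=False,
                                   default=mongo_default).encode("utf8"),
            "headers": {"Content-type": "text/json"}}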
I reported this problem on the MongoDB issue tracker:
https://jira.mongodb.org/browse/PYTHON-548
Answer:
You said this only happens occasionally? The only thing I can think of that might be related is mod_wsgi spawning sub interpreters. In PyMongo that tends to cause problems with the C extensions encoding Python dicts to BSON. In your case this seems to be happening after the BSON documents are decoded to Python dicts. It looks like isinstance is failing to match ObjectId in json_util.default(). PYTHON-539 seemed to be a similar problem, related to a package misconfiguration in the user's environment.
There could be a fairly large performance hit, but could you try running PyMongo without C extensions to see if that solves the problem?
You can read about the mod_wsgi issue here:
http://api.mongodb.org/python/current/faq.html#does-pymongo-work-with-mod-wsgi
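
To see whether the C extensions are active in the first place, has_c() exists in both bson and pymongo:

import bson
import pymongo

# True means the C extensions were built and are in use.
print(bson.has_c(), pymongo.has_c())

# Reinstalling without the C extensions (per the PyMongo docs of
# that era) is done with:
#   python setup.py --no_ext install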