How can I migrate old Odoo binary fields to newer Odoo versions with attachment=True? - postgresql

I have an old Odoo version (v6) and I am migrating it to Odoo 10. The issue I am facing is binary field data migration: Odoo 10 binary fields have the attribute "attachment=True", which did not exist in older versions.
So can the Stack community give me an idea of how to achieve this task and how to migrate that Postgres table to Odoo 10 compatible data? Thanks in advance.

Just migrate the data as is and let it exist in the database. I had to write a module for the same requirement, because a customer had binary data stored in the database instead of using attachments.
The following code works. It's not officially in my company's apps in Odoo's App Store, but it will eventually find its way into it ;-)
from odoo import api, models, exceptions
from odoo.osv import expression


class IrAttachment(models.Model):
    """ Attachment Extensions"""

    _inherit = 'ir.attachment'

    @api.model
    def _relocate_binary_data(
            self, model=None, fields=None, domain=None, limit=0):
        """ Relocates binary data into attachments. This method
        has no functionality to reverse the process.
        Use this to change binary fields to attachment usage,
        which is done by using the parameter attachment=True

        :param model: Model Name (required)
        :param fields: List of binary field names (required)
        :param domain: optional search domain to filter treated records
            (default: []==no filter)
        :param limit: optional filter limit (default: 0==unlimited)"""
        if not model or not fields:
            raise exceptions.Warning(
                "model and fields are required parameters")
        # only touch records with binary data in one of the provided fields
        default_domain = [[(f, '!=', False)] for f in fields]
        default_domain = expression.OR(default_domain)
        domain = expression.AND([domain or [], default_domain])
        records = self.env[model].with_context(active_test=False).search(
            domain, limit=limit)
        # relocate the binary data to attachments
        for record in records:
            for field in fields:
                # search for existing attachments (for re-runs)
                attachment = self.env['ir.attachment'].sudo().search([
                    ('res_model', '=', record._name),
                    ('res_field', '=', field),
                    ('res_id', '=', record.id),
                ])
                # write the binary value to an existing attachment or create one
                if attachment:
                    attachment.write({'datas': getattr(record, field)})
                else:
                    self.env['ir.attachment'].create({
                        'name': record.name,
                        'res_model': record._name,
                        'res_field': field,
                        'res_id': record.id,
                        'type': 'binary',
                        'datas': getattr(record, field),
                    })
        # empty the database binary data
        records.write({f: None for f in fields})
You have to write an ir.cron or an ir.actions.server record to call this method.
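Alternatively, as a quick one-off, the method can be called from an Odoo shell. A minimal sketch, where 'res.partner' and 'image' are only placeholders for your own model and binary field names:

# Run once from an Odoo 10 shell (./odoo-bin shell -d <database>).
# 'res.partner' and 'image' are placeholders -- substitute your own
# model and binary field(s).
env['ir.attachment']._relocate_binary_data(
    model='res.partner',
    fields=['image'],
    domain=[],
    limit=0,
)
env.cr.commit()   # the shell does not commit automatically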

If you look at the read function of the Binary field class (<path_to_v12>/odoo/fields.py, lines 1786-1800, quoted below), you'll notice that it searches ir.attachment for records having the right model, field and id.
def read(self, records):
    # values are stored in attachments, retrieve them
    assert self.attachment
    domain = [
        ('res_model', '=', records._name),
        ('res_field', '=', self.name),
        ('res_id', 'in', records.ids),
    ]
    # Note: the 'bin_size' flag is handled by the field 'datas' itself
    data = {att.res_id: att.datas
            for att in records.env['ir.attachment'].sudo().search(domain)}
    cache = records.env.cache
    for record in records:
        cache.set(record, self, data.get(record.id, False))
So, my educated guess is that you can update your ir_attachment records by adding res_model (note that this is a string!), res_field (also a string) and res_id (the integer saved in the id field of the referring record).
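A minimal sketch of that idea, run from an Odoo shell and assuming the legacy binaries already exist as ir.attachment rows whose owning records you can identify; the model name, field name, attachment_id and partner_id below are illustrative only:

# Illustrative only: 'res.partner', 'image', attachment_id and partner_id
# are placeholders for values you determine from your migrated data.
attachment = env['ir.attachment'].browse(attachment_id)
attachment.write({
    'res_model': 'res.partner',  # model of the owning record (string)
    'res_field': 'image',        # name of the binary field (string)
    'res_id': partner_id,        # integer id of the owning record
})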

Best is to use XML-RPC to read the data from the source (SRC) and write it to the destination (DEST). That takes care of your issue: you read the data from the old binary field, and writing it on the new version creates the attachment and stores it in the filesystem.
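A rough sketch of that approach. The URLs, database names, credentials and 'res.partner'/'image' are placeholders, and it assumes record ids match on both sides; Odoo 6 exposes /xmlrpc/common and /xmlrpc/object with execute(), while Odoo 10 exposes the /xmlrpc/2 endpoints with execute_kw():

# All URLs, databases, credentials, model and field names are placeholders.
import xmlrpc.client

SRC, SRC_DB, SRC_PWD = 'http://old-server:8069', 'src_db', 'admin'
DEST, DEST_DB, DEST_PWD = 'http://new-server:8069', 'dest_db', 'admin'

src_uid = xmlrpc.client.ServerProxy(SRC + '/xmlrpc/common').login(
    SRC_DB, 'admin', SRC_PWD)
src = xmlrpc.client.ServerProxy(SRC + '/xmlrpc/object')

dest_uid = xmlrpc.client.ServerProxy(DEST + '/xmlrpc/2/common').authenticate(
    DEST_DB, 'admin', DEST_PWD, {})
dest = xmlrpc.client.ServerProxy(DEST + '/xmlrpc/2/object')

# assumes the records were migrated with the same ids on both sides
ids = src.execute(SRC_DB, src_uid, SRC_PWD, 'res.partner', 'search', [])
for rec in src.execute(SRC_DB, src_uid, SRC_PWD, 'res.partner',
                       'read', ids, ['image']):
    if rec.get('image'):
        # writing the binary field on Odoo 10 stores it as an attachment
        # when the field is declared with attachment=True there
        dest.execute_kw(DEST_DB, dest_uid, DEST_PWD, 'res.partner', 'write',
                        [[rec['id']], {'image': rec['image']}])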

Related

Is there a way in play-json to read a key/value map, remembering the original ordering?

Given:
{
  ...
  "fruits": {
    "apple": { ... },
    "banana": { ... },
    "cherry": { ... },
    ...
    "watermelon": { ... }
  }
}
Is there a way in Scala to read this JSON map fruits of String -> object while remembering that the ordering of the keys was originally apple, banana, cherry ... watermelon?
[below added because I was asked why I wanted this and to provide a test example]
Normally I wouldn't care about the ordering (of a Map). I do not control the format of the input; it is a Map, not an Array. The real input is not fruits, it is alerts with alphanumeric keys; I just picked fruit names for simplicity. I am building test files based on the data. Suppose there were ten items in the first file, and I deleted "watermelon" (the 10th object) from the 2nd file. The code that read the first file put the objects into a database. When it processes the objects (alerts), each produces an action. A test result is an EventAction(id:Long, action:String). The id is an auto-increment Long from the database; I do not control that. After processing the first file, it turns out that the alert associated with "watermelon" was created with an id of 2, not 10. When I'm building my test for the processing of the second file (the one without "watermelon"), if I think the id will be 10, the test will fail not because I predicted the action incorrectly, but because I didn't know the id would be 2 instead of 10.
One way of dealing with a Map ("you shouldn't care about ordering, so you won't get any clues as to the original ordering in the JSON file") is that I can build ad-hoc SQL to find out which database id was created for each key, just for the tests. Before I write ad-hoc SQL (the company normally asks that all DB interactions go through stored procedures written by a DBA), I thought, "Wouldn't it be neat, at the time of reading the JSON, to remember the ordering in the moment, before it is lost."
Play-json (at least 2.6.9) parses a JS object from top to bottom, collecting all fields into content: ListBuffer[(String, JsValue)]. When all the fields in the object have been parsed, the JsObject is instantiated via the apply method:
def apply(fields: Seq[(String, JsValue)]): JsObject = new JsObject(mutable.LinkedHashMap(fields: _*))
So the answer to your question: if you use the default Reads for Map from the library, the ordering is already kept.
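A quick illustrative check (play-json 2.6.x, toy keys standing in for the real alert ids); the underlying mutable.LinkedHashMap keeps the keys in the order they appeared in the file:

import play.api.libs.json._

val js = Json.parse("""{"fruits": {"apple": 1, "banana": 2, "cherry": 3}}""")
val fruits = (js \ "fruits").as[JsObject]
println(fruits.fields.map(_._1).mkString(", "))  // prints: apple, banana, cherry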

Get SOST database ID of sent emails

I have an ABAP program that sends emails. A sent email is stored in the SOOD table. After sending an email I would like to get some ID of the email so that I can check its status later (in the SOST table). I have seen several functions/methods to send emails (e.g. cl_bcs->send, SO_NEW_DOCUMENT_SEND_API1), but none of them returns any ID. Is there a reliable way to get it?
Function module SO_NEW_DOCUMENT_SEND_API1 creates and exports a NEW_OBJECT_ID for every new message sent.
This NEW_OBJECT_ID is stored in the BCST_SR table in the SCOM_KEY field. From the BCST_SR table you have to get DOC_OID; using DOC_OID you can get the details from the SOOD table (the reference field in SOOD is IF_DOC_BCS). Then use the object number OBJNO to get the details from the SOST table.
You can also use transaction SBWP to check your mail status.
For class CL_BCS, you can check the send_request object's method doc_wrapper_id. This will return the SOOD structure.
The two other answers together gave me valuable clues to get it done (+1), but both lacked some accuracy and code snippets, so I am summing it all up in my answer.
using cl_bcs
DATA gr_send_request TYPE REF TO cl_bcs.
DATA emailid         LIKE soodk.

gr_send_request = cl_bcs=>create_persistent( ).
" ...
CALL METHOD gr_send_request->send(
  EXPORTING i_with_error_screen = 'X'
  RECEIVING result              = gv_sent_to_all ).
IF gv_sent_to_all = 'X'.
  emailid = gr_send_request->send_request->doc_wrapper_id( ).
ENDIF.
SOODK (not SOOD) is a structure containing three components (OBJTP, OBJYR, OBJNO), which together form the key of the SOOD table.
using SO_NEW_DOCUMENT_SEND_API1
DATA lt_objectid TYPE sofolenti1-object_id.

CALL FUNCTION 'SO_NEW_DOCUMENT_SEND_API1'
  EXPORTING
    document_data = lt_mailsubject
    document_type = 'HTM'
  IMPORTING
    new_object_id = lt_objectid
  " ...
lt_objectid (SOFOLENTI1-OBJECT_ID) is a char(17) field that contains the concatenated SOODK structure OBJTP+OBJYR+OBJNO. When divided into its parts, it can be used to look up a record in the SOOD table, as in the sketch below. (I didn't find it in BCST_SR-SCOM_KEY, but that was not necessary.)
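A small illustrative sketch of that split: SOODK is a flat character-only structure, so a plain assignment distributes the 17 characters over OBJTP (3), OBJYR (2) and OBJNO (12); the variable names are just examples:

DATA: ls_soodk TYPE soodk,
      ls_sood  TYPE sood.

ls_soodk = lt_objectid.   " OBJTP(3) + OBJYR(2) + OBJNO(12) = 17 characters

SELECT SINGLE * FROM sood INTO ls_sood
  WHERE objtp = ls_soodk-objtp
    AND objyr = ls_soodk-objyr
    AND objno = ls_soodk-objno.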

How to keep original field name in Request Facade ($request) data when submit form in Laravel

I am working with an existing SQL Server database in which most table field names contain spaces. Example: Family Name
In my update view file, I have created the field names with spaces as well (Family Name) to match the table field names.
When I try to update the model I run into a problem with the $request->all() method, because the Request facade converts the input field name from Family Name to Family_Name, so I have to manually write the update code for every field:
$trainee = Trainee::findOrFail($id);
$trainee->{'Family Name'} = $request->input('Family_Name');
$trainee->{'Sex'} = $request->input('Sex');
....... = ........
$trainee->save();
instead of
$trainee = Trainee::findOrFail($id);
$input = $request->all();
$trainee->fill($input)->save();
IT IS A HUGE PAIN, as I need to write many lines of code repeatedly instead of using $input = $request->all();
Question:
How to prevent Request Facade from converting the input field name from Family Name to Family_Name?
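(Not from the thread, just a hypothetical workaround sketch.) PHP itself rewrites spaces and dots in incoming form-field names to underscores before Laravel sees them, so one option is to map the underscored keys back to the spaced column names before mass assignment. The column list and model name below are taken from the question; everything else is assumed:

// Hypothetical workaround: rebuild the spaced keys from the underscored ones.
$columns = ['Family Name', 'Sex' /* , ...other columns... */];

$input = [];
foreach ($columns as $column) {
    // PHP renames "Family Name" to "Family_Name" in the incoming request
    $key = str_replace(' ', '_', $column);
    if ($request->has($key)) {
        $input[$column] = $request->input($key);
    }
}

$trainee = Trainee::findOrFail($id);
$trainee->fill($input)->save();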

mirth connect Database Reader automatic column mapping

Please, could somebody confirm the following?
I am using Mirth Connect 3.5.08232.
My Source Connector is a Database Reader.
Say I am using a query that returns multiple rows and return the result (via JavaScript) as the documentation suggests, so that Mirth treats each row as a separate message. I also use a couple of Mapper steps as source transformers and save the mapped fields in my channel map (which ends up containing only those fields that I define in the transformers).
In the destination, and specifically in the destination response transformer (or the destination body, if it is a JavaScript Writer), how do I access the source fields?
The only way I found by trial and error is:
var rawMsg = connectorMessage.getRawData();
var xmlMsg = new XML(rawMsg);
logger.info(xmlMsg.some_field); // ignore the root element of rawMsg
Is this the right way to do this? I thought that maybe the fields that were nicely automatically detected would be put in some kind of a map, like sourceMap - but that doesn't seem to be the case, right?
Thank you
If you are using Mapper steps in your transformer to extract the data and put it into a variable map (like the channel map), then you can use any of the following methods to retrieve it from a subsequent JavaScript context (including a JavaScript Writer, and your response transformer):
var value = channelMap.get('key');
var value = $c('key');
var value = $('key');
Look at the Variable Maps section of the User Guide for more information.
So to recap, say you're selecting a column "mycolumn" with a Database Reader. The XML sent to the channel will be something like this:
<result>
    <mycolumn>value</mycolumn>
</result>
Then you can choose to extract pieces of that message into specific variables for later use. The transformer allows you to easily drag-and-drop pieces of the sample inbound message.
Finally, in your JavaScript Writer (or in any subsequent filter, transformer, or response transformer), just drag the value from the variable list into the field you want, and the corresponding JavaScript code will be inserted automatically.
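For the mycolumn example above, the generated code would look something like this (illustrative):

var value = $('mycolumn');   // shorthand that looks the key up across the variable maps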
One last note, if you are selecting a lot of variables and don't want to make Mapper steps for each one individually, you can use a JavaScript Step to iterate through the message and extract each column into a separate map variable:
for each (child in msg.children()) {
    channelMap.put(child.localName(), child.toString());
}
Or, you can just reference the columns directly from within the JavaScript Writer:
var msg = new XML(connectorMessage.getEncodedData());
var column1 = msg.column1.toString();
var column2 = msg.column2.toString();
...

Though I have a record in the database, it's giving "Mongoid::Errors::DocumentNotFound"

Though I have the record with id 13163 (db.locations.find({_id: 13163})), it's giving me the error:
Mongoid::Errors::DocumentNotFound in LocationsController#show
Problem: Document(s) not found for class Location with id(s) 13163.
Summary: When calling Location.find with an id or array of ids, each
parameter must match a document in the database or this error will be
raised. The search was for the id(s): 13163 ... (1 total) and the
following ids were not found: 13163. Resolution: Search for an id that
is in the database or set the Mongoid.raise_not_found_error
configuration option to false, which will cause a nil to be returned
instead of raising this error when searching for a single id, or only
the matched documents when searching for multiples.
# Use callbacks to share common setup or constraints between actions.
def set_location
  @location = Location.find(params[:id])
end
locations_controller.rb:
class LocationsController < ApplicationController
  before_action :set_location, only: [:show, :edit, :update, :destroy]

  # GET /locations
  # GET /locations.json
  def index
    @locations = Location.all
  end

  # GET /locations/1
  # GET /locations/1.json
  def show
  end

  private

  # Use callbacks to share common setup or constraints between actions.
  def set_location
    @location = Location.find(params[:id])
  end

  # Never trust parameters from the scary internet, only allow the white list through.
  def location_params
    params.require(:location).permit(:loc_name_en, :loc_name_jp, :channel)
  end
end
Setting the option raise_not_found_error: false is not the solution, as I do have the document in the database.
SOLUTION:
Big thanks to @mu is too short for giving me a hint.
The problem can be solved in 2 ways:
Declare field :_id, type: Integer in the model location.rb (see the sketch below), or
Convert the passed parameter to an Integer, like Location.find(params[:id].to_i), in locations_controller.rb, as shown in @mu is too short's answer below.
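A minimal sketch of the first option; the field names come from the permitted params, while the field types and everything else about the model are assumptions:

# app/models/location.rb -- illustrative; the field :_id declaration is the fix
class Location
  include Mongoid::Document

  field :_id, type: Integer        # match the numeric ids already stored in MongoDB
  field :loc_name_en, type: String
  field :loc_name_jp, type: String
  field :channel, type: String
end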
I'd guess that you have a type problem. You say that this:
db.locations.find({_id: 13163})
finds the document in the MongoDB shell. That means that you have a document in the locations collection whose _id is the number 13163. If you used the string '13163':
db.locations.find({_id: '13163'})
you won't find your document. The value in params[:id] is probably a string so you're saying:
Location.find('13163')
when you want to say:
Location.find(13163)
If the _id really is a number then you'll need to make sure you call find with a number:
Location.find(params[:id].to_i)
You're probably being confused because sometimes Mongoid will convert between Strings and Moped::BSON::ObjectIds (and sometimes it won't) so if your _id is the usual ObjectId you can say:
Model.find('5016cd8b30f1b95cb300004d')
and Mongoid will convert that string to an ObjectId for you. Mongoid won't convert a String to a number for you, you have to do that yourself.