How To Send a PDF File to a Progress AppServer?

I have a PDF file on the client and I want to send it to the AppServer. How can I do that?

define temp-table ttFileList no-undo
    field file-id      as integer
    field file-content as blob.

create ttFileList.
assign ttFileList.file-id = 1.
copy-lob from file("pdffilename") to ttFileList.file-content.

run DoSomethingWithAPDF on hAppServer
    ( input table ttFileList ).

This depends on the version of Progress you are using. If you are on v9, you will need to stream the file in small chunks of raw data. OpenEdge (10.1B, if I remember correctly) added CLOB and BLOB support, so you can create a procedure that takes a temp-table with a BLOB field as an argument.
It also depends on your calling language: for .NET and Java the BLOB field gets translated into a byte array.
For your AppServer, create a procedure similar to the following:
def temp-table ObjectTransfer no-undo
    field Code        as char
    field Number      as int
    field DataContent as blob
    field MimeType    as char.

procedure AddObjectData:
    def input param table for ObjectTransfer.

    def var k as int no-undo.

    for each ObjectTransfer:
        find last ObjectTable no-lock
            where ObjectTable.Code = ObjectTransfer.Code
            no-error.
        if avail ObjectTable then
            k = ObjectTable.Number + 1.
        else
            k = 1.
        create ObjectTable.
        assign
            ObjectTable.Code        = ObjectTransfer.Code
            ObjectTable.Number      = k
            ObjectTable.MimeType    = ObjectTransfer.MimeType
            ObjectTable.DataContent = ObjectTransfer.DataContent
            .
    end.
end procedure.
Generate proxies, and you can then call this from .NET or Java, passing the BLOB field as a simple byte array inside the input temp-table.
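As a rough illustration of the .NET side, a sketch like the following should be close. The names here are assumptions: ProxyGen generates the actual AppObject class, and appObject stands in for a connected instance of it; the Open Client maps the TABLE parameter to an ADO.NET DataTable with the BLOB field as a byte[] column.

// Hypothetical .NET caller sketch; appObject is a connected instance
// of the ProxyGen-generated AppObject class (name will differ).
using System.Data;
using System.IO;

DataTable tt = new DataTable("ObjectTransfer");
tt.Columns.Add("Code", typeof(string));
tt.Columns.Add("Number", typeof(int));
tt.Columns.Add("DataContent", typeof(byte[]));
tt.Columns.Add("MimeType", typeof(string));

// one row per object; the PDF goes in as raw bytes
tt.Rows.Add("INV", 1, File.ReadAllBytes(@"C:\temp\file.pdf"), "application/pdf");

appObject.AddObjectData(tt);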

Use the RAW data type; you might need to send the file in chunks. Another alternative is CHARACTER + BASE64 encoding.
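A minimal sketch of the BASE64 variant (assuming OpenEdge 10+, since it relies on BASE64-ENCODE; the server procedure name SendPdfAsBase64 is made up):

/* base64 variant: read the file into a memptr, encode it to a longchar
   and ship that to the AppServer */
define variable mPdf  as memptr   no-undo.
define variable lcPdf as longchar no-undo.

copy-lob from file "pdffilename" to mPdf.
lcPdf = base64-encode(mPdf).
set-size(mPdf) = 0.   /* release the memptr */

run SendPdfAsBase64 on hAppServer ( input lcPdf ).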

Related

Modelica (Dymola): get a particular value of a timetable?

I have a model using a TimeTable which represents a variable's evolution. I would like to initialize a subcomponent's parameter with the first value of the table (time = 0 seconds).
The table's values are read from a .txt file. The idea would be to have a command as follows:
parameter Real InitialValue = timeTable.y[2](for timeTable.y[1] = 0)
Is there a command to do so?
In some cases another option is to initialize the parameter at the output value of that table-component when starting the simulation:
model Demo
  Modelica.Blocks.Sources.TimeTable timeTable(table=[0,1; 2,3]);
  parameter Real initialValue(fixed=false);
initial equation
  initialValue = timeTable.y;
end Demo;
This works for all variants in the same way, but only for the initial value. It is triggered by having fixed=false for a parameter and then giving an initial equation for it.
The solution depends on the block you are using and how the data is defined. Note that there is no easy solution for .txt files, so I recommend using .mat files instead.
1. Data from model
If you don't read from a file it is quite easy.
The data is stored as matrix in the parameter table and we can use array indexing to access it:
model Demo
  Modelica.Blocks.Sources.TimeTable timeTable(table=[0,1; 2,3]);
  parameter Real initialValue = timeTable.table[1, 2];
end Demo;
This works for both the Modelica.Blocks.Sources.TimeTable and the CombiTimeTable found in the same package.
2. Data from .mat file
The MSL provides functions to access .mat files. You have to get the table size before you can read the data.
The code below shows how this can be done.
model Demo2
  import Modelica.Utilities.Streams.{readMatrixSize, readRealMatrix};
  parameter String fileName = "C:/tmp/table.mat";
  parameter String tableName = "tab1";
  parameter Real initialValue = (readRealMatrix(fileName=fileName, matrixName=tableName, nrow=matrixSize[1], ncol=matrixSize[2]))[1, 2];
  Modelica.Blocks.Sources.CombiTimeTable combiTimeTable(
    tableOnFile=true,
    tableName=tableName,
    fileName=fileName)
    annotation (Placement(transformation(extent={{-10,-10},{10,10}})));
protected
  final parameter Integer matrixSize[2] = readMatrixSize(fileName, tableName);
end Demo2;
Note that we don't store the whole table in a variable. Instead, we read it and access the element of interest with [1, 2]. This requires putting brackets around the function call.

OpenEdge Progress 4GL Query returns (MISSING) after % sign

DEFINE TEMP-TABLE tt_pay_terms NO-UNDO
    FIELD pt_terms_code  LIKE payment_terms.terms_code
    FIELD pt_description LIKE payment_terms.description.

DEFINE VARIABLE htt AS HANDLE NO-UNDO.
htt = TEMP-TABLE tt_pay_terms:HANDLE.

FOR EACH platte.payment_terms
    WHERE ( active = true
        AND system_id = "000000" )
    NO-LOCK:
    CREATE tt_pay_terms.
    ASSIGN
        pt_terms_code  = payment_terms.terms_code
        pt_description = payment_terms.description.
END.

htt:WRITE-JSON("FILE", "/dev/stdout", FALSE).
I have written this query, and it returns data like this:
[pt_terms_code] => 0.4%!N(MISSING)ET46
[pt_description] => 0.4%! (MISSING)DAYS NET 46
While I believe (from using a SQL query) that the data should be
0.4%45NET46
0.4% 45 DAYS NET 46
I'm making an assumption that the % is probably some special character (as I've run into similar issues in the past). I've tried pulling all the data from the table (i.e., not creating a temp-table populated with only the two fields I want), and I get the same result.
Any suggestions around this issue?
I'm still very new to 4gl, so the above query might be terribly wrong. All comments and criticisms are welcome.
I suspect that if you try this:
FOR EACH platte.payment_terms NO-LOCK
    WHERE ( active = true AND system_id = "000000" ):
    display
        payment_terms.terms_code
        payment_terms.description
        .
END.
You will see what the query actually returns. (WRITE-JSON is adding a layer after the query.) You will likely discover that your data contains something unexpected.
To my eye the "%" looks more like formatting -- the terms are likely 0.4%.
You then seem to have some issues in the contents of the description field. My guess is that there was a code page mismatch when the user entered the data and that there is gibberish in the field as a result.
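If you want to see exactly which characters are hiding in a suspect value, a quick inspection loop like this sketch dumps each character with its character code:

DEFINE VARIABLE i AS INTEGER NO-UNDO.

FOR EACH platte.payment_terms NO-LOCK
    WHERE ( active = true AND system_id = "000000" ):
    DO i = 1 TO LENGTH(payment_terms.description):
        /* QUOTER makes invisible characters easier to spot */
        MESSAGE i
            QUOTER(SUBSTRING(payment_terms.description, i, 1))
            ASC(SUBSTRING(payment_terms.description, i, 1)).
    END.
END.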

Passing empty DataSet to Appserver

I'm trying to create a single proxy query to our appserver, which uses the following parameters:
editTest.p
DEFINE TEMP-TABLE TT_Test NO-UNDO
    BEFORE-TABLE TT_TestBefore
    FIELD fieldOne   LIKE MyDBTable.FieldOne
    FIELD fieldTwo   LIKE MyDBTable.FieldTwo
    FIELD fieldThree LIKE MyDBTable.FieldThree
    .
DEFINE DATASET dsTest FOR TT_Test.

/* Parameters */
DEF INPUT-OUTPUT PARAM DATASET FOR dsTest.
The idea is that the program would call this procedure in two different ways:
with a passed dataset parameter: read the passed dataset and update the DB according to its changes
without a passed dataset parameter (unknown/unset): fill TT_Test and return the dataset to the client for editing
Is there any way to create a proxy like this? The easy solution would be to separate the get and the insert/modify/delete into two proxy files of their own, so the client would always first get the dataset and then pass it to the second one. However, I'd like to implement this functionality in this one file.
The key is to use the datasets, so the changes made to the data can be updated almost automatically.
Instead of using the dataset itself as the parameter, use a dataset handle. You can then make it null for your 2nd condition. Adding on to your example, procedure testProc will display a message "yes" when the dataset is passed in via the handle, and "no" when null is passed in.
DEFINE TEMP-TABLE TT_Test NO-UNDO
    BEFORE-TABLE TT_TestBefore
    FIELD fieldOne   LIKE MyDBTable.FieldOne
    FIELD fieldTwo   LIKE MyDBTable.FieldTwo
    FIELD fieldThree LIKE MyDBTable.FieldThree
    .
DEFINE DATASET dsTest FOR TT_Test.

PROCEDURE testProc:
    DEFINE INPUT-OUTPUT PARAMETER DATASET-HANDLE phDataSet.
    MESSAGE VALID-HANDLE(phDataSet) VIEW-AS ALERT-BOX.
END.

DEFINE VARIABLE hTest AS HANDLE NO-UNDO.

/* Pass in a dataset. */
hTest = DATASET dsTest:HANDLE.
RUN testProc (INPUT-OUTPUT DATASET-HANDLE hTest).

/* Pass in null. */
hTest = ?.
RUN testProc (INPUT-OUTPUT DATASET-HANDLE hTest).
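On the AppServer side you can branch on the same VALID-HANDLE test to get both behaviours in one procedure. A rough sketch (the actual fill and save logic is whatever your editTest.p normally does):

PROCEDURE editTest:
    DEFINE INPUT-OUTPUT PARAMETER DATASET-HANDLE phDataSet.

    IF VALID-HANDLE(phDataSet) THEN DO:
        /* dataset was passed in: walk the before-table and apply
           the client's changes to the database here */
    END.
    ELSE DO:
        /* nothing was passed in: fill TT_Test from the database here,
           then return the static dataset through the handle */
        phDataSet = DATASET dsTest:HANDLE.
    END.
END PROCEDURE.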

Insert an image encoded in base64 into a Word document with python-docx?

I use python-docx to generate a Word document. The user creates a template (in a description field), and when he writes, for example, %(company_logo)s in the template, I replace this expression with the company's picture retrieved from the database.
As a first step, I retrieved a company's logo from the database (PostgreSQL) and used this code to replace the expression:
cr.execute("select name, logo_web from res_company where id=%s",[soc_id])
r=cr.fetchone()
if r :
company_name=r[0]
logo_company = r[1]
output = cStringIO.StringIO()
doc = docx.Document()
contenu=contenu % {'company_logo': logo_company, 'company_name': company_name,}
doc.add_paragraph(contenu)
The output was a Word document containing the base64 code of the image as a string. I decoded it and tried to add it as a picture with the following code:
logo_company = base64.b64decode(r[1])
doc.add_picture(logo_company)
But I get an error telling me that the argument must be the path to the picture:
TypeError: file() argument 1 must be encoded string without NULL bytes, not str
The documentation here explains that the add_picture() method takes a file as an argument. The file can be in the form of a path, or it can be a file-like object, such as an open file or a StringIO object. It cannot accept a bytestring containing the bytes of the image, which is what you've tried to do.
So you'll need to convert the image bytes into a file-like object, perhaps using StringIO(), and hand the resulting file-like object to add_picture(). That will get it working for you. Something like:
import base64
from cStringIO import StringIO  # Python 2, matching the question's code

logo_file = StringIO(base64.b64decode(r[1]))
doc.add_picture(logo_file)

Why does Open XML API Import Text Formatted Column Cell Rows Differently For Every Row

I am working on an ingestion feature that will take a strongly formatted .xlsx file and import the records to a temp storage table and then process the rows to create db records.
One of the columns is strictly formatted as "Text", but it seems like the Open XML API handles the column's cells differently on a row-by-row basis. Some of the values, while appearing to be numeric, are truly not (which is why we format the column as Text) -
some examples are "211377", "211727.01", "209395.388", "209395.435"
What these values represent is not important, but what happens is that some values (using the Open XML API v2.5 library) are read in properly as text, whether retrieved from the Shared Strings collection or simply from the InnerXML property, while others get sucked in as numbers with what appears to be appended rounding or precision.
For example the "211377", "211727.01" and "209395.435" all come in exactly as they are in the spreadsheet but the "209395.388" value is being pulled in as "209395.38800000001" (there are others that this happens to as well).
There seems to be no rhyme or reason to which values get messed up and which ones import fine. What is really frustrating is that if I use the native Import feature in SQL Server Management Studio and ingest the same spreadsheet to a temp table, this does not happen. So how is it that the SSMS import can handle these values as purely text for all rows, but the Open XML API cannot?
To begin with, your main problem seems to be values like
"209395.388" value is being pulled in as "209395.38800000001"
Yes, in the .xlsx file the value is stored as 209395.38800000001 instead of 209395.388, and that is the correct way to store floating-point numbers; there is nothing wrong with it. You can confirm this with the following code snippet:
string val = "209395.38800000001"; // <= what we extract from Open XML
Console.WriteLine(double.Parse(val)); // <= simply parse it as a double and print
The output is:
209395.388 // <= the expected value
So there is nothing wrong with the value you extract from the .xlsx using the Open XML SDK.
Now to cells: yes, a cell can have a variety of formats - numbers, text, booleans, or shared-string text. You can also apply styles to a cell that format the value into the desired output in Excel (e.g. date/time formats, forced strings, etc.). This is the way Excel handles the wide variety of data it supports; it needs this kind of formatting, and the .xlsx file format had to be a little complex to support it all.
My advice is to apply a proper parse method to the extracted values: identify what format each value represents (for example, whether it is a number or text) and parse accordingly.
For example:
string val = "209395.38800000001";
Console.WriteLine(float.Parse(val)); // <= float.Parse deduces a different value: 209395.4
Update:
Here is how the value is saved in the internal XML. Try it yourself: make an .xlsx file with the value 209395.388, change the extension to .zip, unzip it, go to the worksheet folder and open Sheet1.
You will notice that the value is stored as 209395.38800000001. So the API does nothing wrong in extracting the stored number; it is your duty to decide what format to apply.
But if you make the whole column Text before adding the data, you will see that the .xlsx holds the data as it is - simply put, as a string.
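To make that concrete, here is a hedged sketch using the Open XML SDK: check the cell's declared data type before deciding how to parse, and reformat genuine numbers through double so "209395.38800000001" comes back out as "209395.388". The helper name GetCellText is made up for illustration.

using System.Globalization;
using System.Linq;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;

static class CellReader
{
    // Hypothetical helper: returns the displayable text of a cell.
    public static string GetCellText(Cell cell, WorkbookPart workbookPart)
    {
        string raw = cell.CellValue == null ? string.Empty : cell.CellValue.Text;

        // Shared-string cells store an index into the shared string table.
        if (cell.DataType != null && cell.DataType.Value == CellValues.SharedString)
        {
            var sst = workbookPart.SharedStringTablePart.SharedStringTable;
            return sst.ElementAt(int.Parse(raw)).InnerText;
        }

        // Plain string cells carry their text directly.
        if (cell.DataType != null && cell.DataType.Value == CellValues.String)
            return raw;

        // Anything else: treat it as a number and let double's default
        // formatting round-trip "209395.38800000001" to "209395.388".
        double number;
        if (double.TryParse(raw, NumberStyles.Float, CultureInfo.InvariantCulture, out number))
            return number.ToString(CultureInfo.InvariantCulture);

        return raw;
    }
}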