Is there a really easy way to see the content of any temp-table in Progress?
You could also, for instance, produce XML (tthTmp is the temp-table handle set up in the WRITE-JSON example below):
tthTmp:WRITE-XML("FILE","c:\temp\tt.xml", TRUE).
or (maybe not quite as easy) output a semicolon-delimited file:
OUTPUT TO c:\temp\file.txt.
FOR EACH ttTmp:
EXPORT DELIMITER ";" ttTmp.
END.
OUTPUT CLOSE.
I just found an easy way to dump a temp-table to a file using JSON (available from 10.2B).
WRITE-JSON is the trick!
DEFINE TEMP-TABLE ttTmp
FIELD FieldA AS CHAR
FIELD FieldB AS CHAR.
CREATE ttTmp.
ASSIGN ttTmp.FieldA = "A"
ttTmp.FieldB = "B".
DEFINE VARIABLE tthTmp AS HANDLE NO-UNDO. /* Handle to temptable */
DEFINE VARIABLE lReturnValue AS LOGICAL NO-UNDO.
tthTmp = TEMP-TABLE ttTmp:HANDLE.
lReturnValue = tthTmp:WRITE-JSON("FILE", "c:\temp\tthTmp.txt", TRUE, ?).
/* Output File tthTmp.txt
{"ttTmp": [
{
"FieldA": "A",
"FieldB": "B"
}
]}
Output File tthTmp.txt */
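Since the question is about any temp-table, the same WRITE-JSON call can be wrapped in a small generic procedure that takes a handle (a sketch; the program name dumpTempTable.p and its parameters are made up):
dumpTempTable.p
===============
/* Dump any temp-table to a JSON file via its handle */
DEFINE INPUT PARAMETER TABLE-HANDLE phTable.
DEFINE INPUT PARAMETER pcFileName AS CHARACTER NO-UNDO.
phTable:WRITE-JSON("FILE", pcFileName, TRUE).
Calling it with the handle from the example above:
RUN dumpTempTable.p(INPUT TABLE-HANDLE tthTmp, INPUT "c:\temp\ttTmp.json").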
Related
I often write the following Progress 4GL code
output to /OUTText.txt.
def var dRow as char.
dRow = "cmpid|CustNum|Cur".
put unformatted dRow skip.
for each Cust no-lock:
dRow = subst("&1|&2|&3", Cust.CmpId, Cust.CustNum, Cust.Curr).
put unformatted dRow skip.
end.
output close.
in order to mimic
select * from cust (in MS SQL)
My question is: is there a way to make this block of code more closely resemble "select *" in 4GL, so that I don't have to type each column name and it still prints all values in all columns? My thinking is something like this:
output to /OUTText.txt.
def var dRow as char.
dRow = "cmpid|CustNum|Cur".
put unformatted dRow skip.
for each Cust no-lock:
if row = 1 then do:
for each Column in Cust:
/* PRINT THE COLUMN HEADER */
end.
end.
else do:
/* PRINT EACH CELL */
end.
end.
output close.
If there is such a thing, then I don't have to keep explicit column names in dRow.
You can do what you're after if you first output all field labels (or names) and then use EXPORT to output the table content.
To use the field name instead of the label, change :LABEL below to :NAME.
For instance:
DEFINE VARIABLE i AS INTEGER NO-UNDO.
OUTPUT TO c:\temp\somefile.txt.
DO i = 1 TO BUFFER Customer:NUM-FIELDS.
PUT UNFORMATTED QUOTER(BUFFER Customer:BUFFER-FIELD(i):LABEL).
IF i < BUFFER Customer:NUM-FIELDS THEN
PUT UNFORMATTED ";".
ELSE IF i = BUFFER Customer:NUM-FIELDS THEN
PUT SKIP.
END.
FOR EACH Customer NO-LOCK:
EXPORT DELIMITER ";" Customer.
END.
OUTPUT CLOSE.
You could put the header part in a separate program to call dynamically every time you want to do something similar:
DEFINE STREAM str.
OUTPUT STREAM str TO c:\temp\somefile.txt.
RUN putHeaders.p(INPUT BUFFER Customer:HANDLE, INPUT ";", INPUT STREAM str:HANDLE).
FOR EACH Customer NO-LOCK:
EXPORT STREAM str DELIMITER ";" Customer.
END.
OUTPUT STREAM str CLOSE.
putHeaders.p
============
DEFINE INPUT PARAMETER phBufferHandle AS HANDLE NO-UNDO.
DEFINE INPUT PARAMETER pcDelimiter AS CHARACTER NO-UNDO.
DEFINE INPUT PARAMETER phStreamHandle AS HANDLE NO-UNDO.
DEFINE VARIABLE i AS INTEGER NO-UNDO.
DO i = 1 TO phBufferHandle:NUM-FIELDS.
PUT STREAM-HANDLE phStreamHandle UNFORMATTED QUOTER(phBufferHandle:BUFFER-FIELD(i):LABEL).
IF i < phBufferHandle:NUM-FIELDS THEN
PUT STREAM-HANDLE phStreamHandle UNFORMATTED pcDelimiter.
ELSE IF i = phBufferHandle:NUM-FIELDS THEN
PUT STREAM-HANDLE phStreamHandle SKIP.
END.
output to "somefile".
for each customer no-lock:
display customer.
end.
I wouldn't generally mention this as the embedded SQL-89 within the 4GL is the highway to hell (that dialect of SQL only works for the most basic and trivial of purposes and really shouldn't be used at all in production code), but as it happens:
output to "somefile".
select * from customer.
does just happen to work to the spec of the original question (although, like the DISPLAY solution, it also does not support a delimiter...)
I'm writing a program in Progress, OpenEdge, ABL, and whatever else it's known as.
I have a CSV file that is delimited by commas. However, there is a "gift message" field, and users enter messages containing commas, so my program sees additional entries because of those stray commas.
The CSV fields are not in double quotes, so I CANNOT just use my main method, which is:
/** this next block of code will remove all unwanted commas from the data. **/
if v-line-cnt > 1 then /** we won't run this against the headers. Otherwise they will get deleted **/
assign
v-data = replace(v-data,'","',"\t") /** Here is a special technique to replace the comma delim with a tab **/
v-data = replace(v-data,','," ") /** now that we removed the comma delim above, we can remove all nuisance commas **/
v-data = replace(v-data,"\t",'","'). /** all nuisance commas are gone, we turn the tabs back to commas. **/
Any advice?
edit:
From Progress, I can call Linux commands, so I should be able to execute C++/PHP/shell scripts etc. from my Progress program. I look forward to advice; until then I shall look into using external scripts.
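For reference, this is roughly how I call external commands from ABL (the sed command and file paths here are just placeholders):
/* run a shell command (paths and sed expression are placeholders) */
OS-COMMAND SILENT VALUE("sed 's/, / /g' /tmp/in.csv > /tmp/out.csv").
/* or capture a command's output */
DEFINE VARIABLE cLine AS CHARACTER NO-UNDO.
INPUT THROUGH VALUE("wc -l /tmp/out.csv").
IMPORT UNFORMATTED cLine.
INPUT CLOSE.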
You are not providing quite enough data for a perfect answer, but given what you say I think the IMPORT statement should handle this automatically.
In my example, commaimport.csv is a comma-separated CSV file with quotes around text fields. Integers, logical values etc. have no quotes. The last field contains a comma in one line:
commaimport.csv
=======================
"Id1", 123, NO, "This is a message"
"Id2", 124, YES, "This is a another message, with a comma"
"Id3", 323, NO, "This is a another message without a comma"
To import this file I define a temp-table matching the file layout and use the IMPORT statement with comma as delimiter:
DEFINE TEMP-TABLE ttImport NO-UNDO
FIELD field1 AS CHARACTER FORMAT "xxx"
FIELD field2 AS INTEGER FORMAT "zz9"
FIELD field3 AS LOGICAL
FIELD field4 AS CHARACTER FORMAT "x(50)".
INPUT FROM VALUE("c:\temp\commaimport.csv").
REPEAT :
CREATE ttImport.
IMPORT DELIMITER "," ttImport.
END.
INPUT CLOSE.
FOR EACH ttImport:
DISPLAY ttImport.
END.
You don't have to import into a temp-table. You could import into variables instead.
DEFINE VARIABLE c AS CHARACTER NO-UNDO FORMAT "xxx".
DEFINE VARIABLE i AS INTEGER NO-UNDO FORMAT "zz9".
DEFINE VARIABLE l AS LOGICAL NO-UNDO.
DEFINE VARIABLE d AS CHARACTER NO-UNDO FORMAT "x(50)".
INPUT FROM VALUE("c:\temp\commaimport.csv").
REPEAT :
IMPORT DELIMITER "," c i l d.
DISP c i l d.
END.
INPUT CLOSE.
This will render basically the same output.
You don't show what your data file looks like. But if the problematic field is the last one, and there are no quotes, then your best bet is probably to read it using INPUT UNFORMATTED to get it a line at a time, and then split the line into fields using ENTRY(). That way you can treat everything after the nth comma as a single field no matter how many commas the line has.
For example, say your input file has three columns like this:
boris,14.23,12 the avenue
mark,32.10,flat 1, the grange
percy,1.00,Bleak house, Dartmouth
... so that column three is an address which might contain a comma and is not enclosed in quotes so that IMPORT DELIMITER can't help you.
Something like this would work in that case:
/* ...skipping a lot of definitions here ... */
input from "datafile.csv".
repeat:
import unformatted v-line.
create tt-thing.
assign tt-thing.name = entry(1, v-line, ',')
tt-thing.price = entry(2, v-line, ',')
tt-thing.address = entry(3, v-line, ',').
do v-i = 4 to num-entries(v-line, ','):
tt-thing.address = tt-thing.address
+ ','
+ entry(v-i, v-line, ',').
end.
end.
input close.
I need help with an Excel export. I'm trying to export a column as text using Progress 4GL. The column contains numbers with a leading "0", which Excel keeps deleting when it opens the file.
I tried using the STRING function to make the variable a string before it goes to the export. It did not work. Is there any other way to export with leading 0s?
I assume that you are saving the file in Progress as a CSV and that when the file is opened in Excel it loses the leading 0.
When outputting the string you can enclose it as follows so that Excel reads it in as text:
put unformatted '="' string("00123") '"'
If you're writing directly to Excel, you can put a ' character at the beginning of the number, and then Excel will interpret it as a number formatted as text.
You can see it in action here:
def var ch-excel as com-handle no-undo.
def var ch-wrk as com-handle no-undo.
create "Excel.Application" ch-excel no-error.
ch-excel:visible = no no-error.
ch-excel:DisplayAlerts = no no-error.
ch-wrk = ch-excel:workbooks:add.
ch-excel:cells(1,1) = "'01".
ch-wrk:SaveAs("c:\temp\test.xlsx", 51, "", "", false, false) no-error. /* 51 = xlOpenXMLWorkbook */
ch-excel:DisplayAlerts = yes.
ch-excel:quit().
release object ch-wrk.
release object ch-excel.
Since I've been using Excel to generate reports for a while, I've created a small lib that generates an Excel file based on a temp-table definition. I think it might be helpful; you can check it out at: https://github.com/rodolfoag/4gl-excel
When you import manually into Excel, select the columns as TEXT and not GENERAL; then the leading zero will not disappear.
You can set the format of the cell, something like this:
h-excel:Range("A12"):NumberFormat = FILL("0", x).
where x would be the length of the variable you want to insert.
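In context that could look something like this (a sketch along the lines of the COM example above; the cell, the format length and the value are arbitrary):
def var ch-excel as com-handle no-undo.
def var ch-wrk as com-handle no-undo.
create "Excel.Application" ch-excel.
ch-wrk = ch-excel:workbooks:add.
/* set the cell format before writing the value so Excel keeps the leading zeros */
ch-excel:Range("A12"):NumberFormat = FILL("0", 5).
ch-excel:Range("A12"):Value = 123. /* displays as 00123 */
ch-excel:DisplayAlerts = no.
ch-excel:quit().
release object ch-wrk.
release object ch-excel.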
Is there an analogous procedure to PHP's http://php.net/manual/en/function.mysql-real-escape-string.php for Progress 4GL / ABL, or a best practice within the Progress community that is followed for writing sanitized text to external and untrusted entities (web sites, MySQL servers and APIs)?
The QUOTER or QUERY-PREPARE functions will not work as they sanitize text for dynamic queries in Progress and not for external entities.
The closest analogue to your cited example would be to write a function that does this:
DEFINE VARIABLE ch-escape-chars AS CHARACTER NO-UNDO.
DEFINE VARIABLE ch-string AS CHARACTER NO-UNDO.
DEFINE VARIABLE i-cnt AS INTEGER NO-UNDO.
DO i-cnt = 1 TO LENGTH(ch-escape-chars):
ch-string = REPLACE(ch-string,
SUBSTRING(ch-escape-chars, i-cnt, 1),
"~~" + SUBSTRING(ch-escape-chars, i-cnt, 1)).
END.
where
ch-escape-chars are the characters you want escaped.
ch-string is the incoming string.
"~~" is the esacap'd escape character.
It sounds like rolling your own would be the only way. For my purposes I emulated the mysql_real_escape_string function:
/* TODO progress auto changes all ASC(0) characters to space or ASC(20) in a non db string. */
/* the backslash needs to go first */
/* there is no concept of static vars in progress (non class) so global variables */
DEFINE VARIABLE cEscape AS CHARACTER EXTENT INITIAL [
"~\",
/*"~000",*/
"~n",
"~r",
"'",
"~""
]
.
DEFINE VARIABLE cReplace AS CHARACTER EXTENT INITIAL [
"\\",
/*"\0",*/
"\n",
"\r",
"\'",
'\"'
]
.
FUNCTION mysql_real_escape_string RETURNS CHARACTER (INPUT pcString AS CHAR):
DEF VAR ii AS INTEGER NO-UNDO.
MESSAGE pcString '->'.
DO ii = 1 TO EXTENT(cEscape):
ASSIGN pcString = REPLACE (pcString, cEscape[ii], cReplace[ii]).
END.
MESSAGE pcString.
RETURN pcString.
END.
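A quick usage check (my own test string):
DEFINE VARIABLE cSafe AS CHARACTER NO-UNDO.
cSafe = mysql_real_escape_string("O'Brien said ~"hi~""). /* -> O\'Brien said \"hi\" */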
Is there any 4GL statement for editing an ASCII file on disk? If so, how?
Editing involves reading a file, probably using IMPORT, then manipulating the text using string functions like REPLACE() and finally writing the result probably using PUT. Something like this:
define stream inFile.
define stream outFile.
define variable txtLine as character no-undo.
input stream inFile from "input.txt".
output stream outFile to "output.txt".
repeat:
import stream inFile unformatted txtLine.
txtLine = replace( txtLine, "abc", "123" ). /* edit something */
put stream outFile unformatted txtLine skip.
end.
input stream inFile close.
output stream outFile close.
Yes there is. You can use a STREAM to do so.
/* Define a new named stream */
DEF STREAM myStream.
/* Define the output location of the stream */
OUTPUT STREAM myStream TO VALUE("c:\text.txt").
/* Write some text into the file */
PUT STREAM myStream UNFORMATTED "Does this work?".
/* Close the stream now that we're done with it */
OUTPUT STREAM myStream CLOSE.
Progress could call operating system editor:
OS-COMMAND("vi /tmp/yoyo.txt").
You could use copy-lob to read and write the file
DEF VAR lContents AS LONGCHAR NO-UNDO.
/* read file */
COPY-LOB FROM FILE "ascii.txt" TO lContents.
/* change Contents, e.g. all capital letters */
lContents = CAPS(lContents).
/* save file */
COPY-LOB lContents TO FILE "ascii.txt".
I think that by "editing" you mean being able to read the file, show it on screen and manipulate it?
If so, here is an easy one; of course, the file can't be bigger than the maximum capacity of a character variable:
def var fileline as char format "x(250)". /* or shorter or longer, up to you*/
def var filedit as char.
/* you have to quote it to get each line into the character variable */
unix silent quoter kk.txt > kk.quoted.
input from kk.quoted no-echo.
repeat:
set fileline.
filedit = filedit + (fileline + chr(13) + chr(10)) .
end.
input close.
update filedit view-as editor size 65 by 10.
I'm sure you can manage to save the file once it's edited ;-)