Progress 4GL for each and select * from cust

I often write Progress 4GL code like the following:
output to /OUTText.txt.
def var dRow as char.
dRow = "cmpid|CustNum|Cur".
put unformatted dRow skip.
for each Cust no-lock:
dRow = subst("&1|&2|&3", Cust.CmpId, Cust.CustNum, Cust.Curr).
put unformatted dRow skip.
end.
output close.
in order to mimic
select * from cust (in MS SQL)
My question: is there a way to make this block of code at least closely resemble "select *" in 4GL, so that I don't have to type each column name and it still prints the values of all columns? My thinking is something like this:
output to /OUTText.txt.
def var dRow as char.
dRow = "cmpid|CustNum|Cur".
put unformatted dRow skip.
for each Cust no-lock:
if row = 1 then do:
for each Column in Cust:
/* PRINT THE COLUMN HEADER */
end.
end.
else do:
/* PRINT EACH CELL */
end.
end.
output close.
If there is such a thing, then I don't have to keep explicit column names in dRow.

You can do what you're after if you first output all field labels (or names) and then use EXPORT to output the table content.
To change to field name instead of label: change :LABEL below to :NAME
For instance:
DEFINE VARIABLE i AS INTEGER NO-UNDO.
OUTPUT TO c:\temp\somefile.txt.
DO i = 1 TO BUFFER Customer:NUM-FIELDS:
PUT UNFORMATTED QUOTER(BUFFER Customer:BUFFER-FIELD(i):LABEL).
IF i < BUFFER Customer:NUM-FIELDS THEN
PUT UNFORMATTED ";".
ELSE IF i = BUFFER Customer:NUM-FIELDS THEN
PUT SKIP.
END.
FOR EACH Customer NO-LOCK:
EXPORT DELIMITER ";" Customer.
END.
OUTPUT CLOSE.
You could put the header part in a separate program to call dynamically every time you want to do something similar:
DEFINE STREAM str.
OUTPUT STREAM str TO c:\temp\somefile.txt.
RUN putHeaders.p(INPUT BUFFER Customer:HANDLE, INPUT ";", INPUT STREAM str:HANDLE).
FOR EACH Customer NO-LOCK:
EXPORT STREAM str DELIMITER ";" Customer.
END.
OUTPUT STREAM str CLOSE.
putHeaders.p
============
DEFINE INPUT PARAMETER phBufferHandle AS HANDLE NO-UNDO.
DEFINE INPUT PARAMETER pcDelimiter AS CHARACTER NO-UNDO.
DEFINE INPUT PARAMETER phStreamHandle AS HANDLE NO-UNDO.
DEFINE VARIABLE i AS INTEGER NO-UNDO.
DO i = 1 TO phBufferHandle:NUM-FIELDS:
PUT STREAM-HANDLE phStreamHandle UNFORMATTED QUOTER(phBufferHandle:BUFFER-FIELD(i):LABEL).
IF i < phBufferHandle:NUM-FIELDS THEN
PUT STREAM-HANDLE phStreamHandle UNFORMATTED pcDelimiter.
ELSE IF i = phBufferHandle:NUM-FIELDS THEN
PUT STREAM-HANDLE phStreamHandle SKIP.
END.

output to "somefile".
for each customer no-lock:
display customer.
end.
I wouldn't generally mention this as the embedded SQL-89 within the 4GL is the highway to hell (that dialect of SQL only works for the most basic and trivial of purposes and really shouldn't be used at all in production code), but as it happens:
output to "somefile".
select * from customer.
does just happen to work to the spec of the original question (although, like the DISPLAY solution, it also does not support a delimiter...)

Related

How to retrieve maillist into one field?

I want to retrieve the mail list into one field and display the data retrieved.
The mail list is represented as an array, and I want all data to be in one field.
define variable i as integer no-undo.
define variable cmmt as longchar no-undo.
cmmt = " ".
for each cd_det no-lock
where cd_ref = "test1"
and cd_type = "EL":
do i = 1 to extent(cd_cmmt):
cmmt = cmmt + cd_cmmt[i].
end.
disp cmmt.
end.
I tried the above code, but it doesn't display anything. The test1 record contains 2 mail addresses (gangadhar.pichika-external#gemalto.com, balkrishna.talapalliwar-external#gemalto.com), but I didn't get that data in cmmt.
A LONGCHAR variable cannot be displayed (unless you use a large editor widget). Try DISPLAY STRING(cmmt) instead.
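For instance, a minimal sketch along those lines (reusing the cd_det fields from the question; the x(70) format is just an example width):
define variable i    as integer  no-undo.
define variable cmmt as longchar no-undo.
for each cd_det no-lock
    where cd_ref = "test1"
      and cd_type = "EL":
    cmmt = "". /* reset for each record */
    do i = 1 to extent(cd_cmmt):
        /* concatenate every array element into the single LONGCHAR variable */
        cmmt = cmmt + cd_cmmt[i].
    end.
    /* STRING() converts the LONGCHAR so DISPLAY can render it */
    display string(cmmt) format "x(70)".
end.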

Removing Unwanted commas from a csv

I'm writing a program in Progress, OpenEdge, ABL, and whatever else it's known as.
I have a CSV file that is delimited by commas. However, there is a "gift message" field, and users enter messages with "commas", so now my program will see additional entries because of those bad commas.
The CSV fields are not in double quotes, so I CANNOT just use my main method, which is:
/** this next block of code will remove all unwanted commas from the data. **/
if v-line-cnt > 1 then /** we won't run this against the headers. Otherwise they will get deleted **/
assign
v-data = replace(v-data,'","',"\t") /** Here is a special technique to replace the comma delim with a tab **/
v-data = replace(v-data,','," ") /** now that we removed the comma delim above, we can remove all nuisance commas **/
v-data = replace(v-data,"\t",'","'). /** all nuisance commas are gone, we turn the tabs back to commas. **/
Any advice?
edit:
From Progress, I can call Linux commands, so I should be able to execute C++/PHP/Shell etc. all from my Progress program. I look forward to advice; until then I shall look into using external scripts.
You are not providing quite enough data for a perfect answer but given what you say I think the IMPORT statement should handle this automatically.
In my example here commaimport.csv is a comma-separated csv-file with quotes around text fields. Integers, logical variables etc have no quotes. The last field contains a comma in one line:
commaimport.csv
=======================
"Id1", 123, NO, "This is a message"
"Id2", 124, YES, "This is a another message, with a comma"
"Id3", 323, NO, "This is a another message without a comma"
To import this file I define a temp-table matching the file layout and use the IMPORT statement with comma as delimiter:
DEFINE TEMP-TABLE ttImport NO-UNDO
FIELD field1 AS CHARACTER FORMAT "xxx"
FIELD field2 AS INTEGER FORMAT "zz9"
FIELD field3 AS LOGICAL
FIELD field4 AS CHARACTER FORMAT "x(50)".
INPUT FROM VALUE("c:\temp\commaimport.csv").
REPEAT :
CREATE ttImport.
IMPORT DELIMITER "," ttImport.
END.
INPUT CLOSE.
FOR EACH ttImport:
DISPLAY ttImport.
END.
You don't have to import into a temp-table. You could import into variables instead.
DEFINE VARIABLE c AS CHARACTER NO-UNDO FORMAT "xxx".
DEFINE VARIABLE i AS INTEGER NO-UNDO FORMAT "zz9".
DEFINE VARIABLE l AS LOGICAL NO-UNDO.
DEFINE VARIABLE d AS CHARACTER NO-UNDO FORMAT "x(50)".
INPUT FROM VALUE("c:\temp\commaimport.csv").
REPEAT :
IMPORT DELIMITER "," c i l d.
DISP c i l d.
END.
INPUT CLOSE.
This will render basically the same output.
You don't show what your data file looks like. But if the problematic field is the last one, and there are no quotes, then your best bet is probably to read it using INPUT UNFORMATTED to get it a line at a time, and then split the line into fields using ENTRY(). That way you can treat everything after the nth comma as a single field no matter how many commas the line has.
For example, say your input file has three columns like this:
boris,14.23,12 the avenue
mark,32.10,flat 1, the grange
percy,1.00,Bleak house, Dartmouth
... so that column three is an address which might contain a comma and is not enclosed in quotes so that IMPORT DELIMITER can't help you.
Something like this would work in that case:
/* ...skipping a lot of definitions here ... */
input from "datafile.csv".
repeat:
import unformatted v-line.
create tt-thing.
assign tt-thing.name = entry(1, v-line, ',')
tt-thing.price = entry(2, v-line, ',')
tt-thing.address = entry(3, v-line, ',').
do v-i = 4 to num-entries(v-line, ','):
tt-thing.address = tt-thing.address
+ ','
+ entry(v-i, v-line, ',').
end.
end.
input close.

Excel Export as Text Using Progress 4GL

I need help with an Excel export. I'm trying to export a column as text using Progress 4GL. The column contains numbers with a leading "0", which Excel keeps deleting when it opens the file.
I tried using the STRING function to make the variable a string before it goes to the export. It did not work. Is there any other way to export with leading 0s?
I assume that you are saving the file in Progress as a CSV and that when the file is opened in Excel it loses the leading 0.
When outputting the string you can enclose it as follows so that Excel reads it in as a string:
put unformatted '="' string("00123") '"'
If you're writing directly to Excel, you can put a ' character at the beginning of the number, and then Excel will interpret it as a number formatted as text.
You can see it in action here:
def var ch-excel as com-handle no-undo.
def var ch-wrk as com-handle no-undo.
create "Excel.Application" ch-excel no-error.
ch-excel:visible = no no-error.
ch-excel:DisplayAlerts = no no-error.
ch-wrk = ch-excel:workbooks:add.
ch-excel:cells(1,1) = "'01".
ch-wrk:SaveAs("c:\temp\test.xlsx", 51, "", "", false, false) no-error. /* 51 = xlOpenXMLWorkbook */
ch-excel:DisplayAlerts = yes.
ch-excel:quit().
release object ch-wrk.
release object ch-excel.
Since I've been using Excel to generate reports for a while, I've created a small lib that generates an Excel file based on a temp-table definition. I think it might be helpful; you can check it out at: https://github.com/rodolfoag/4gl-excel
When you import manually into Excel, select the columns as TEXT and not GENERAL; then the leading zero will not disappear.
You can set the format of the cell, something like this:
h-excel:Range("A12"):NumberFormat = FILL("0", x).
where x would be the length of the variable you want to insert.
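For example, a hedged sketch (assuming h-excel is a valid Excel COM-handle, like ch-excel in the automation example above, and a 5-digit value):
/* FILL("0", 5) builds the custom format "00000", which pads the displayed value with leading zeros */
h-excel:Range("A12"):NumberFormat = FILL("0", 5).
h-excel:Range("A12"):Value = 123. /* shows as 00123 in the sheet */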

Progress OpenEdge - TempTable to file - an easy way?

I would like a really easy way to see the content of any temp-table in Progress.
You could also, for instance, produce XML (tthTmp below is a handle to the temp-table; a self-contained sketch follows after the delimited-file example):
tthTmp:WRITE-XML("FILE","c:\temp\tt.xml", TRUE).
or (maybe not quite as easy) output it as a semicolon-delimited file:
OUTPUT TO c:\temp\file.txt.
FOR EACH ttTmp:
EXPORT DELIMITER ";" ttTmp.
END.
OUTPUT CLOSE.
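For completeness, a minimal self-contained sketch of the WRITE-XML route (using the same small temp-table as the WRITE-JSON example below; the file path is just an example):
DEFINE TEMP-TABLE ttTmp NO-UNDO
    FIELD FieldA AS CHARACTER
    FIELD FieldB AS CHARACTER.
DEFINE VARIABLE tthTmp AS HANDLE NO-UNDO.
CREATE ttTmp.
ASSIGN ttTmp.FieldA = "A"
       ttTmp.FieldB = "B".
tthTmp = TEMP-TABLE ttTmp:HANDLE.
/* third argument TRUE = formatted (indented) XML */
tthTmp:WRITE-XML("FILE", "c:\temp\tt.xml", TRUE).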
I just found out an easy way to dump a temp-table to a file using JSON (available from 10.2B).
WRITE-JSON is the trick!
DEFINE TEMP-TABLE ttTmp
FIELD FieldA AS CHAR
FIELD FieldB AS CHAR.
CREATE ttTmp.
ASSIGN ttTmp.FieldA = "A"
ttTmp.FieldB = "B".
DEFINE VARIABLE tthTmp AS HANDLE NO-UNDO. /* Handle to temptable */
DEFINE VARIABLE lReturnValue AS LOGICAL NO-UNDO.
tthTmp = TEMP-TABLE ttTmp:HANDLE.
lReturnValue = tthTmp:WRITE-JSON("FILE", "c:\temp\tthTmp.txt", TRUE, ?).
/* Output File tthTmp.txt
{"ttTmp": [
{
"FieldA": "A",
"FieldB": "B"
}
]}
Output File tthTmp.txt */

I am using Progress 4GL and trying to dynamically change a column-label; is this possible?

I tried the code below:
def temp-table tt-dg1
field dtoday as date column-label "dg "
.
buffer tt-dg1:BUFFER-FIELD("dtoday"):column-LABEL = buffer tt-dg1:BUFFER-FIELD("dtoday"):column-LABEL + "77".
display buffer tt-dg1:BUFFER-FIELD("dtoday"):column-LABEL.
create tt-dg1.
dtoday = today.
display tt-dg1 with frame f2.
I was expecting field dtoday to now have a column-label of "dg 77", but it's still "dg ". I need this to add week numbers to the standard column-labels of a spreadsheet I am creating.
Any help gratefully received :)
This feels like a fault.
It does not appear to work when overriding it on the temp-table.
If you define your field in a frame before the display, then you can override it there:
form tt-dg1.dtoday with frame f2.
tt-dg1.dtoday:label = "MyLabel".
display tt-dg1.dtoday with frame f2.
That may or may not help, depending on what you are doing.
Is it possible to dynamically create the temp-table? If so, you can dynamically set the label there:
DEFINE VARIABLE ttDynTable AS HANDLE NO-UNDO.
DEFINE VARIABLE ttTTHandle AS HANDLE NO-UNDO.
DEFINE VARIABLE vInt AS INTEGER NO-UNDO INIT 77.
CREATE TEMP-TABLE ttDyntable.
ttDynTable:ADD-NEW-FIELD('dtoday', 'DATE', 0, "99/99/9999",?,"","dg " + STRING(vInt)).
ttDynTable:TEMP-TABLE-PREPARE("tt-dg1").
ttTTHandle = ttDyntable:DEFAULT-BUFFER-HANDLE.
ttTTHandle:BUFFER-CREATE.
ttTTHandle::dtoday = TODAY.
DISPLAY ttTTHandle:buffer-field('dtoday'):column-label ttTTHandle::dtoday.
If not, you can just pull the column-label from the buffer instead:
DEFINE TEMP-TABLE tt-dg1 FIELD dtoday AS DATE COLUMN-LABEL "dg ".
DEFINE VARIABLE vTTHandle AS HANDLE NO-UNDO.
CREATE tt-dg1.
dtoday = TODAY.
vTTHandle = BUFFER tt-dg1:HANDLE.
vTTHandle:BUFFER-FIELD("dtoday"):column-LABEL = vTTHandle:BUFFER-FIELD("dtoday"):column-LABEL + "77".
DISPLAY vTTHandle:BUFFER-FIELD('dtoday'):COLUMN-LABEL.