Compact declaration in Maple

In Maple, I have declared:
var := {m[1], m[2], m[3],
        m[4], m[5], m[6],
        m[7], m[8], m[9],
        m[10], m[11], m[12],
        m[13], m[14], m[15],
        m[16], m[17], m[18],
        m[19], m[20], m[21],
        m[22], m[23], m[24],
        m[25], m[26], m[27],
        m[28], m[29], m[30],
        m[31], m[32], m[33],
        m[34], m[35], m[36],
        m[37], m[38], m[39],
        m[40]};
But it is too long. How can I declare it more concisely?

You could use the seq command to create a sequence for the m[i] terms:
var := { seq( m[i], i = 1..40 ) };

Golang SQLBoiler append queries dynamically

I'm trying to dynamically run queries against my Postgres database, but can't fully wrap my head around it.
The solution I'm looking for is one where I can set the query dynamically, perhaps by appending parameters to the final query throughout the code, and then have only one instance of the query being executed.
As mentioned in the title I am using SQLBoiler to interface with Postgres.
Here's what I'm looking for, in pseudocode:
final_query := QueryMod{
    Where("(mt_mas = ? or mt_mem like ?) and mt_group = ?", uint(uid), `%"`+strconv.Itoa(uid)+`"%`, bool(mt_group_bool)),
}
if a == 1 {
    final_query = append(final_query, And(" and mt_important = ?", bool(false)))
} else {
    final_query = append(final_query, And(" and mt_ness = ?", bool(true)))
}
res_mt_count, err := models.MTs(
    final_query,
).All(CTX, DB)
Thankful for any help along the way! :)
mkopriva solved my problem with the following solution:
type QueryModSlice []qm.QueryMod

func (s QueryModSlice) Apply(q *queries.Query) {
    qm.Apply(q, s...)
}

func main() {
    mods := QueryModSlice{
        qm.Where("(mt_mas = ? or mt_mem like ?) and mt_group = ?", uint(uid), `%"`+strconv.Itoa(uid)+`"%`, bool(mt_group_bool)),
    }
    if a == 1 {
        // note: qm.And already joins clauses with AND, so don't repeat it in the string
        mods = append(mods, qm.And("mt_important = ?", false))
    } else {
        mods = append(mods, qm.And("mt_ness = ?", true))
    }
    res_mt, err := models.MTs(mods).All(CTX, DB)
}
Thanks a bunch! :)
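The core of that solution is that a named slice type can itself satisfy the query-mod interface, so the whole accumulated slice is passed to models.MTs as a single mod. Here is a minimal self-contained sketch of the pattern, using simplified stand-in types (SQLBoiler's real qm.QueryMod and qm.Apply live in its queries/qm package; the shapes below only mirror them):

```go
package main

import (
	"fmt"
	"strings"
)

// Query is a stand-in for SQLBoiler's queries.Query: it just collects
// WHERE clauses and their arguments.
type Query struct {
	where []string
	args  []interface{}
}

// QueryMod is modeled here as a plain function; in SQLBoiler it is an
// interface with an Apply method.
type QueryMod func(q *Query)

// Where returns a mod that appends one clause and its args.
func Where(clause string, args ...interface{}) QueryMod {
	return func(q *Query) {
		q.where = append(q.where, clause)
		q.args = append(q.args, args...)
	}
}

// QueryModSlice lets a whole slice of mods be applied in one go,
// mirroring mkopriva's solution above.
type QueryModSlice []QueryMod

func (s QueryModSlice) Apply(q *Query) {
	for _, mod := range s {
		mod(q)
	}
}

func main() {
	// Start with the base condition, then append more mods conditionally.
	mods := QueryModSlice{Where("mt_group = ?", true)}

	important := true
	if important {
		mods = append(mods, Where("mt_important = ?", false))
	} else {
		mods = append(mods, Where("mt_ness = ?", true))
	}

	// One final application of everything that was accumulated.
	q := &Query{}
	mods.Apply(q)
	fmt.Println(strings.Join(q.where, " AND "), q.args)
	// mt_group = ? AND mt_important = ? [true false]
}
```

Because the slice type implements the same Apply behavior as a single mod, the call site never needs to know how many conditions were added along the way.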

How to generate a random Scala Int in Chisel code?

I am trying to implement the way-prediction technique in the RocketChip core (in-order). For this, I need to access each way separately. So this is how the SRAM for tags looks after the modification (a separate SRAM for each way):
val tag_arrays = Seq.fill(nWays) { SeqMem(nSets, UInt(width = tECC.width(1 + tagBits))) }
val tag_rdata = Reg(Vec(nWays, UInt(width = tECC.width(1 + tagBits))))
for ((tag_array, i) <- tag_arrays.zipWithIndex) {
    tag_rdata(i) := tag_array.read(s0_vaddr(untagBits-1, blockOffBits), !refill_done && s0_valid)
}
And I want to access it like this:
when (refill_done) {
    val enc_tag = tECC.encode(Cat(tl_out.d.bits.error, refill_tag))
    tag_arrays(repl_way).write(refill_idx, enc_tag)
    ccover(tl_out.d.bits.error, "D_ERROR", "I$ D-channel error")
}
where repl_way is a random Chisel UInt generated by an LFSR. But a Seq element can only be accessed by a Scala Int index, which causes a compilation error. Then I tried accessing it like this:
when (refill_done) {
    val enc_tag = tECC.encode(Cat(tl_out.d.bits.error, refill_tag))
    for (i <- 0 until nWays) {
        when (repl_way === i.U) { tag_arrays(i).write(refill_idx, enc_tag) }
    }
    ccover(tl_out.d.bits.error, "D_ERROR", "I$ D-channel error")
}
But then this assertion fires:
assert(PopCount(s1_tag_hit zip s1_tag_disparity map { case (h, d) => h && !d }) <= 1)
I am trying to modify the ICache.scala file. Any ideas on how to do this properly? Thanks!
I think you can just use a Vec here instead of a Seq:
val tag_arrays = Vec(nWays, SeqMem(nSets, UInt(width = tECC.width(1 + tagBits))))
A Vec allows indexing with a UInt.

CoffeeScript: one liner to map object into another

I have data in the following format:
data = {
    car1: {
        starting_position: 1,
        ...
    },
    car5: {
        starting_position: 2,
        ...
    }
}
I want to create an object where starting_position becomes the key and the key in the original data becomes the value. I can do it like this:
byStartingPosition = {}
for k, properties of data
    byStartingPosition[properties.starting_position] = k
But I can't imagine there is no one-liner to do the same...
If you are using lodash 4.1.0 or later, you can do it with _.invertBy: https://lodash.com/docs#invertBy
_.invertBy data, (v) -> v.starting_position
https://jsfiddle.net/7kf9wn71/2/
You cannot reduce it semantically, but you can make it more concise:
byStartingPosition = {}
byStartingPosition[v.starting_position] = k for k,v of data
Rayon's comment was aaalmost there. You want to use reduce:
byStartPos = Object.keys(data).reduce(((obj, k) -> start = data[k].starting_position; obj[start] = k; obj), {})
Although that's obnoxiously long, not very idiomatic CoffeeScript, and frankly less readable than your original, it is a one-liner.
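For comparison, the same inversion is a short function in a statically typed language. A Go sketch (the Car struct and field name are made up to match the example data):

```go
package main

import "fmt"

// Car is a hypothetical stand-in for the per-car property objects
// in the question's data.
type Car struct {
	StartingPosition int
}

// invertByStart keys the result by each entry's starting position,
// with the original key as the value -- the same shape as invertBy.
func invertByStart(data map[string]Car) map[int]string {
	byStart := make(map[int]string, len(data))
	for name, c := range data {
		byStart[c.StartingPosition] = name
	}
	return byStart
}

func main() {
	data := map[string]Car{
		"car1": {StartingPosition: 1},
		"car5": {StartingPosition: 2},
	}
	fmt.Println(invertByStart(data)) // map[1:car1 2:car5]
}
```

As in the CoffeeScript version, duplicate starting positions would silently overwrite each other, so this assumes positions are unique.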

Blob to string: how to convert a blob to a string from a stored procedure parameter in PostgreSQL?

I have a stored procedure (a function in Postgres) with parameters typed like this:
Params[0] : result = ftBlob // postgresql function = text
Params[1] : 1 = ftString
Params[2] : 2 = ftInteger
Params[3] : 3 = ftInteger
My code looks like this:
procedure TForm1.Button1Click(Sender: TObject);
var
  ResultStr: TResultStr;
  BlobField: TBlobField;
  bStream: TStream;
  DataSet: TDataSet;
  StoredProc: TSQLStoredProc;
begin
  sp01.Close;
  sp01.Params[1].AsString := '2010/2011';
  sp01.Params[2].AsInteger := 2;
  sp01.Params[3].AsInteger := 1;
  sp01.ExecProc;
  if sp01.ParamByName('result').Value.IsBlob then
  begin
    BlobField := StoredProc.ParamByName('result') as TBlobField;
    bStream := sp01.CreateBlobStream(BlobField, bmRead);
    try
      bStream.Read(ResultStr, SizeOf(TResultStr));
    finally
      bStream.Free;
    end;
  end;
  ShowMessage(ResultStr.Hasil);
end;
The question is: how do I get the result (a blob) as a string?
I don't know what TResultStr is, but you can do it with a plain string:
var
  BlobResult: string; // Changed to make clearer where changes were below
begin
  // Your other code here
  if sp01.ParamByName('result').Value.IsBlob then
  begin
    BlobField := StoredProc.ParamByName('result') as TBlobField;
    bStream := sp01.CreateBlobStream(BlobField, bmRead);
    try
      SetLength(BlobResult, bStream.Size); // Note changes here
      bStream.Read(BlobResult[1], bStream.Size); // and here
    finally
      bStream.Free;
    end;
  end;
This is an old post, but in case anyone needs it in the future:
ShowMessage(BlobField.AsString);

Specman: assign multiple struct members in one expression

Hi,
I am expanding an existing Specman test where some code like this appears:
struct dataset {
    !register : int (bits:16);
    ... other members
}
...
data : list of dataset;
foo : dataset;
gen foo;
foo.register = 0xfe;
... assign other foo members ...
data.push(foo.copy());
Is there a way to assign to the members of the struct in one line? Like:
foo = { 0xff, ... };
I currently can't think of a direct way of setting all the members as you want, but there is a way to initialize variables (I'm not sure whether it works on struct members as well). Anyway, something like the following may work for you:
myfunc() is {
    var foo : dataset = new dataset with {
        .register = 0xff;
        .bar = 0xfa;
    };
    data.push(foo.copy());
}
You can find more information about new ... with by typing help new struct at the Specman prompt.
Hope it helps!
The simple beauty of assigning fields by name is one language feature I've always found useful, safe to code, and readable.
This is how I'd go about it:
struct s {
    a : int;
    b : string;
    c : bit;
};
extend sys {
    ex() is {
        var s := new s with {.a = 0x0; .b = "zero"; .c = 0;};
    };
    run() is also {
        var s : s;
        gen s keeping {.a == 0x0; .b == "zero"; .c == 0;};
    };
};
I even do data.push(new dataset with {.reg = 0xff; .bar = 0x0;}); but you may raise the readability flag if you want.
Warning: using unpack() is perfectly correct (see Ross's answer), but it is error-prone IMO. I recommend verifying (with code that actually runs) every place you opt to use unpack().
You can directly use the pack and unpack facility of Specman with "physical fields" (those instance members prefixed with the % modifier).
Example:
define FLOODLES_WIDTH 47;
type floodles_t : uint(bits:FLOODLES_WIDTH);
define FLABNICKERS_WIDTH 28;
type flabnickers_t : uint(bits:FLABNICKERS_WIDTH);
struct foo_s {
    %!floodle : floodles_t;
    %!flabnicker : flabnickers_t;
};
extend sys {
    run() is also {
        var f : foo_s = new;
        unpack(packing.low, 64'hdeadbeefdeadbeef, f);
        print f;
        unpack(packing.low, 64'hacedacedacedaced, f);
        print f;
    };
    setup() is also {
        set_config(print, radix, hex);
    };
};
When this runs, it prints:
Loading /nfs/pdx/home/rbroger1/tmp.e ...
read...parse...update...patch...h code...code...clean...
Doing setup ...
Generating the test using seed 1...
Starting the test ...
Running the test ...
f = foo_s-#0: foo_s of unit: sys
---------------------------------------------- #tmp
0 !%floodle: 0x3eefdeadbeef
1 !%flabnicker: 0x001bd5b
f = foo_s-#0: foo_s of unit: sys
---------------------------------------------- #tmp
0 !%floodle: 0x2cedacedaced
1 !%flabnicker: 0x00159db
Look up packing, unpacking, physical fields, packing.low, and packing.high in your Specman docs.
You can still use physical fields even if the struct doesn't map to the DUT. If your struct already uses physical fields for some other purpose, you'll need to pursue some sort of set* method for that struct.
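To see what unpack(packing.low, ...) does with those widths, here is a Go sketch of the same bit slicing. It reproduces the field values printed above: the 47-bit floodle is filled from the low bits of the 64-bit word, and only 17 of flabnicker's 28 bits can be filled from what remains:

```go
package main

import "fmt"

// Field widths matching the FLOODLES_WIDTH / FLABNICKERS_WIDTH
// defines in the Specman example.
const (
	floodlesWidth    = 47
	flabnickersWidth = 28
)

// unpackLow mimics packing.low: fields are filled starting from the
// least significant bit of the source word, in declaration order.
func unpackLow(word uint64) (floodle, flabnicker uint64) {
	floodle = word & (1<<floodlesWidth - 1) // low 47 bits
	flabnicker = word >> floodlesWidth      // the remaining 17 high bits
	return
}

func main() {
	f, fl := unpackLow(0xdeadbeefdeadbeef)
	fmt.Printf("floodle: %#x\nflabnicker: %#x\n", f, fl)
	// floodle: 0x3eefdeadbeef
	// flabnicker: 0x1bd5b
}
```

These match the first print f in the Specman run log, which is why flabnicker comes out far smaller than its 28-bit capacity: a 64-bit source simply runs out of bits after the first field takes 47.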