Storing a List in a Blob with sqflite - flutter

I am looking for an example of storing a generic Dart object (typically a List<MyObject>) in a Blob column with sqflite (and retrieving it back, please), assuming it will be stored as binary data.
MyObject is an object used by my application, its content is not important here as I am looking for a generic/universal method.
Could anybody be kind enough to provide a snippet?
Documentation on this subject is rare for Dart. The closest I could find involves marshaling the object as a Uint8List, something like this or something like that, but I could not really understand or apply them.
I have found this similar issue, where the developer ends up converting his lists to strings... Can we really not do better (i.e. something more efficient)?
Many thanks for your help.
Patrick

You need to convert MyObject into a serialized stream of bytes using BytesBuilder() and then insert it into a sqflite BLOB as a Uint8List.
For a List there are several strategies:
1. If the size of MyObject is the same for every element in the list, just store them one after another.
2. If the size is variable, store first the number of bytes of each object and then the object itself, repeating for all elements in the list.
3. Or store each MyObject followed by a separator. The separator must be chosen to be unique: no separator may appear inside any MyObject instance.
Normally, the second option is more robust and flexible. Example:
final myList = getTheListOfMyObject();
final bytes = BytesBuilder();
for (int i = 0; i < myList.length; i++) {
  // Store the length prefix first (assuming here it fits in one byte).
  bytes.addByte(sizeOfMyObject(myList[i]));
  // Here starts the serialization of MyObject.
  // Only you know how it is structured, but
  // every member can be expressed as a series of bytes.
  // Remember to think about how to deserialize them
  // in order to recover MyObject from the sqflite BLOB.
  bytes.addByte(firstByteOfMyObject(myList[i]));
  bytes.addByte(secondByteOfMyObject(myList[i]));
  bytes.addByte(thirdByteOfMyObject(myList[i]));
  // and so on.
}
// Then retrieve the full stream as a Uint8List:
final blob = bytes.toBytes();
Map<String, dynamic> map = {
  'id': 1,
  'my_list': blob,
};
final res = await _db.insert('my_table', map);
The table for the example: "CREATE TABLE my_table(id INTEGER, my_list BLOB)"
It is better coding practice to implement the serialization/deserialization of MyObject inside the MyObject class itself. The database routines should retrieve the Uint8List from a getter before storing it, and should pass the bytes to a setter (or factory constructor) for the class to reconstruct the object. That way you keep the implementation details of the object properly encapsulated.
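As a sketch of that encapsulation (all names here, MyObject, its single name field, encodeList and decodeList, are placeholders for illustration, and the one-byte length prefix from the snippet above is widened to four bytes so objects longer than 255 bytes fit):

```dart
import 'dart:convert';
import 'dart:typed_data';

class MyObject {
  MyObject(this.name);
  String name; // placeholder member; your real fields go here

  // Serialize this instance to bytes.
  Uint8List toBytes() => Uint8List.fromList(utf8.encode(name));

  // Reconstruct an instance from bytes.
  factory MyObject.fromBytes(Uint8List bytes) => MyObject(utf8.decode(bytes));
}

// Option 2 from the answer: a length prefix before each object.
Uint8List encodeList(List<MyObject> list) {
  final out = <int>[];
  for (final obj in list) {
    final body = obj.toBytes();
    // 4-byte big-endian length prefix.
    out.addAll((ByteData(4)..setUint32(0, body.length)).buffer.asUint8List());
    out.addAll(body);
  }
  return Uint8List.fromList(out);
}

List<MyObject> decodeList(Uint8List blob) {
  final result = <MyObject>[];
  final view = ByteData.sublistView(blob);
  var offset = 0;
  while (offset < blob.length) {
    final len = view.getUint32(offset);
    offset += 4;
    result.add(
        MyObject.fromBytes(Uint8List.sublistView(blob, offset, offset + len)));
    offset += len;
  }
  return result;
}
```

The Uint8List returned by encodeList can be stored directly in the my_list BLOB column, and decodeList recovers the list from the bytes sqflite hands back.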

Related

How to copy a Map in dart without getting Reference of the old one

I have a map that I get from my service and I want to copy it into 2 new Maps without keeping references, because every time I change a value in one, the value changes in the other too.
This is the Initial Map from my service config.UserGroups and i copy it like this
SavedSettings.UserGroups = new Map.from(config.UserGroups);
UnSavedSettings.UserGroups = new Map.from(config.UserGroups);
This Map is typed dynamic, but it holds String keys and object values.
Is there an easy way to bypass the referencing?
What you are asking for is called deep copying, and it is something you need to implement yourself for your data structures, since List and Map just contain references to objects. So when you make a copy of a List or Map, you are copying the references, not the objects inside the collection.
So the correct solution would be to write some logic that makes copies of your UserGroup objects and puts those copies into your new map.
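For instance (the UserGroup class and its fields are stand-ins here, since the real class isn't shown), a copy constructor per class makes the deep copy explicit:

```dart
class UserGroup {
  UserGroup(this.name, this.members);
  String name;
  List<String> members;

  // Copy constructor: duplicates the object and its nested list.
  UserGroup.from(UserGroup other)
      : name = other.name,
        members = List.of(other.members);
}

// Deep-copies the map: new map, new UserGroup instances.
Map<String, UserGroup> deepCopyGroups(Map<String, UserGroup> source) =>
    source.map((key, group) => MapEntry(key, UserGroup.from(group)));
```

With something like this in place, the saved and unsaved copies of config.UserGroups no longer share state.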
But.... if you are not scared of hacky solutions....
Section with hacky solution
If you really want some way to get a deep copy from the Dart language (and don't care that the solution is kinda hacky), it is in theory possible by making use of the fact that objects sent to isolates are deep copied when possible.
So we can create a ReceivePort and send our object to ourselves. An example:
import 'dart:isolate';

class MyObject {
  int value;
  MyObject(this.value);

  @override
  String toString() => 'MyObject($value)';
}

Future<void> main() async {
  final list1 = [MyObject(5)];
  final list2 = await hackyDeepCopy(list1);
  list2[0].value = 10;
  print(list1); // [MyObject(5)]
  print(list2); // [MyObject(10)]
}

Future<T> hackyDeepCopy<T>(T object) async =>
    await (ReceivePort()..sendPort.send(object)).first as T;
This "solution"/hack comes with some limitations when it comes to data we can copy:
The content of message can be:
Null
bool
int
double
String
List or Map (whose elements are any of these)
TransferableTypedData
SendPort
Capability
In the special circumstances when two isolates share the same code and
are running in the same process (e.g. isolates created via
Isolate.spawn), it is also possible to send object instances (which
would be copied in the process). This is currently only supported by
the Dart Native platform.
https://api.dart.dev/stable/2.14.4/dart-isolate/SendPort/send.html
It is the "In the special circumstances when two isolates share the same code and are running in the same process" part we make use of in hackyDeepCopy, which allows us to also deep copy custom-made objects.
Another limitation is that hackyDeepCopy needs to be async which is kinda annoying if your code is not already async.
And lastly, this only really works on native platforms (so no Dart-compiled-to-JavaScript).

How do I get `Data` objects into Swift-NIO without making copies?

I'm fairly new to Swift and very new to NIO.
I'm adding Swift code to a large project that needs to up/down load a lot of data (GBs) to AWS. To that end, I've imported the GitHub project Soto, which relies heavily on NIO.
Most methods that send/receive data do so through ByteBuffer structs. My application already has the data to upload in Foundation Data objects. I'm having trouble figuring out the best way to get these Data objects into NIO.
In the documentation for NIO's ByteBuffer (2.26.0) it states
Supported types:
A variety of types can be read/written from/to a ByteBuffer. ... Out of the box, ByteBuffer supports for example the following types (non-exhaustive list):
String/StaticString
Swift’s various (unsigned) integer types
Foundation‘s Data
[UInt8] and generally any Collection of UInt8
However, the latest swift-nio package has no ByteBuffer support for Foundation Data objects. Instead, it supports DispatchData objects, which in turn seem to have no interoperability with Data objects.
What I want to avoid is making a copy of every block of data (100's of MB at a time), just to convert between Data and DispatchData types.
So...
Right now my thinking is one of
I'm completely lost, and there's a simple solution I haven't found
The solution is to create a subclass of DispatchData backed by a Data object
Initialize the ByteBuffer structure using a DispatchData created using the no-copy initializer pointing to the raw byte array in the Data object, along with a custom deallocator that simply retains the Data object until the ByteBuffer and DispatchData objects are destroyed.
I would appreciate any thoughts, experience, or suggestions (particularly if it's option #1).
You'll need to import NIOFoundationCompat to get any of NIO's methods that work with Foundation data types such as Data (or JSONDecoder/JSONEncoder). NIOFoundationCompat is just another module of the swift-nio package, so you won't need another dependency.
But just to be clear: under the hood there will always be copies, though you probably don't need to worry about them; copies are extremely fast on today's CPUs. If you absolutely want to avoid copies, you'll need to create ByteBuffers straight away. To help with that, you may want to add to your question where the data you want to send over the network comes from.
If you are concerned about memory usage and are uploading large buffers, perhaps you should be using AWSPayload.stream. This allows you to stream small ByteBuffers to AWS. Here is an example of streaming Data to S3 in 16 KB chunks:
func uploadData(_ data: Data) -> EventLoopFuture<S3.PutObjectOutput> {
    var index = 0
    let payload = AWSPayload.stream { eventLoop in
        let maxChunkSize = 16 * 1024
        let size = min(maxChunkSize, data.count - index)
        // are we done yet?
        if size == 0 {
            return eventLoop.makeSucceededFuture(.end)
        } else {
            // create a ByteBuffer with the next chunk and return it
            let byteBuffer = ByteBufferAllocator().buffer(data: data[index..<(index + size)])
            index += size
            return eventLoop.makeSucceededFuture(.byteBuffer(byteBuffer))
        }
    }
    let putRequest = S3.PutObjectRequest(body: payload, bucket: name, key: "tempfile")
    return s3.putObject(putRequest)
}

How to make copy of array's elements in the dart

Getting a weird issue where the main array changes if I change a value in another array. I think the issue is that both copies share the same address; not sure, but that's my guess. I have been trying for the last 3 hours but am unable to get rid of it.
Look at below illustration to get better idea.
List<page> _pageList;
List<page> _orderList = [];
_pageList = _apiResponse.result as List<page>;
_orderList.add(_pageList[0].element);
_orderList[0].title = "XYZ";
// --> Here if I change `_orderList[0].title` then it also changes the `title` inside `_pageList`
How can we prevent the changes in main array?
I got the same issue in one of my projects. What I did was use JSON to encode and decode the object; that helps you make a copy of the object so the main List will not be affected.
After the 3rd line of your code, make changes like below:
// Convert your object to a Map, and the Map back to a fresh object:
Elements copyObject = Elements.fromJson(_pageList[0].element.toJson());
_orderList.add(copyObject);
Hope that will help you.
You can use a getter function to create a copy of your list and use that instead of altering your actual list. Example:
List<Page> get orderList {
  return [..._orderList];
}
Lists in Dart store references for complex types, so this is intended behaviour.
From your code:
_orderList.add(_pageList[0].element);
_orderList[0] and _pageList[0].element point to the same reference (if they are non-primitive).
There is no general copy() or clone() method in Dart, as far as I know. So you need to copy the object yourself if you want a separate instance (see this question).
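A minimal sketch of copying the object yourself (the Page class with a single title field is assumed from the question; the copy() name is just a convention):

```dart
class Page {
  Page(this.title);
  String title;

  // Hand-written clone; Dart has no built-in Object.clone().
  Page copy() => Page(title);
}

void main() {
  final pageList = [Page('ABC')];
  // Add a copy instead of the original reference.
  final orderList = <Page>[pageList[0].copy()];
  orderList[0].title = 'XYZ';
  print(pageList[0].title); // ABC: the original is untouched
  print(orderList[0].title); // XYZ
}
```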

Lua light userdata

I have a problem with Lua and I don't know if I'm going in the right direction. In C++ I have a dictionary that I use to pass parameters to a resource manager. This dictionary is really similar to a map of hash to string.
In Lua I want to access these resources, so I need a representation of the hashes. The hashes must also be unique, because they are used as indices in a table. Our hash function is 64-bit and I'm working in a 32-bit environment (PS3).
In C++ I have something like this:
paramMap.insert(std::make_pair(hash64("vehicleId"), std::string("004")));
resourceManager.createResource(ResourceType("Car"), paramMap);
In Lua want use these resources to create a factory for other userdata.
I do stuff like:
function findBike(carId)
bikeParam = { vehicleId = carId }
return ResourceManager.findResource('car', bikeParam)
end
So, sometimes parameters are created by Lua, sometimes by C++.
Because my hash key ('vehicleId') is an index of a table, it needs to be unique.
I have used light userdata to implement uint64_t, but because I'm in a 32-bit environment I can't simply store an int64 in the pointer. :(
I have to create a table storing all the int64s used by the program and save a reference in the userdata.
void pushUInt64(lua_State *L, GEM::GUInt64 v)
{
    Int64Ref::Handle handle = Int64Ref::getInstance().allocateSlot(v);
    lua_pushlightuserdata(L, reinterpret_cast<void*>(handle));
    luaL_setmetatable(L, s_UInt64LuaName);
}
But light userdata are never garbage collected, so my int64s are never released and my table will grow forever.
Also, light userdata don't keep a reference to their metatable, so they interfere with other light userdata. Checking the implementation, the metatable is added in L->G_->mt_[2].
Doing this:
a = createLightUserDataType1()
b = createLightUserDataType2()
a:someFunction()
will use the metatable of b.
I thought that metatables were bound to types.
I'm pretty confused; with the current implementation, light userdata have a really limited use case.
In Python you have a hash metafunction that is called any time the type is used as an index for a dictionary. Is it possible to do something similar?
Sorry for my english, I'm from Italy. :-/

Drools Guvnor data enumeration API

From the Guvnor documentation, I know how to define a data enumeration and use it in Guvnor. Is it possible to fetch the data enumeration from my own Java code?
From Guvnor's documentation:
Loading enums programmatically: In some cases, people may want to load their enumeration data entirely from external data source (such as a relational database). To do this, you can implement a class that returns a Map. The key of the map is a string (which is the Fact.field name as shown above), and the value is a java.util.List of Strings.
public class SampleDataSource2 {

    public Map<String, List<String>> loadData() {
        Map<String, List<String>> data = new HashMap<>();
        List<String> d = new ArrayList<>();
        d.add("value1");
        d.add("value2");
        data.put("Fact.field", d);
        return data;
    }
}
And in the enumeration in the BRMS, you put:
=(new SampleDataSource2()).loadData()
The "=" tells it to load the data by executing your code.
Best Regards,
I hope it's not too late to answer this.
To load an enum from your application into Guvnor:
1. Build the enum class dynamically from a string (in my case the enum values are provided by the user via a GUI)
2. Add it to a jar and convert that to a byte array
3. POST it to Guvnor as an asset (model jar) via a REST call
4. Call the save repository operation (a change in the source code of Guvnor)
Now the enum will be visible as a fact in your rule window.
Editing/deletion of the model jar and validation of the rules afterwards is something you have to take care of.