Import CSV into existing realm - swift

Is there a way to import a CSV file into an existing realm? I know Realm Browser can import CSV, but it imports into a new copy of the realm, which I don't want.
My idea is to create my own converter class that reads the CSV and dumps the data into the existing realm, but please suggest an easier way if there is one.
Currently, my app has an existing realm at the default location.
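A small converter class along those lines is straightforward with the Realm Swift API, since a realm opened at the default location can simply have objects added to it inside a write transaction. A minimal sketch, assuming a hypothetical Person object type (use whatever your realm already contains) and a simple comma-separated file with a header row and no quoted fields:

import Foundation
import RealmSwift

// Hypothetical model; substitute the object types your existing realm already uses.
class Person: Object {
    @objc dynamic var name = ""
    @objc dynamic var age = 0
}

func importCSV(at url: URL) throws {
    let realm = try Realm()          // opens the existing realm at the default location
    let csv = try String(contentsOf: url, encoding: .utf8)
    let rows = csv.components(separatedBy: .newlines)
        .dropFirst()                 // skip the header row
        .filter { !$0.isEmpty }

    try realm.write {
        for row in rows {
            // Naive split; does not handle quoted fields that contain commas.
            let fields = row.components(separatedBy: ",")
            let person = Person()
            person.name = fields[0]
            person.age = Int(fields[1]) ?? 0
            realm.add(person)        // appended to the existing data, not a new copy
        }
    }
}

For anything beyond trivial files (quoted fields, embedded commas or newlines), pulling in a dedicated CSV parser instead of the naive split is worth it.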

Related

Watson Discovery -- Import a Model with a Object Storage connection

I have an SDU model created and I need to import this model into another collection. The documentation explains that it is necessary to add one document and then import the model.
https://cloud.ibm.com/docs/services/discovery?topic=discovery-sdu#import
The problem is that my collection is connected to the Object Storage service with more than 1000 documents, so it is not possible to add only one document and then import the model.
I imported the model, but SDU doesn't recognize it. Is it possible to import a model with this type of connection?
Thanks.
After doing the initial crawl of Object Storage you should get the option to import the model. If you are getting an error when importing, please post it here for further debugging. (I am an IBM Watson employee.)

How do I update data in magical records

I am using Core Data in my Swift project with MagicalRecord, and I want to edit data that has already been saved.
I am fetching the task to edit like this:
array = Tasks.MR_findByAttribute("task_name", withValue: entryLabel.text)
I get the data I want to edit, but I don't understand how to modify it and save it in place of the old record. Can anyone please tell me the syntax?
In MagicalRecord, you modify data the same way you would with plain Core Data:
- fetch
- change attributes / relationships
- save
The most commonly used API in MagicalRecord for this is the class method saveWithBlock. In the block you fetch your entities and modify them.
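A rough sketch of that pattern in Swift, assuming the Tasks entity and task_name attribute from the question; the exact Swift spelling of the MagicalRecord calls depends on the version you have installed:

MagicalRecord.saveWithBlock({ localContext in
    // Re-fetch the record inside the saving context
    if let task = Tasks.MR_findFirstByAttribute("task_name",
                                                withValue: entryLabel.text ?? "",
                                                inContext: localContext) as? Tasks {
        // Change whatever attributes or relationships you need
        task.task_name = "New task name"
    }
}, completion: { success, error in
    // The change has been persisted by the time this runs
})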

Core Data: move object from one persistent store to another

I'm using Core Data in my app and would like to export only some of the data and import it on some other device.
To avoid migration issues, I'd like to do the following:
Export:
- create a second export.sqlite file with the same database model, but empty
- add that file with addPersistentStoreWithType
- copy some managed objects over to that .sqlite file
- remove the added persistent store
Import:
- copy the export.sqlite file into the app
- add that .sqlite file with addPersistentStoreWithType
- copy the data over
- remove the added persistent store
But how do I achieve that? i.e. how can I tell my managed object to copy itself into the other store?
how can I tell my managed object to copy itself into the other store?
You can't, not directly anyway. You'll have to do something like:
For each object in the origin data store:
- create a new object in the target store with the same entity type
- assign the new object's attributes to the same values as the original object
Once you're done creating new objects, do a second pass to set up any relationships.
The relationships need to be done separately, because all of the objects in a relationship need to exist before you can create the relationship.
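A rough sketch of that first pass in Swift, assuming both stores hang off the same coordinator and model as described in the question. Names such as exportObjects and exportURL are placeholders, and the second pass for relationships is left out:

import CoreData

// First pass: add the empty export store, then clone each object's plain
// attributes into it. Relationships are wired up in a second pass (not shown).
func exportObjects(_ objects: [NSManagedObject],
                   from context: NSManagedObjectContext,
                   coordinator: NSPersistentStoreCoordinator,
                   to exportURL: URL) throws {
    let exportStore = try coordinator.addPersistentStore(ofType: NSSQLiteStoreType,
                                                         configurationName: nil,
                                                         at: exportURL,
                                                         options: nil)
    for source in objects {
        let copy = NSEntityDescription.insertNewObject(forEntityName: source.entity.name!,
                                                       into: context)
        context.assign(copy, to: exportStore)   // pin the clone to the export store
        for (name, _) in source.entity.attributesByName {
            copy.setValue(source.value(forKey: name), forKey: name)
        }
    }
    try context.save()
    try coordinator.remove(exportStore)         // detach the filled export store again
}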

Add imported classes to metadata

All,
I'm importing a group of classes for sqlalchemy from a separate file. They define the tables on my DB (inheriting from declarative_base()), and were originally located in the same file as my engine and metadata creation.
Since I have quite a few tables, and each of them is complex, I don't want them located in the same file I'm using them in. It makes working in the file more unwieldy, and I want a clearer delineation, since the classes document the current schema.
I refactored them into their own file, and suddenly the metadata does not find them automatically. Following this link, I found that it was because my main file declares a Base:
from sqlalchemy import MetaData
from sqlalchemy.ext.declarative import declarative_base
from tables import address, statements

Base = declarative_base()
metadata = MetaData()
Base.metadata.create_all()
And so does my tables file:
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class address(Base):
    ...
So, as far as I can tell they get separate "bases" which is why the metadata can't find and create the declared tables. I've done some googling and it looks like this should be possible, but there isn't any obvious way to go about it.
How do I import tables defined in a separate file?
Update:
I tried the following, and it sort of works.
In the tables file, declare a Base for the table classes to import:
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
Then in the main file, import the preexisting base and give its metadata to a new Base:
from sqlalchemy.ext.declarative import declarative_base
from tables import Base as tableBase
Base = declarative_base(metadata=tableBase.metadata)
After some more testing, I've found this approach leaves out important information. I've gone back to one file with everything in it, since that does work correctly. I'll leave the question open in case someone can come up with an answer that works correctly, or alternatively point to the appropriate docs.

Export object from one app to another: XML or Encoding?

I have a fairly complex Core Data database with many entities, attributes and relationships.
I need to take an NSManagedObject subclass object (or its data) and export it to another instance of the app. This other instance needs to then import it into its local database.
I have figured out how to attach files to emails; however, I am not sure whether I should serialise the object to XML or encode it using dictionaries.
It seems like a huge job either way; does anyone have any suggestions?
I don't know much about this, but using dictionaries seems like the better option to me: since both apps use the same structure, you spare yourself the XML writing and parsing entirely. Hope this helps.
Update:
See this answer: https://stackoverflow.com/a/1375120/919545
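If you go the dictionary route, the work is essentially flattening each managed object's attributes into a property-list dictionary before attaching it to the email. A minimal sketch covering attributes only (relationships would have to be encoded separately, for example as arrays of such dictionaries keyed by some identifying attribute):

import CoreData

func attributeDictionary(for object: NSManagedObject) -> [String: Any] {
    var dict: [String: Any] = ["entity": object.entity.name ?? ""]
    for (name, _) in object.entity.attributesByName {
        if let value = object.value(forKey: name) {
            dict[name] = value
        }
    }
    return dict
}

The resulting dictionaries can be written out with PropertyListSerialization (or JSONSerialization, if every attribute type is JSON-friendly) and attached to the email; on the receiving side, insert a new object for the stored entity name and call setValuesForKeys(_:) with the remaining keys.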