As far as I can tell, react-leaflet does not support using a coordinate reference system (CRS) other than Leaflet's default (EPSG:3857). Is this the case?
react-leaflet's documentation contains no references to CRS.
From what I see in the source, the Map component passes props through, but it's not obvious to me whether it would work (or whether it's a "good idea"™) to instantiate a Leaflet.CRS out-of-band and pass that through.
Seems like you've answered your own question. As you note, the props get passed through to Leaflet's map function, which does support a custom CRS. It's probably not a good idea unless you know how you will use the custom CRS.
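For what it's worth, here is a minimal, untested sketch of what that might look like, assuming a react-leaflet version whose Map component forwards its props to L.map (as discussed above); L.CRS.Simple is one of Leaflet's built-in CRSs, and the tile URL is a placeholder:

import React from 'react';
import { render } from 'react-dom';
import L from 'leaflet';
import { Map, TileLayer } from 'react-leaflet';

render(
    // crs is assumed to be passed straight through to L.map's options;
    // any Leaflet CRS instance created out-of-band could be used here.
    <Map center={[0, 0]} zoom={1} crs={L.CRS.Simple}>
        <TileLayer url="tiles/{z}/{x}/{y}.png" />
    </Map>,
    document.getElementById('map')
);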
Can anybody explain the usage of EAnnotation in Ecore, in terms of its specific fields, i.e.:
Source,
Details,
EModel Element,
Contents, and
References?
I looked at its API documentation here, but could not get much from it.
I am looking for a kind of guideline that explains, by example, the purpose of having annotations with such fields in Ecore.
I asked this question in the Eclipse EMF forum, and here is an answer.
There is also an answer to this question here.
However, I will briefly provide an answer myself:
Generally, EAnnotations are used in Ecore to encode any kind of information that is not captured by Ecore in the first place. For example, they are used in OCLinEcore to hold OCL constraints, or in a genmodel to store code-generation-related information, and so on.
Source identifies the type of the annotation. It is usually populated with a URI that uniquely identifies the annotation type.
Details is a set of (key, value) pairs holding the detailed information for this annotation; this is where the annotation data are actually stored.
The remaining fields are (quoted from here):
EModel Element
"An EAnnotation is itself an EModelElement, so can also be annotated. It's not often used, but would allow you to build a complex structure."
Contents
"EAnnotations can contain arbitrary other objects. This is also not often used, and if you do use it, you can't generate a normal XyzPackageImpl but must ensure that the GenPackage specifies 'Initialize by Loading'..."
References
"EAnnotations can refer to arbitrary other objects. This is also not often used, and the same caveat applies as for the contents..."
I would like to know if anyone has implemented the subjectScheme maps of DITA 1.2 in their work. If yes, can you please break up an example to show:
how to do it?
when not to use it?
I am aware of the theory behind it, but I have yet to implement it, and I wanted to know if there are things I must keep in mind during the planning and implementation phases.
An example is here:
How to use DITA subjectSchemes?
The DITA 1.2 spec also has a good example (3.1.5.1.1).
What you can currently do with subject scheme maps is:
define a taxonomy
bind the taxonomy to a profiling or flagging attribute, so that the attribute only takes a value that you have defined
filter or flag elements that have a defined value with a DITAVAL file.
Advantage 1: Since you have a taxonomy, filtering a parent value also filters its children, which is convenient.
Advantage 2: You can fully define and thus control the list of values, which prevents tag bloat.
Advantage 3: You can reuse the subject scheme map in many topic maps, in the usual modular DITA way, so you can apply the same taxonomies anywhere.
These appear to be the main uses for a subject scheme map at present.
The only disadvantage I have found is that I can think of other hypothetical uses for subject scheme maps, such as faceted browsing, but I don't think any implementation exists; the DITA-OT doesn't have anything like that yet, anyway.
I'm currently prototyping a small project in Plone and trying to KISS as much as possible while the requirements are still in flux. To that end, I've resisted creating any custom content types for now and have been using marker interfaces to distinguish between "types" of content.
Now that I'm looking at workflow, I've realised that workflows are bound to types, and there doesn't seem to be a mechanism for assigning them to markers. I think I could wrap portal_workflow with my own version that looks for markers and returns the appropriate workflow if found; however, this doesn't feel like a tenable approach.
Is there a way of assigning workflow to markers that I've missed, or should I just bite the bullet and create some lightweight custom content types instead?
There's not really a built-in feature to use markers, but at http://www.martinaspeli.net/articles/dcworkflows-hidden-gems, Martin Aspeli hints that it is possible:
Note that in Plone, the workflow chain of an object is looked up by
multi-adapting the object and the workflow to the IWorkflowChain
interface. The adapter factory should return a tuple of string
workflow names (IWorkflowChain is a specialisation of IReadSequence,
i.e. a tuple). The default obviously looks at the mappings in the
portal_workflow tool, but it is possible to override the mapping, e.g.
in response to some marker interface.
I've run into a frustrating feature of KVO: all notifications are funneled through a single method (observeValueForKeyPath:....), requiring a bunch of IF statements if the object is observing numerous properties.
The ideal solution would be to pass a method as an argument to the method that establishes the observing in the first place, but it seems this isn't possible. Does a solution exist to this problem? I initially considered using the keyPath argument (addObserver:forKeyPath:options:context:) to call a method via NSSelectorFromString, but then I came across the post KVO Dispatcher pattern with Method as context and the article it links to, which offers a different solution that also passes arguments along (although I haven't gotten that working yet).
I know a lot of people have come up against this issue. Has a standard way of handling it emerged?
OP asks:
Has a standard way of handling it emerged?
No, not really. There are a lot of different approaches out there. Here are some:
https://github.com/sleroux/KVO-Blocks
http://pandamonia.github.io/BlocksKit
http://www.mikeash.com/pyblog/friday-qa-2012-03-02-key-value-observing-done-right-take-2.html
https://github.com/ReactiveCocoa/ReactiveCocoa
http://blog.andymatuschak.org/post/156229939/kvo-blocks-block-callbacks-for-cocoa-observers
Seriously, there are a ton of these... Google "KVO blocks"
I can't say that any of the options I've seen seems prevalent enough to earn the title "standard way". I suspect most folks who feel motivated to conquer this issue just pick one and go with it, or write their own -- it's not as if adapting KVO to use block-based callbacks is rocket science. The Method-based approach you link to doesn't seem like a step forward for simplicity. I get that you're trying to take the uncertainty of the string-based key path <-> method conversion out of the equation, but that kind of falls down because not all observable keys/keyPaths are methods. (If nothing else, you can observe arbitrary keys on NSMutableDictionaries and get notifications.)
It sure would be nice if Apple would release a new blocks-based KVO API, but I'm not holding my breath. In the meantime, like I said, just pick one you like and use it, or write your own and use that.
I'm looking for an end to end example using dojo.store with dijit.Tree over REST.
There are many existing examples that use the older dojo api, dojo.data.api, but a dearth of ones using the dojo.store api.
Is the reason that dijit.Tree doesn't fully support dojo.store yet?
If so, do I need to use the dojo.data.ObjectStore wrapper to encapsulate dojo.store for use with dijit.Tree?
I saw one example of working around this by extending StoreFileCache:
http://dojo-toolkit.33424.n3.nabble.com/New-object-store-and-dijit-Tree-td2680201.html
Is that the recommended option, or should I
a) stick to dojo.data.api until dijit.Tree supports dojo.store directly, or
b) use the dojo.data.ObjectStore wrapper
Thanks
There is now a tutorial on the DTK website that seems to cover pretty much exactly this topic.
http://staging.dojotoolkit.org/documentation/tutorials/1.6/store_driven_tree/
However, since linking to something without giving an answer is considered poor practice, here is the general idea: rather than wrapping your dojo.store-based store in a dojo.data.ObjectStore and then potentially shoving it through a ForestStoreModel, you can simply augment the store with the methods that the Tree will look for. Here's a simple example from the tutorial:
var usGov = new dojo.store.JsonRest({
    target: "data/",
    mayHaveChildren: function(object){
        // see if it has a children property
        return "children" in object;
    },
    getChildren: function(object, onComplete, onError){
        // retrieve the full copy of the object
        this.get(object.id).then(function(fullObject){
            // copy to the original object so it has the children array as well
            object.children = fullObject.children;
            // now that we have the full object, we should have an array of children
            onComplete(fullObject.children);
        }, onError);
    },
    getRoot: function(onItem, onError){
        // get the root object; we will do a get() and call back with the result
        this.get("root").then(onItem, onError);
    },
    getLabel: function(object){
        // just get the name
        return object.name;
    }
});
It's worth noting that in this case we're making some assumptions about what the data looks like. You'd need to know how your children relate and customize the methods above for that purpose, but hopefully it's fairly clear how to do that for yourself.
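From there, the augmented store can be handed to the Tree directly as its model, since it now provides the methods the Tree expects. A minimal sketch, assuming Dojo 1.6-style loading and an existing placeholder node with id "treeNode":

dojo.require("dijit.Tree");

// usGov implements getRoot, mayHaveChildren, getChildren and getLabel,
// so it can stand in for the tree model directly.
var tree = new dijit.Tree({
    model: usGov
}, "treeNode"); // "treeNode" is an assumed DOM node id
tree.startup();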
You can also just stick to the dojo.data APIs for now, but this approach definitely feels more lightweight. It takes a couple of layers out of the stack, and customizing a dojo.store-based store is much easier.
Given the two options you outlined, I'd say it's a matter of how well you know the different APIs.
dojo.store is more lightweight, and perhaps easier to understand, but the wrapper adds some overhead. If you think your project will live for a long time, this is probably the best way to go.
dojo.data is a legacy API, which will be phased out eventually. If your project is short-lived and only based on dijit.Tree, this might be your best option for now.
Personally, I'd go with dojo.store and write my own TreeStoreModel to get the best of both worlds. This approach is very similar to Brian's suggestion.
In case you're interested, I've written a two-part series on how to use dijit.Tree with the ObjectStore wrapper and on implementing a JsonRest backend in PHP.
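For a rough feel of what the wrapper route (option b in the question) can look like, here is an untested sketch; the store target, query, and attribute names are placeholders:

dojo.require("dojo.store.JsonRest");
dojo.require("dojo.data.ObjectStore");
dojo.require("dijit.tree.ForestStoreModel");
dojo.require("dijit.Tree");

// wrap the dojo.store-based store in the legacy dojo.data API
// so the dojo.data-based tree model can consume it
var objectStore = new dojo.data.ObjectStore({
    objectStore: new dojo.store.JsonRest({ target: "data/" })
});

var model = new dijit.tree.ForestStoreModel({
    store: objectStore,
    query: { type: "root" },        // placeholder query for the top-level items
    rootLabel: "Root",
    childrenAttrs: ["children"]     // placeholder name of the children attribute
});

var tree = new dijit.Tree({ model: model }, "treeNode"); // assumed DOM node id
tree.startup();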