Umbraco: Storing spatial data - lucene.net

I'm researching Umbraco for use as a base in a large CMS project; however, the project calls for the SQL Server 2008 database to store spatial data against content.
Being new to Umbraco, I'm still reading through the documentation and slowly building up an idea of its architecture. However, so far it doesn't look like Umbraco supports the storage of spatial data.
There appear to be only four database datatype options: date, integer, ntext and nvarchar.
Is it possible to store spatial data to the database?
Update: Further research into how Umbraco works has shown me I was on the wrong track. It seems the way to do this is to store the lat/long data inside the usual XML format Umbraco uses, and then to use the Spatial.net extensions that have been built on top of Lucene.net, rather than the limited search capabilities Examine exposes.
However, this is all still theoretical; I've just not been able to achieve it yet. If I do before someone answers this question, I'll post my findings here to help others.

You could take a look at how to make user controls (with Visual Studio) in Umbraco.
It is also possible that the versatility of Umbraco 'Document Types' is enough for you.
It is possible to extend Umbraco in almost any way to get the solution you want. I don't know how you want the spatial data to interact with your frontend, so it is difficult to provide a direct solution.

Although there are ways to store spatial data and perform queries against it using Spatial.net, it's not a very elegant solution.
Instead, I've created an additional table in SQL Server 2008 with the geometry/geography datatype and a reference to the Umbraco content it's connected with.
I've then got an event hook which updates this table whenever content is added, updated or deleted.
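As a rough sketch of that approach (not the exact code I used): this assumes Umbraco 6/7-style ContentService events (older versions expose similar Document events), a hypothetical UmbracoSpatial table with NodeId and a geography Location column, and made-up "latitude"/"longitude" property aliases.

```csharp
using System.Configuration;
using System.Data.SqlClient;
using Umbraco.Core;
using Umbraco.Core.Events;
using Umbraco.Core.Models;
using Umbraco.Core.Services;

// Keeps a separate UmbracoSpatial table (NodeId int, Location geography) in sync
// with content saves and deletes. Table name and property aliases are assumptions.
public class SpatialSyncEvents : ApplicationEventHandler
{
    protected override void ApplicationStarted(UmbracoApplicationBase umbracoApplication,
                                               ApplicationContext applicationContext)
    {
        ContentService.Saved += OnContentSaved;
        ContentService.Deleted += OnContentDeleted;
    }

    private static void OnContentSaved(IContentService sender, SaveEventArgs<IContent> e)
    {
        foreach (var content in e.SavedEntities)
        {
            // Only sync content that actually carries coordinates.
            if (!content.HasProperty("latitude") || !content.HasProperty("longitude"))
                continue;

            var lat = content.GetValue<double>("latitude");
            var lng = content.GetValue<double>("longitude");

            using (var conn = new SqlConnection(
                ConfigurationManager.ConnectionStrings["umbracoDbDSN"].ConnectionString))
            using (var cmd = new SqlCommand(
                @"MERGE UmbracoSpatial AS t
                  USING (SELECT @nodeId AS NodeId) AS s ON t.NodeId = s.NodeId
                  WHEN MATCHED THEN
                      UPDATE SET Location = geography::Point(@lat, @lng, 4326)
                  WHEN NOT MATCHED THEN
                      INSERT (NodeId, Location)
                      VALUES (@nodeId, geography::Point(@lat, @lng, 4326));", conn))
            {
                cmd.Parameters.AddWithValue("@nodeId", content.Id);
                cmd.Parameters.AddWithValue("@lat", lat);
                cmd.Parameters.AddWithValue("@lng", lng);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }

    private static void OnContentDeleted(IContentService sender, DeleteEventArgs<IContent> e)
    {
        foreach (var content in e.DeletedEntities)
        {
            using (var conn = new SqlConnection(
                ConfigurationManager.ConnectionStrings["umbracoDbDSN"].ConnectionString))
            using (var cmd = new SqlCommand(
                "DELETE FROM UmbracoSpatial WHERE NodeId = @nodeId", conn))
            {
                cmd.Parameters.AddWithValue("@nodeId", content.Id);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```

With the coordinates in their own geography column, you can then run normal SQL Server spatial queries (STDistance, STIntersects, etc.) joined back to the Umbraco node ids.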

Related

Displaying web-based visualisation or graphing of data based on a PostgreSQL database?

I am working on a web application for a client that uses a PostgreSQL database. I want the client to be able to go to a certain area of the site where the data from the database is displayed in graph form (for example, sales figures over a 6-month period). Is there a plugin I could use for this? (I don't have any experience of this, so an easy one, or one with tutorials available, would be great.) I had a look at BIRT, which says it has a web-based option, but I couldn't really figure it out. I don't want the client to have to download and go through another program; I just want them to go to a URL within their site and have it all presented to them there and then.
Any sort of pointers in the right direction would be greatly appreciated.
Thank you.
HighCharts, at http://www.highcharts.com/, works well for this case; I use it fairly often. It supports Ajax data feeds in JSON format, so you can write an endpoint which returns the JSON representing the data from Postgres and which gets called from a JavaScript function that creates the graphs using that data (you would place that call in a ready function).
Also, if you're using Postgres 9.3 or higher, it supports JSON natively, so you can do the JSON conversion in the SQL query itself, as opposed to post-processing the results in your Python or other backend code.
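To make that concrete, here's a rough sketch of such an endpoint. The question doesn't say which backend you use, so purely for illustration this assumes an ASP.NET MVC controller with the Npgsql driver, a connection string named "postgres", and a made-up sales table with sale_date and amount columns.

```csharp
using System.Configuration;
using System.Web.Mvc;
using Npgsql; // PostgreSQL ADO.NET provider

public class SalesChartController : Controller
{
    // Returns e.g. [{"month":"2016-01","total":1234.5}, ...] for a HighCharts Ajax call.
    public ContentResult MonthlySales()
    {
        using (var conn = new NpgsqlConnection(
            ConfigurationManager.ConnectionStrings["postgres"].ConnectionString))
        {
            conn.Open();

            // Postgres 9.3+ can aggregate the rows into JSON itself (json_agg),
            // so no serialisation is needed in the backend code.
            using (var cmd = new NpgsqlCommand(
                @"SELECT COALESCE(json_agg(t), '[]'::json)
                  FROM (SELECT to_char(sale_date, 'YYYY-MM') AS month,
                               SUM(amount) AS total
                        FROM sales
                        GROUP BY 1
                        ORDER BY 1) t;", conn))
            {
                var json = (string)cmd.ExecuteScalar();
                return Content(json, "application/json");
            }
        }
    }
}
```

A ready handler on the page can then fetch that URL (for example with jQuery's getJSON) and assign the result to the chart's series data.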
Highcharts is reasonably flexible and allows for a variety of nice-looking, functional charts and graphs. If you want to get much fancier, d3 may be worth a look. These are some of the types of graphs/charts it can do: https://github.com/mbostock/d3/wiki/Gallery
I have not used d3 myself, however.
For the scenario you described above, Highcharts seems like it would work just fine.
It's been a while, and a lot has happened since 2016. There is now ChartJS as well (http://chartjs.org/), for example, which is easier to use than HighCharts and very flexible (I've used both).
What neither of them does is dynamic data. If you want your client to be able to decide which data to view, you'll need to write that part yourself.

Creating a Metadata Catalog in MarkLogic

I am trying to combine data from multiple sources such as RDBMSs, XML files and web services using MarkLogic. From what I see in the MarkLogic documentation on the Metadata Catalog (https://www.marklogic.com/solutions/metadata-catalog/), Data Virtualization (https://www.marklogic.com/solutions/data-virtualization/) and Data Unification, this is very well possible. But I am not able to get hold of any documentation describing how exactly to go about it, or which tools to use to achieve this.
Looking for some pointers.
As the second image in the data-virtualization link shows, you need to ingest all data into MarkLogic databases. MarkLogic can then be put in between to become the single entry point for end user applications that need access to that data.
The first link describes the capabilities of MarkLogic to hold all kinds of data. It partly does so by storing them as-is, partly by extracting text and metadata for searching, and partly by conversion (if your needs go beyond what the original format allows).
MarkLogic provides the general-purpose MarkLogic Content Pump (MLCP) tool for ingestion. It allows ingesting zipped or unzipped files, and applying transformations if necessary. If you need to retrieve your data from a different database, you might need a bit more work to get it out. http://developer.marklogic.com holds tutorials, blogs, and tools that should help you get going. Searching the MarkLogic mailing list through http://marklogic.markmail.org/ can provide answers as well.
HTH!
Combining a lot of data is a very broad topic. Can you describe a couple types of data you'd like to integrate, and what services or queries you would like to build on that data?

RavenDB and Inserting data

I recently started using RavenDB. I am converting a relational database to use RavenDB. I have two simple tables in the relational database:
tbStates
tbCities
I have all US cities linked to a state. How can I go about converting this to NoSQL? Will I have to write a little application to read from the relational database and create the objects? Or are there some tools out there I can use to do this?
There is a utility called Smuggler (http://ravendb.net/documentation/smuggler), but I imagine you will have to convert your data to JSON. It may be just as easy to write a console app that reads the tables into objects and then loads them into Raven.
Just to add, I migrated a SQL Server database to RavenDB using the console application route.
I used EF to quickly pull out the data, converted it to my RavenDB domain model, and then added it to RavenDB.
It worked well, as you will most likely want to tweak the domain anyway to work best with RavenDB (for example, I had an Images SQL table that I turned into a List on the document, etc.).
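A minimal sketch of that console-app approach might look something like this. The State/City entities, the document shape and the local RavenDB URL are illustrative assumptions only; your own domain will differ.

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using Raven.Client;
using Raven.Client.Document;

// EF entities mapped onto the existing relational tables (names are assumptions).
public class State { public int Id { get; set; } public string Name { get; set; } public virtual List<City> Cities { get; set; } }
public class City  { public int Id { get; set; } public string Name { get; set; } public int StateId { get; set; } }

public class LegacyContext : DbContext
{
    public DbSet<State> States { get; set; }
    public DbSet<City> Cities { get; set; }
}

// The RavenDB document: a state with its cities embedded, instead of two joined tables.
public class StateDocument
{
    public string Id { get; set; }          // left null so RavenDB assigns an id
    public string Name { get; set; }
    public List<string> Cities { get; set; }
}

class Program
{
    static void Main()
    {
        using (IDocumentStore store = new DocumentStore { Url = "http://localhost:8080" }.Initialize())
        using (var db = new LegacyContext())
        using (var session = store.OpenSession())
        {
            foreach (var state in db.States.Include(s => s.Cities))
            {
                session.Store(new StateDocument
                {
                    Name = state.Name,
                    Cities = state.Cities.Select(c => c.Name).ToList()
                });
            }
            session.SaveChanges(); // one round trip persists all the documents
        }
    }
}
```

The interesting decision is the document shape: because a city rarely makes sense without its state, embedding the cities in the state document (rather than keeping two "tables") is usually the more natural RavenDB model.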
See Ayende's RaccoonBlog project on GitHub (https://github.com/ayende/RaccoonBlog), as he does something similar to move Subtext data to RavenDB. RaccoonBlog is the engine powering his blog and makes for good learning material about how to use RavenDB.

Which framework or CMS for a Google-Video-like site?

I am working on a web project similar to Google Video.
For now, I want to start coding the site.
I know some PHP, HTML and MySQL.
I already have:
Database built and ready (in MySQL)
Links and Tags in the Database
The thing is, I don't want to code everything by hand.
From what I've seen so far, with a CMS it's not possible to use my own database. Or am I wrong?
And what framework would you suggest?
Looking forward to your advice!
Thanks
You should probably start over, but use your existing DB design as your logical schema to be implemented in the CMS you eventually choose.
Go to http://cmsmatrix.org/ and compare Drupal, Joomla!, eZ Publish and TYPO3 for the best fit for your requirements.
Also, pay attention to the search engine features available with each one; e.g. eZ Publish's eZ Find is based on Lucene.
In terms of functionality (but excluding ad management and your specific layout or graphic design), you should be able to create a reasonable clone within a few hours using eZ Publish. Here is one example: http://untoldstories.eu/ezinfo/about

JasperReports and custom data sources

I'm looking at embedding JasperReports into an existing web app for reporting. The web app sits on top of an existing database which is ancient and complex, and really not suitable for report writers to write reports against directly.
What I want to look at is writing some kind of wrapper around our existing data access layer (written to make our life easier when talking to the aforementioned ancient and complex DB). Does anyone have any experience of writing custom data sources for JasperReports, or of doing anything like this?
Updated
I guess I probably wasn't clear in my question, which is probably because my requirements aren't clear either. I want to provide some way for end users to use something like iReport to author reports against the database, and then to use JasperReports Server for scheduling and viewing of the reports. However, the database is really, really nasty and was never designed for use in this way. We've got an access layer around it that the webapp uses to talk to it. I want to keep my end users away from the DB altogether, and the idea of a custom data source that used the access layer seemed a good option. However, I've found very little documentation on how to do that. Maybe it's just a whole lot easier than I think it is, and I'm just trying to make a dead simple thing too complicated.
Updated
Thanks for the answers. I don't think my problem has been solved, but I think the answers have helped to inform the requirements phase.
JasperReports allows you to use a "JavaBean" data source. You can load your data into any JavaBean structure and build the reports against that. It works well.
See the "Custom Data Source" section here.
Every JasperReports template can use two different kinds of data sources. One is hooking it directly to a database using a JDBC driver; the other, in your case, is providing a collection of Java beans (POJOs), usually a List.
A JasperReports template is similar to a method definition. It has a name (i.e. the compiled JR object) and parameters (a data source and a list of input parameters of some of the most popular Java types).
My suggestion is to use the iReport tool. Open one of the examples that come with the JasperReports bundle, analyze it and tweak it. It's not so complicated.
UPDATE
Letting customers author JasperReports templates, compile them and add them to the classpath means that you'll need to open up your system too much. Usually, clients provide a description of a desired report and the developer(s) create the data source and design the template. JasperReports can have parameters; if these parameters are exposed through the UI, users can change the behavior of reports at runtime.
If you really need to allow more flexibility, then use the API provided by JasperReports for authoring templates. I could imagine some simple DSL for advanced users to communicate with your system, creating reports on the fly.