Merging QuickBooks (QBW) files and generating a report of the differences

I have a customer who accidentally wrote about 3 MB of data to the wrong QuickBooks file. They had a backup in the same folder for reasons unknown; however, their accountant was still writing to the old file. Now we have roughly a 3 MB difference between two ~250 MB QBW files, and I need to figure out how to merge them (which QuickBooks does not support) and generate some sort of report so they can get their accounting info semi-straightened out in some organised fashion. Any help would be appreciated. Thank you for taking the time to read this.
(EDIT) To explain the last few sentences above: they have conflicting invoice numbers, and possibly other conflicts, because both files continued to be used.

Karl Irvin has a Data Transfer Utility that can be used to transfer transactions and list items between QBW files. www.q2.us - his tools are widely used and very reliable.
He also has a report combiner tool if all you want to do is see reports drawn from the data in the two files.
QQube (www.clearify.com) can also generate reports from multiple QBW files.

Related

Multiple users working remotely on Tally ERP9

I am not sure whether this is the right forum to ask this question.
We have a Tally ERP 9 server with multiple licenses. Three of our users now work remotely on the same data. We have set up Google Drive for data syncing, but most of the time it causes issues because of the synchronisation process.
What would be the best solution so that multiple users can work on the same data from remote locations?
This is the answer: http://mirror.tallysolutions.com/Downloads/TallyTips/GettingStartedwithDataSynchronisation.pdf
Thanks to @MitaleeRao.
Edit placed here for brevity:
These are 2 points I've noted regarding the Tally architecture:
The database is a flat file in a tree structure, and there are numerous checkpoints at each level for maintaining this inheritance (e.g., a voucher has inventory entries, which have stock items, which have units, etc.).
The SOAP XML protocol that Tally uses does not have multi-threading capabilities - i.e., the Tally server will only accept one request and give a response at a time.
The data synchronisation that Tally has introduced probably automates exporting the XML of all masters/vouchers and importing it into the central database (whether on the Tally.NET server or on a local computer with a static IP). I'm not sure how the Google Drive setup works, but I assume it is a variation of the same approach (i.e., XML-based data export followed by import onto a main computer).
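To illustrate the export half of that round trip, here is a minimal sketch of a single request against Tally's HTTP/XML interface. The port, report name, and envelope tags are assumptions that depend on your Tally version and configuration:

```powershell
# Sketch: ask a running Tally instance to export a report as XML over its HTTP interface.
# Port 9000 is the usual default; "Day Book" and the envelope tags are assumptions and may
# need adjusting for your installation.
$request = @'
<ENVELOPE>
  <HEADER>
    <TALLYREQUEST>Export Data</TALLYREQUEST>
  </HEADER>
  <BODY>
    <EXPORTDATA>
      <REQUESTDESC>
        <REPORTNAME>Day Book</REPORTNAME>
        <STATICVARIABLES>
          <SVEXPORTFORMAT>$$SysName:XML</SVEXPORTFORMAT>
        </STATICVARIABLES>
      </REQUESTDESC>
    </EXPORTDATA>
  </BODY>
</ENVELOPE>
'@

# One request, one response at a time - which is why several users syncing through a shared
# Google Drive folder end up stepping on each other.
$response = Invoke-WebRequest -Uri "http://localhost:9000" -Method Post -Body $request -ContentType "text/xml"
$response.Content | Set-Content .\daybook-export.xml
```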

Exporting individual Congos Reports via command line

I'm trying to work out how I can export individual Cognos Reports via the command line, for the purposes of source versioning in Git at a report-by-report level. I presume XML would be the output format.
I read that the Cognos SDK can help, but you need to build your own solution, which may be possible; still, this use case feels like something many others would already want, so I expected tooling to exist already.
Of course, importing the individual report would also be needed.
Can anyone help here please?
Thanks.
If your end game is version control (Who changed what, when?), you should look into MotioCI. Last time I looked, there was no free version of MotioCI.
You can use tools like the ones provided by companies like http://www.motio.com. With the free version you can export the XML of the reports but only one by one.
You can also create a Cognos deployment of the reports, which generates a zip file containing the report XML, but all of the reports end up in the same file and you will have to extract the XML of the individual reports by hand.
I found the SDK to be cumbersome and, when I got it working, slow.
Yes, report specs are XML.
I have created a process that produces output like what you are asking for. Here's what it involves:
A recursive common table expression (CTE) query to get the report specs along with the folder structure as seen in Cognos.
A PowerShell script to run the query and write the results to the file system.
Another PowerShell script to pull the current content from the remote git repo, run the first PowerShell script, then add, commit, and push the results up to the remote git repo.
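To give a sense of the shape of the first two pieces, here is a rough sketch only (not my actual scripts). The Content Store table and column names (CMOBJECTS, CMOBJNAMES, CMOBJPROPS7.SPEC), the root test, the connection string, and the output path are assumptions that vary by Cognos version, so verify them against your own database, and treat the Content Store as strictly read-only:

```powershell
# Sketch: read report specs out of the Content Store (read-only!) and write one .xml file per
# report, mirroring the Cognos folder path. Table/column names and the root test are assumptions.
$connectionString = "Server=SQLSERVER;Database=ContentStore;Integrated Security=True"
$query = @'
WITH folders (CMID, PATH) AS (
    SELECT o.CMID, CAST(n.NAME AS nvarchar(4000))
    FROM   CMOBJECTS o
    JOIN   CMOBJNAMES n ON n.CMID = o.CMID AND n.ISDEFAULT = 1
    WHERE  o.CMID = o.PCMID                      -- assumed root object
    UNION ALL
    SELECT o.CMID, CAST(f.PATH + '/' + n.NAME AS nvarchar(4000))
    FROM   CMOBJECTS o
    JOIN   folders f   ON o.PCMID = f.CMID AND o.CMID <> o.PCMID
    JOIN   CMOBJNAMES n ON n.CMID = o.CMID AND n.ISDEFAULT = 1
)
SELECT f.PATH, p.SPEC
FROM   folders f
JOIN   CMOBJPROPS7 p ON p.CMID = f.CMID          -- assumed location of the report spec XML
WHERE  p.SPEC IS NOT NULL
'@

$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $query
$reader = $command.ExecuteReader()
while ($reader.Read()) {
    # Turn the Cognos path into a file-system path and write the spec there.
    $relative = ($reader["PATH"] -replace '[:*?"<>|]', '_') + ".xml"
    $target   = Join-Path "C:\CognosSpecs" $relative
    New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
    Set-Content -Path $target -Value $reader["SPEC"] -Encoding UTF8
}
$reader.Close()
$connection.Close()
```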
I also wrote a PowerShell script to perform the operations associated with git push. This involves using a program I found called HTML Tidy (http://tidy.sourceforge.net/) that can be used to make the XML human-readable. This helps with diffs in git. I use TFS, so I get a nice, side-by-side diff if I have tidied the XML. (Otherwise, it tells me the only line of XML has changed.)
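The Tidy call itself is small; for example (assuming tidy.exe is on the PATH, and the exact options you want may differ):

```powershell
# Pretty-print every report spec in place so git shows line-level diffs instead of
# "the one giant line changed". -xml = treat input as XML, -indent, -modify = in place,
# -quiet, -wrap 0 = don't re-wrap long lines.
Get-ChildItem C:\CognosSpecs -Recurse -Filter *.xml | ForEach-Object {
    & tidy -xml -indent -modify -quiet -wrap 0 $_.FullName
}
```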
I recently added output for dashboards (exploration) and data sets (dataSet2). Dashboards are stored as JSON, so my routine had to tidy that (simple in PowerShell).
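For the JSON, a round trip through PowerShell's converters is enough; something like this (the file name is just an example):

```powershell
# Pretty-print a dashboard's JSON so diffs stay readable.
$path = "C:\CognosSpecs\Dashboards\Sales Overview.json"
(Get-Content $path -Raw | ConvertFrom-Json) | ConvertTo-Json -Depth 100 | Set-Content $path -Encoding UTF8
```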
I run my routine daily, getting new and modified content from the last 3 days (just in case), and weekly to do an entire dump (to capture the deletes). The weekly process takes about six minutes. The daily process is negligible.
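The routine's wrapper is not much more than this sketch (the repo path, remote, branch, and script name are placeholders):

```powershell
# Sketch of the daily wrapper: sync the repo, refresh the specs, then commit and push any changes.
Set-Location C:\CognosSpecs
git pull origin master
.\Export-ReportSpecs.ps1    # hypothetical name for the extraction script sketched above
git add -A
git commit -m ("Cognos spec sync {0:yyyy-MM-dd}" -f (Get-Date))   # does nothing (and exits non-zero) if nothing changed
git push origin master
```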
Before you ask: beyond rough sketches like the ones above, I hesitate to provide my actual code because I can't take any responsibility for your system.
Updates:
Hacking away at the Content Store database is not recommended and it is not supported by IBM.
For reference/comparison: I'm running IBM Cognos 11.0.7 on IIS on Windows 2012 R2 with the Content Store database on MS SQL Server 2016. Your system may be different.
Additional Resources
https://www.cognoise.com/index.php/topic,28289.msg113869.html#msg113869
https://www.cognoise.com/index.php/topic,17411.msg50409.html#msg50409
https://learn.microsoft.com/en-us/powershell/scripting/overview?view=powershell-6
https://learn.microsoft.com/en-us/sql/t-sql/language-reference?view=sql-server-2017
https://git-scm.com/docs
http://tidy.sourceforge.net/

Advice about SharePoint migration from 2007 to 2010

I have to migrate default and custom metadata on a forms library.
What I need to migrate from the forms library is:
- 70 columns
- 15 linked lists (master data)
- 750 records
And from the task lists I have to migrate the following:
- 35 columns
- 15 content types
- 2000 records
I was told that I should write a PowerShell script, and that's what they are expecting from me. Can anyone help me with how to extract this information via PowerShell and export it to SharePoint 2010? I have never used PowerShell, but it does not seem difficult since it resembles C#.
Any help will be appreciated.
Thank you.
If you are looking at extracting columns, I would look at Gary Lapointe's stsadm extensions; I think he has a few that will be able to help you.
For example, I think gl-exportsitecolumns and gl-exportsitecontenttypes (these may not be exact, but they will be close).
Worry about getting the columns and content types in place first; once you have those, you could just use the datasheet view to copy the data across.
PowerShell help with the columns:
http://get-spscripts.com/2011/01/export-and-importcreate-site-columns-in.html
PowerShell for the content types:
http://secretsofsharepoint.com/cs/blogs/tips/archive/2011/08/24/adding-content-types-using-powershell.aspx
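As a very rough starting point for the columns, something along these lines may help (a sketch only: the URLs, column group filter, and file name are placeholders, it assumes each column's schema XML fits on one line, and you should try it on a non-production site first). Content types can be handled with a similar export/import of their XML definitions:

```powershell
# --- On the SharePoint 2007 server: dump the schema XML of your custom site columns ---
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
$site = New-Object Microsoft.SharePoint.SPSite("http://old-server/sites/forms")
$web  = $site.RootWeb
# "Custom Columns" is a placeholder group name - change it to whichever group your columns use.
$web.Fields |
    Where-Object { $_.Group -eq "Custom Columns" } |
    ForEach-Object { $_.SchemaXml } |
    Set-Content .\SiteColumns.xml
$web.Dispose(); $site.Dispose()

# --- On the SharePoint 2010 server: re-create the columns from that XML ---
Add-PSSnapin Microsoft.SharePoint.PowerShell
$web = Get-SPWeb "http://new-server/sites/forms"
Get-Content .\SiteColumns.xml | ForEach-Object {
    [void]$web.Fields.AddFieldAsXml($_)   # one Field element per line in this simple sketch
}
$web.Dispose()
```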
Hopefully this should give you most of what you are looking for :)

Given a code base hosted on TFS, which command can tell me which file has changed the most?

I want to find the files under a given directory that have been updated the most. Is there a command that can display this info? Or is there a way to get the maximum version count for a given file, so I can write a script to collect this for every file and then sort descending?
Do you mean changed the most number of times, or undergone the most code churn?
Either way, looking at the report data might be the easiest option for you. Take a look at the following blog post I wrote explaining how to use Excel to look at TFS data; it uses churn as the example and lets you drill down into folders and files, so you should be able to get the data you are looking for.
Getting Started with the TFS Data Warehouse
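If you would rather script it from the command line than go through the warehouse, a sketch along these lines might get you started (untested against your collection; it assumes tf.exe is on the PATH, that you run it from inside a mapped workspace, and that the detailed history format lists each changed item on its own indented line):

```powershell
# Count how many changesets touched each file under the current folder, most-changed first.
# /format:detailed lists affected items on lines like "  edit $/Project/Dir/File.cs".
tf history . /recursive /noprompt /format:detailed |
    ForEach-Object {
        if ($_ -match '^\s+(add|branch|delete|edit|merge|rename|undelete)[\w, ]*\s+(\$/.*)$') {
            $Matches[2]
        }
    } |
    Group-Object |
    Sort-Object Count -Descending |
    Select-Object -First 20 Count, Name
```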

How to Increase the Max Length of the Product Description in Magento (when importing)?

Edit: This was a bogus question. The problem was that I had quotes inside my description field. The entire field should be wrapped in one set of quotes with none inside; I changed the inner quotes to apostrophes to fix it. Magento is working correctly.
I am using a Profile in the Import/Export section of my Magento admin to import a CSV document.
My description fields are very long (around 10 KB each). Two issues are occurring:
On the published product, only the first 50% or so of the description is present.
The Magento system does not import the next column on the import document (brief description).
Does anybody know how to fix this?
The problem is much more likely the application you export your product data with, i.e. the one that creates the CSV.
Did you check whether the CSV contains the full description prior to importing it? Maybe the application only allows a certain number of characters in a column and truncates the rest.
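One quick way to check is to re-parse the CSV and look at the field lengths before importing; for example (a sketch only, the file and column names are assumptions to adjust to your export):

```powershell
# Verify the exported CSV still holds the full text before blaming Magento's import.
# Assumes a products.csv with "sku", "description" and "short_description" columns.
Import-Csv .\products.csv |
    Select-Object sku,
        @{ Name = 'description_length';       Expression = { $_.description.Length } },
        @{ Name = 'short_description_length'; Expression = { $_.short_description.Length } } |
    Sort-Object description_length |
    Format-Table -AutoSize
```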
I believe I read there is a bug (not necessarily confirmed) in the CSV import: if your 'short_description' is more than a sentence or two long, it causes problems elsewhere. You've got very long descriptions, but you didn't mention how short your short descriptions are. Could you try importing with a one-sentence 'short_description' and see what happens?
I'm not sure of the protocol for recommending a commercial product here, but there's a Windows program (I run it in VMware) that does imports/exports with a direct connection to the Magento database, skipping the long-winded Dataflow API. I've imported products with it in much faster time frames without issue. I've never had to deal with long descriptions, though. It's not cheap at $200, but the time saved has been worth it for me. It's the first result for 'magento manager' in Google.
Have you confirmed, by creating a single product with a huge description by hand, that Magento doesn't choke on it?