Connect/read a FoxPro .dbf file from T-SQL

This seems like it should be easy enough, but I'm having trouble in what feels like the final stretch.
I want to connect to a FoxPro .dbf file.
1.) I've applied the following changes on my local SQL Server instance. All the code came from various Google results (I'm not an expert on this at all):
USE [master]
GO
sp_configure 'show advanced options', 1
RECONFIGURE
GO
sp_configure 'Ad Hoc Distributed Queries', 1
RECONFIGURE
GO
EXEC master.dbo.sp_MSset_oledb_prop N'VFPOLEDB', N'AllowInProcess', 1
RECONFIGURE
GO
EXEC master.dbo.sp_MSset_oledb_prop N'VFPOLEDB' , N'DynamicParameters' , 1
RECONFIGURE
GO
2.) When I run the following (which, by the sound of it, is where the magic should happen):
select * from
openrowset('VFPOLEDB','\\path_segment\path_segment\clock.dbf';'';'','SELECT * FROM clock')
I get an error:
OLE DB provider 'VFPOLEDB' cannot be used for distributed queries because the provider is configured to run in single-threaded apartment mode.
... and that's basically where my Google skills end. Any ideas on what I can do to get the above working?

Not sure if it can help, but I posted an answer in another thread for someone trying to convert VFP tables to SQL Server. The starting point I offered appeared to have him on the right track.
It deals with setting up a linked server using the VFP OLE DB provider (ensure you have the latest version). The connection string should point to the PATH where the .dbf files are located; you then query the tables by name (you do not explicitly need the .dbf suffix).
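For example, a minimal linked-server sketch along those lines might look like this (the linked server name VFPDATA and the folder path are placeholders for your own values):
EXEC master.dbo.sp_addlinkedserver
    @server = N'VFPDATA',           -- any name you like for the linked server
    @srvproduct = N'Visual FoxPro',
    @provider = N'VFPOLEDB',
    @datasrc = N'C:\Data\FoxPro'    -- the folder containing the .dbf files, not a single file
GO
-- Query a table by its file name, without the .dbf extension
SELECT * FROM OPENQUERY(VFPDATA, 'SELECT * FROM clock')
GO
Pointing @datasrc at the directory rather than at an individual .dbf file is what lets you query each table in that folder by name.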

Related

How to transfer Mongo Database from one remote server to another

I need to transition several databases from one remote, cloud-based server/service (modulus.io) to another (Compose.io). As far as I'm aware, I don't have console access on the target server, which seems to be required for using mongocopy or mongorestore. I have all of the credentials. How do I do this? What command should I use, or is there a tool designed for the purpose?
I'm currently trying to use mongodump to move the database to my local machine and then mongorestore it to the target machine. This is going very slowly; even for a modestly sized database (<2 GB), it looks like it will take most of a day to download.
Thanks
In the Compose.io Web UI, create a new DB and click "import". There you can choose the source DB to import. Works every time! :)
I don't think this feature is available on the free tier.

SQL Service Broker creating objects in SQL Server Database Project in VS 2012

So I've started a SQL Server database project inside VS 2012. I have done this for other databases already, but none of them involved Service Broker.
For testing I had already created the database, queues, etc. through a T-SQL script, including message types whose names are in an XML/URI format, i.e.
[//blah.com/Items/RequestItem]
When I try to do something like this in the DB project, it's not allowing me to, due to the special characters.
Anyone done this? Gotten around it?
Is there a way to simply put my already created T-SQL file in the database project and have it use it?
See my comment above. I was able to import the script by right-clicking on the database project.
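For what it's worth, plain T-SQL has no trouble with those URI-style names as long as they are bracket-delimited; here is a minimal sketch (the validation option and the contract/queue names are illustrative, not taken from the original project):
-- Bracket-delimited names may contain the // and . characters used by Service Broker conventions
CREATE MESSAGE TYPE [//blah.com/Items/RequestItem]
    VALIDATION = WELL_FORMED_XML;
GO
-- A hypothetical contract and queue that use the message type
CREATE CONTRACT [//blah.com/Items/RequestContract]
    ([//blah.com/Items/RequestItem] SENT BY INITIATOR);
GO
CREATE QUEUE dbo.RequestItemQueue;
GO
Importing a script like this into the project, as described above, sidesteps the designer's objection to the special characters.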

Import Excel 2010 into SQL Server

I use Excel to collect & configure data, then import it into SQL Server 2012 for storage.
So far I've been using the SQL Server Import & Export Wizard, but it is a pain to manually set it up constantly. Since I'm using Express, of course it won't allow me to save, or even view, the actual commands to transfer the data.
I tried to set up a linked server, per How to use Excel with SQL Server linked servers and distributed queries, but get the following error:
The linked server has been created but failed a connection test. Do you want to keep the linked server?
An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)
Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "FLTST".
OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "FLTST" returned message "Unspecified error". (Microsoft SQL Server, Error: 7303)
I thought perhaps the Excel version number was the problem, since the web page is from 2005, so I tried with:
Excel 8.0 (Excel 2002) as shown on the page
Excel 12.0 (Excel 2007) which is what the wizard seems to use
Excel 14.0 (Excel 2010) what I actually have
All of those gave me identical results.
Next I tried the distributed query as shown in Import excel file to SQL Server Express (again with different variations of the provider string):
USE ExTest
SELECT * INTO TstTbl FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 14.0;Database=c:\ExTest.xlsm', [Contacts])
go
Which gives me the following error:
OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" returned message "Unspecified error".
Msg 7303, Level 16, State 1, Line 3
Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)".
Instead of going to SQL Server & pulling the data in, should I stay in Excel & push it over?
What am I doing wrong?
PS: Please don't tell me to convert it to a csv file! I'm trying to do fewer steps, not more!
Having had similar issues to those in your question, I have done some research on this. My issue is not yet fully resolved, but I think I can get you one step further. Although the question is old, perhaps someone else needs the help.
By running:
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'Excel 12.0 Xml;HDR=YES;Database=P:\Path\File.xlsx','SELECT * FROM [Sheet1$]');
GO
I get the following error message:
Msg 15281, Level 16, State 1, Line 19
SQL Server blocked access to STATEMENT 'OpenRowset/OpenDatasource' of component 'Ad Hoc Distributed Queries' because this component is turned off as part of the security configuration for this server. A system administrator can enable the use of 'Ad Hoc Distributed Queries' by using sp_configure. For more information about enabling 'Ad Hoc Distributed Queries', search for 'Ad Hoc Distributed Queries' in SQL Server Books Online.
To resolve that I run the following:
sp_configure 'show advanced options', 1
RECONFIGURE
GO
sp_configure 'ad hoc distributed queries', 1
RECONFIGURE
GO
But I get a new error message:
Msg 7302, Level 16, State 1, Line 19
Cannot create an instance of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)".
To rectify that I run:
EXEC sp_MSSet_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'AllowInProcess', 1
GO
EXEC sp_MSSet_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'DynamicParameters', 1
GO
But I get this error instead:
Msg 7438, Level 16, State 1, Line 19
The 32-bit OLE DB provider "Microsoft.ACE.OLEDB.12.0" cannot be loaded in-process on a 64-bit SQL Server.
In my case I have asked the IT department to install a 64-bit version of Excel on the server, and I hope that will be the end of the technical problems when importing from Excel.
To clean up afterwards I disable the settings I just enabled:
EXEC sp_MSSet_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'AllowInProcess', 0
GO
EXEC sp_MSSet_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'DynamicParameters', 0
GO
sp_configure 'ad hoc distributed queries', 0
RECONFIGURE
GO
sp_configure 'show advanced options', 0
RECONFIGURE
GO
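For completeness, once a 64-bit ACE provider is available, the whole import can be done in a single statement along these lines, with the settings above enabled (the destination table dbo.Contacts and the file path are just examples):
-- Pull the sheet straight into a new table; SELECT ... INTO creates dbo.Contacts
SELECT *
INTO dbo.Contacts
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0 Xml;HDR=YES;Database=P:\Path\File.xlsx',
                'SELECT * FROM [Sheet1$]');
GO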
Create an SSIS package with an Excel data source connection manager; the destination is your SQL Express instance, via an OLE DB destination.
When you create the Excel connection manager, you can just use an existing Excel file.
Define a user variable, e.g. User::sourceFile, which is used to supply the Excel file's full path.
After the Excel connection manager is created, right-click it -> Properties -> find "Expressions", and assign your [User::sourceFile] variable to the expression.
Just create one simple data flow from your source to the destination.
Save and debug your SSIS package; make sure all credentials work and data can flow into the destination table. Note: don't save sensitive data in your package encrypted with the machine key.
Each time you need to load a new file, use DTEXEC to execute the package and override the variable.
Good luck!

Is there an easy way to set up ASP.NET Membership tables in a custom Database?

ASP.NET Membership is just great, as there is a ton of functionality right there to be used, and we don't need to change anything at all.
We can even create our own provider based on the Membership database, and that gives us infinite possibilities; for example, since I don't like the Question/Answer feature, I just use an email that is sent with a reset link.
But this is all done with a SQL Server Express .mdf file, and I want to use my own database for this so I can use the SQL Server Enterprise edition we have in the office and not the Express edition.
How can I easily use the ASP.NET Membership tables in my own database?
I remember that some years ago we needed to use aspnet_reg (something) to create the correct tables, but I can't find that info anymore.
I also tried to use other membership providers, namely Altairis.Web.Security from CodePlex, and watched the Chris Pels video on creating a new membership provider.
The Altairis solution's model is not complete and lacks several things, such as support for multiple applications (it's made to be used with only one), and Chris Pels's approach involves too many stored procedures that I would need to create by hand.
I'm up for giving Chris's code a go, but I just wanted to know whether something easier is available.
All this is to be integrated into an ASP.NET MVC 2 web application.
Thanks
You have 3 options:
Do it by running aspnet_regsql.exe: just open "Start Menu > All Programs > Microsoft Visual Studio 2010 > Visual Studio Tools > Visual Studio Command Prompt (2010)" and then type aspnet_regsql. A wizard appears and lets you select your desired database.
Do it via the API: use the System.Web.Management.SqlServices class and its Install and Uninstall methods. This will programmatically install/uninstall the database artifacts.
Do it manually: go to C:\Windows\Microsoft.NET\Framework\v4.0.30319 or similar. You will find 9 .sql files that begin with Install and 9 .sql files that begin with Uninstall. You can run them manually against your database to create the needed tables/stored procedures/etc., but consider changing the database name in the SQL scripts; the default database name is aspnetdb.
It's aspnet_regsql
Under the following path:
C:\windows\Microsoft.NET\Framework\v4.0.30319\aspnet_regsql
You need to open up the Visual Studio Command Prompt and navigate to C:\WINDOWS\Microsoft.NET\Framework\\aspnet_regsql.exe. Some examples are located in the MSDN documentation.
Example: aspnet_regsql.exe -E -S localhost -A mr - installs the database elements for membership and role management on the local computer running SQL Server using Windows authentication.

Database migrations: manage with build script or automatic on app startup?

I'm in the process of developing a deployment system for a new web app and I'm wondering where the best point in the process to manage database migrations is (the question of how to do the migrations is another problem entirely).
It seems there are two ways to go:
1. Use a migration script that can either be run manually from the command line or as part of the automatic deployment/build process
2. Run the migrations when the app starts up (I'm using ASP.NET, so this can be done easily enough without causing a long-running user request)
Does anyone have any suggestions/insight/experience with these approaches? Any other suggestions?
I can see why #1 might be more attractive - it gives me complete control over when the DB is updated. However, I quite like #2 as it allows me to quickly iterate between deployments and reduces the manual process. #2 could also be used on my development machine to allow even quicker iterations. Hmm, starting to think having both might be a good thing...
We have a sales-force system with ~100 clients, and we update the database at application startup (true, ours is a desktop application). I like this approach; it's safe and iterative when the starting point is indeterminate (is the client database new, or only updated to version x.y.z?).
But on the server side I prefer your #1 option: we create a SQL query file on our virtual machine (based on a copy of the original database) and run that query against the real server.
So IMHO:
Disconnected clients: startup, iterative scripts
Server: query created on a VM based on the actual, real database
I'm interested in this problem too, and have found some (half-)frameworks such as RikMigrations. After some googling, a good starting place on DB versioning/migration frameworks is the .NET Database Migration Tool Roundup. Not necessarily the documentation, but the team blogs can be interesting.
I like option #1 better as it seems much more flexible. In lieu of actually performing migrations on each app start, I think I would verify that the database schema (version number?) matches the code, and if not, throw a warning or error about a mismatched database schema.
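As a rough sketch of that check, assuming the deployment scripts maintain a single-row version table (the table, column, and version number below are made up):
-- Hypothetical table kept up to date by the migration/deployment scripts
CREATE TABLE dbo.SchemaVersion (CurrentVersion int NOT NULL);
GO
-- At startup the application reads this value and compares it with the schema
-- version compiled into the code, warning or failing on a mismatch
IF (SELECT CurrentVersion FROM dbo.SchemaVersion) <> 42   -- 42 = the version the code expects
    RAISERROR('Database schema does not match the application version.', 16, 1);
GO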
I'd prefer option #1 for a number of reasons. First, integration tests usually require your DB schema to be up to date, and launching a website just to upgrade the schema will be a huge time waster. Second, you cannot change the database schema while your site is running (say, to add a couple of indexes to speed things up).
As for the production side of things, upgrading your database as part of a transactional, MSI-style installation is much better than attempting to upgrade at each app startup, since with the latter you can end up with desynchronized database and application versions.
And if you're looking for the migration framework, take a look at Wizardby.
If the application ever has to run on a customer's machine, then migrating at startup can prevent a lot of support calls - assuming you can do a seamless migration without user intervention (I hope you aren't normally running your web app with permission to modify the database).
If the application always runs under your control, automatic migration is less of an issue - but it can still be a good feature, especially if you want to minimize downtime and manual deployment steps.