Can I run LocalDB "in-memory"?

I am using LocalDb in an integration testing environment - instantiating and disposing of my instance before and after my tests run.
However, when I create a database within my instance, it still flushes my tables and data to disk. Is it possible to run LocalDb in an "in-memory" mode? And if so how?

You could run your tests against a database on a RAM disk. I did that a while ago and it improved integration-test performance by a factor of 2 or 3! That was on a Windows 7 VM hosted on a MacBook Pro with an SSD; the speedup will probably be even greater if you have a mechanical HDD.
Since SQL Server lets you specify the mdf file in a connection string via "AttachDbFileName=", you can leverage that by pointing to an mdf on a RAM disk.
The RAM disk device driver I used was ImDisk, which is available for both 32-bit and 64-bit Windows. Linux counterparts are numerous.
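As a rough sketch, assuming the ImDisk command-line tool and a 2012-era LocalDB instance name — the drive letter, size, instance name, and file paths below are all placeholders:

```bat
rem Create a 512 MB NTFS-formatted RAM disk mounted at R: (ImDisk syntax)
imdisk -a -s 512M -m R: -p "/fs:ntfs /q /y"

rem Then point the test connection string at an mdf on the RAM disk, e.g.:
rem Server=(localdb)\v11.0;AttachDbFileName=R:\IntegrationTests.mdf;Integrated Security=True
```

Everything on the RAM disk vanishes on reboot or when the disk is detached, which is usually exactly what you want for throwaway test databases.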

No. LocalDB is still SQL Server, and SQL Server has no concept of an in-memory database. All databases are disk-backed, with the in-memory cache sitting on top of them.
You could probably add a custom step to your test harness that drops the databases and deletes the database files after your tests complete. It may even already be there if you're using TFS for builds and test runs. But there's nothing in LocalDB to make it automatic.
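For example, a teardown step could use the SqlLocalDB command-line utility to tear the throwaway instance down and then remove the leftover files; the instance name and paths here are placeholders:

```bat
rem Stop and delete the throwaway LocalDB instance after the test run
sqllocaldb stop TestInstance
sqllocaldb delete TestInstance

rem Remove the database files the tests left behind
del /q C:\Tests\Data\*.mdf C:\Tests\Data\*.ldf
```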

Related

How to restore PostgreSQL database?

My Windows Server was damaged, so I installed a new OS and PostgreSQL 12.
The folder \PostgreSQL\12\data survived intact.
I have no database dumps.
Is it possible to restore the database from that folder alone?
As long as you're using the same architecture (32-bit vs. 64-bit), and it's still on Windows Server, you should be able to copy that folder to the data location of the new install and start the service. Just be prepared for a longer first start if the database wasn't shut down cleanly, since it will have to replay the transaction logs before coming online.
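A hedged sketch of those steps on Windows — the service name, recovered-data path, and service account all vary by installation, so treat these as placeholders:

```bat
rem Stop the new PostgreSQL 12 service before touching its data directory
net stop postgresql-x64-12

rem Set aside the fresh data directory and drop in the surviving one
move "C:\Program Files\PostgreSQL\12\data" "C:\Program Files\PostgreSQL\12\data.fresh"
xcopy /e /i "D:\recovered\PostgreSQL\12\data" "C:\Program Files\PostgreSQL\12\data"

rem Make sure the service account can read and write the copied files, then start
icacls "C:\Program Files\PostgreSQL\12\data" /grant "NT AUTHORITY\NetworkService":(OI)(CI)F
net start postgresql-x64-12
```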

Read-only program on CD, no installation permitted, Firebird database

I have to develop software that ships on a CD. The program must run directly from the CD, without being installed on the computer; installing or copying from the disc must be impossible. The program is something like a language-learning application. Can I use the .NET (C#) environment for such a program? And which databases can I use? (I have heard about Firebird.)
You could use Firebird. However, since the CD is read-only, you would need a connection to an external Firebird server to perform any write operations on the data. Another solution would be to store the data on the client computer, for instance by restoring the database into a profile folder. It is unclear to me whether changes and learning progress need to be saved.
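If you go the local-copy route, Firebird's embedded mode avoids a server entirely; a sketch of a connection string for the FirebirdSql.Data.FirebirdClient ADO.NET provider, where the paths and credentials are placeholders and the database file must live somewhere writable (e.g. under the user's profile):

```ini
User=SYSDBA;Password=masterkey;Database=C:\Users\me\AppData\Roaming\LanguageApp\lessons.fdb;ServerType=1
```

ServerType=1 selects the embedded engine, so the client library opens the .fdb file in-process instead of talking to a server.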

ODP.NET deployment without installation

I want to deploy a client application that uses Oracle's ODP.NET, but I don't want to install ODP.NET on every machine. Rather, I'd like to copy the managed Oracle.DataAccess.dll to every machine and keep the native DLLs it depends on available on a shared disk.
By decompiling Oracle.DataAccess.dll I have seen that it calls a method that reads the location of the native DLLs from the registry. So, in addition to copying Oracle.DataAccess.dll to every machine, I would have to add registry keys pointing to the native DLLs on the shared disk.
My question: do you foresee any problems arising from this technique of ODP.NET deployment?
The only files you need from the latest client are:
Oracle.DataAccess.dll
oci.dll
oraociicus11.dll
OraOps11w.dll
Just make sure they get copied to the output directory and everything will work. Nothing needs to be registered anywhere. You will, however, need separate x86 and x64 builds with the respective architecture's DLLs, since an Any CPU .NET application runs in 32-bit mode on a 32-bit OS and in 64-bit mode on a 64-bit OS.
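In a Visual Studio project, one way to get that copy step is to add the native DLLs as content items; a sketch of the .csproj fragment, using the file names listed above:

```xml
<ItemGroup>
  <!-- Copy the unmanaged Oracle client DLLs next to the exe on every build -->
  <Content Include="oci.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
  <Content Include="oraociicus11.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
  <Content Include="OraOps11w.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>
```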
1) ODP.NET is currently a mixture of managed and unmanaged DLLs. It also relies on lower-level unmanaged DLLs from the Oracle client, e.g. for networking.
2) You will need all these required ODP.NET and client DLLs on each machine you deploy to.
3) One potential solution to make this easier on you is to look into the "XCOPY" deployment package. See the ODP.NET download page. This is a smaller install and allows you to write your own custom installer. You can include these XCOPY files as part of your own install.
4) Oracle will be doing a beta of a fully managed provider in 2012 which will make this situation much better (and the total size will be a couple megabytes only).
Christian Shay
Oracle
Since they're unmanaged, I'd assume they'd be OK on a network path, though that should be easy enough to test. However, rather than changing the registry, I'd suggest you might be better off changing the DllPath config setting as described here.
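For reference, the DllPath setting lives in the application's config file rather than the registry; a sketch, with the share path as a placeholder:

```xml
<configuration>
  <oracle.dataaccess.client>
    <settings>
      <!-- Point ODP.NET at the unmanaged Oracle DLLs on the shared disk -->
      <add name="DllPath" value="\\fileserver\oracle\bin" />
    </settings>
  </oracle.dataaccess.client>
</configuration>
```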

Best Self-Hosting Solution for TFS 2010?

I want to install TFS 2010 on my own machine - a Dell Laptop with 8GB RAM, running Windows 7. Now, since installing on Win7 means I can't run SharePoint or Reports, and I don't want to reformat my machine to Win 2008, I need to virtualize.
I would like something that I can have always on, and treat like a server on my LAN, or at the very least, something that I can activate quickly, when needed. Oh, and I'd like it to be free :).
As far as I can tell, my options are MS Virtual PC, VirtualBox, and VMware.
What would be my best option? Are there any other options?
Thanks,
Assaf
You can use either MS Virtual PC or VMware. I have been using TFS 2010 installed on MS Virtual PC and it's working fine.
If you want to use the full 8 GB of RAM, you'll want to either use VMware or repave your machine as Windows Server 2008 R2 (saving the TFS databases first) and use Hyper-V.
You can then install TFS 2010 again but point it at the restored databases. You'll then be able to enable SharePoint and Reporting for the restored TFS instance.
I've run it in a VM on my dev box and the performance wasn't the best. Memory and disk I/O are very important when running SQL Server, and the competition with multiple instances of Visual Studio, plus the overhead of VMware, made it unbearable. With enough memory and RAID or an SSD, you may be okay.
I know it's not free, but there are a few hosted solutions that are decently priced (TFS Server Hosting). They also allow you to access it from anywhere and your code will be backed up.

Setting up staging/qa/dev environments on Windows Server 2008

I am working on a large scale ASP.NET web app.
The system is large enough to warrant monitoring systems, build scripts, a source control server, etc.
I now want to setup a proper development environment whereby I have a development server, QA and staging.
I am going to be setting up Windows Server 2008 Standard Edition x64 (I have 4 GB of RAM and want the OS to see all of it).
Would I just set up a VM for each environment? One issue this raises: at the moment all my software is on Vista. It would be good for each VM to have only the software it needs (e.g. I won't need Visual Studio on staging, as I shouldn't change code there), but I guess this can't be done? Should source control live in a central location rather than inside one of the environments (e.g. dev)? Something like:

Source control server
  |
  v
DEV
  |
  v
QA
  |
  v
Staging

so that each environment stays independent of the others.
How do you go about this?
Your idea of using 4 VMs should work well.
I don't see a database server of any type. I've found DBs do better on their own physical machine.
Add more memory! With 4 GB split across four VMs, each machine will be starved; you want every VM to have enough memory of its own.