I've searched but haven't been able to find an answer to this question. Currently our DB uses prefixes for its table names, e.g. tblUsers. I've updated the EF templates to remove the "tbl" from the generated class names; however, I still can't figure out how to change the output file name to match.
Is it possible or am I asking for the moon? I’m using EF Power Tools Beta 3 in VS 2012. Any help would be GREATLY appreciated!
Patrick, what you need to do is modify the T4 templates used by EF Power Tools. When you want to create a code-first model with all the mappings, choose the Customize Reverse Engineer Templates option instead of Reverse Engineer Code First. You should get three files:
Context.tt
Entity.tt
Mapping.tt
For example, in Mapping.tt there is a line that reads MetadataProperties from the TableSet and extracts the table name. The line looks like this:
var tableSet = efHost.TableSet;
var tableName = (string)tableSet.MetadataProperties["Table"].Value ?? tableSet.Name;
This is where you need to make changes and do something like:
var newTableName = tableName.Replace("tbl", String.Empty);
Of course, you should opt for a safer strategy and use the Substring method or a regular expression so that only a leading "tbl" prefix is removed. After that, go through the .tt file and apply your logic, deciding where to keep tableName and where to use the newTableName variable: keep tableName wherever the mapping targets the table in the database, and use newTableName wherever the name is used for your POCO classes and file names.
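For illustration, one possible way to write that logic inside the template is shown below; the variable names come from the snippet above, and the regular expression is just one option:
// Strip only a leading "tbl" prefix; a plain Replace("tbl", "") would also
// remove the substring from the middle of names such as "tblUserTables".
var newTableName = System.Text.RegularExpressions.Regex.Replace(tableName, "^tbl", string.Empty);
// Equivalent Substring-based version:
// var newTableName = tableName.StartsWith("tbl") ? tableName.Substring(3) : tableName;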
Repeat the process for the other two files. For more information have a look at Rowan Miller's blog article. This should give you a pretty good idea how to proceed.
I have a PostgreSQL database. I had to extend an existing, big table with a few more columns.
Now I need to fill those columns. I thought I could create a .csv file (out of Excel/Calc) which contains the IDs/primary keys of the existing rows and the data for the new, empty fields. Is it possible to do this? If so, how?
I remember doing exactly this pretty easily using Microsoft SQL Server Management Studio, but for PostgreSQL I am using pgAdmin (though I am of course willing to switch tools if it would be helpful). I tried using the import function of pgAdmin, which uses PostgreSQL's COPY, but it seems like COPY isn't suitable, as it can only create whole new rows.
Edit: I guess I could write a script which loads the CSV and iterates over the rows, issuing an UPDATE for each. But I don't want to reinvent the wheel.
Edit2: I've found this question here on SO, which provides an answer using a temp table. I guess I will use it, although it's more of a workaround than an actual solution.
PostgreSQL can import data directly from CSV files with COPY statements; however, as you stated, this only works for new rows.
Instead of creating a CSV file you could just generate the necessary SQL UPDATE statements.
Suppose this is the CSV file:
PK;ExtraCol1;ExtraCol2
1;"foo";42
4;"bar";21
Then just produce the following:
UPDATE my_table SET ExtraCol1 = 'foo', ExtraCol2 = 42 WHERE PK = 1;
UPDATE my_table SET ExtraCol1 = 'bar', ExtraCol2 = 21 WHERE PK = 4;
You seem to be working on Windows, so I don't really know how to accomplish this there (probably with PowerShell), but under Unix you could easily generate the SQL from the CSV with tools like awk or sed. An editor with regular-expression support would probably suffice too.
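Alternatively, if you'd rather stay inside PostgreSQL, the temp-table approach mentioned in Edit2 of the question works as well: COPY the CSV into a staging table and update everything in one statement. A rough sketch, with the table and column names assumed from the example above:
-- Staging table matching the CSV layout (names are assumptions for illustration)
CREATE TEMP TABLE staging (pk integer, extracol1 text, extracol2 integer);
-- Load the CSV (server-side path shown; from psql you could use \copy instead)
COPY staging FROM '/path/to/data.csv' WITH (FORMAT csv, DELIMITER ';', HEADER true);
-- Fill the new columns of the existing rows in a single pass
UPDATE my_table t
SET    extracol1 = s.extracol1,
       extracol2 = s.extracol2
FROM   staging s
WHERE  t.pk = s.pk;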
The question pretty much sums it up. I've got to replace text in a large number of stored procedures. It's not so many that doing it manually is impossible, but enough that I'm asking the question. I also prefer automation, as it reduces the chance of user error when we make the change in production.
I can identify them like this:
select OBJECT_DEFINITION(object_id), *
from sys.procedures
where OBJECT_DEFINITION(object_id) like '%''MyExampleLiteral''%'
order by name
Is there any way to mass update them all to change 'MyExampleLiteral' to 'MyOtherExampleLiteral'?
I'd even settle for a way to open all the stored procs; just finding them in a larger list will take some time.
I thought about generating ALTER statements using the above SELECT, but then I lose line breaks.
Thanks in advance,
This is Microsoft SQL Server.
There are different tools to use depending on the database in question. For example, Microsoft SQL Server Data Tools integrates with Visual Studio and allows you to do these types of operations fairly easily. The database is stored in your solution as scripts, in which you can then search and replace any keyword you wish. I'm assuming similar tools are available for other platforms.
You could do this with dynamic sql. Query the system tables to get all the SPs containing your "MyExampleLiteral":
SELECT [object_id] FROM sys.objects o
WHERE type_desc = 'SQL_STORED_PROCEDURE'
AND is_ms_shipped = 0
AND OBJECT_DEFINITION(o.[object_id]) LIKE '%<search string>%'
Then, write a while loop to go through those object_ids. In the while loop, get the OBJECT_DEFINITION() into a string and replace the "MyExampleLiteral", then replace CREATE PROCEDURE with ALTER PROCEDURE and execute the string using sp_executesql.
If you're doing something this crazy, make sure you back up the database first.
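A rough T-SQL sketch of that loop, using a cursor over the matching object_ids (the literals are the ones from the question, and the REPLACE of CREATE PROCEDURE assumes the definitions spell it exactly that way):
DECLARE @object_id int, @definition nvarchar(max);
DECLARE proc_cursor CURSOR FOR
    SELECT o.[object_id]
    FROM sys.objects o
    WHERE o.type_desc = 'SQL_STORED_PROCEDURE'
      AND o.is_ms_shipped = 0
      AND OBJECT_DEFINITION(o.[object_id]) LIKE '%''MyExampleLiteral''%';
OPEN proc_cursor;
FETCH NEXT FROM proc_cursor INTO @object_id;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Pull the current definition, swap the literal, and turn CREATE into ALTER
    SET @definition = OBJECT_DEFINITION(@object_id);
    SET @definition = REPLACE(@definition, 'MyExampleLiteral', 'MyOtherExampleLiteral');
    SET @definition = REPLACE(@definition, 'CREATE PROCEDURE', 'ALTER PROCEDURE');
    EXEC sp_executesql @definition;
    FETCH NEXT FROM proc_cursor INTO @object_id;
END
CLOSE proc_cursor;
DEALLOCATE proc_cursor;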
Is it possible to pass an argument to Oil that will allow the new table field to be null?
something like
oil g migration foo bar:string null baz:int
Thanks
Of course it is possible to pass most of the parameters used to create the table when generating the migration. The parameters must be separated by colons, like below:
oil g migration foo bar:string:null baz:int:unsigned
The short answer is "No".
What you are supposed to do is create your migrations using the allowed syntax and then edit the migration files located in app/migrations.
After you have updated the migration file you can run oil refine migrate.
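For illustration, the edited migration for the command in the question might look roughly like this; this assumes FuelPHP's standard DBUtil-based migration layout, and the 'null' => true entries are the manual additions:
<?php

namespace Fuel\Migrations;

class Create_foo
{
    public function up()
    {
        \DBUtil::create_table('foos', array(
            'id'  => array('constraint' => 11, 'type' => 'int', 'auto_increment' => true),
            // 'null' => true added by hand so these columns accept NULL
            'bar' => array('constraint' => 255, 'type' => 'varchar', 'null' => true),
            'baz' => array('constraint' => 11, 'type' => 'int', 'null' => true),
        ), array('id'));
    }

    public function down()
    {
        \DBUtil::drop_table('foos');
    }
}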
I have an XML document file. Part of the file looks like this:
<attr>
  <attrlabl>COUNTY</attrlabl>
  <attrdef>County abbreviation</attrdef>
  <attrtype>Text</attrtype>
  <attwidth>1</attwidth>
  <atnumdec>0</atnumdec>
  <attrdomv>
    <edom>
      <edomv>C</edomv>
      <edomvd>Clackamas County</edomvd>
      <edomvds/>
    </edom>
    <edom>
      <edomv>M</edomv>
      <edomvd>Multnomah County</edomvd>
      <edomvds/>
    </edom>
    <edom>
      <edomv>W</edomv>
      <edomvd>Washington County</edomvd>
      <edomvds/>
    </edom>
  </attrdomv>
</attr>
From this XML file, I want to create a PostgreSQL table with columns of attrlabl, attrdef, attrtype, and attrdomv. I appreciate your suggestions!
While Erwin is right that this can be done with PostgreSQL tools, I would still suggest doing the custom translation yourself, for a few reasons.
The first is determining appropriate XML-to-PostgreSQL type conversions; you probably want to choose these yourself. But this example also highlights a very different problem: what to do with nested data structures. You could, for example, store XML fragments, store text, json, or the like, or create other tables and foreign-key into them.
In general I have almost always found the best approach is to simply manually create the tables. This substitutes human judgement for automated mappings and allows you to create better matches than a computer will.
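For example, a hand-written schema for the sample above might look like the sketch below; the types and the choice to break the nested edom entries out into a child table are judgment calls, not something derived from the XML automatically:
-- One row per <attr> element
CREATE TABLE attr (
    attr_id   serial PRIMARY KEY,
    attrlabl  text NOT NULL,   -- e.g. 'COUNTY'
    attrdef   text,            -- e.g. 'County abbreviation'
    attrtype  text             -- e.g. 'Text'
);
-- The nested <attrdomv>/<edom> values, foreign-keyed to their attribute
CREATE TABLE attrdomv (
    attr_id  integer NOT NULL REFERENCES attr (attr_id),
    edomv    text NOT NULL,    -- e.g. 'C'
    edomvd   text              -- e.g. 'Clackamas County'
);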
First question to SO, I hope I'm doing this right. ;)
Regarding System.Data.Entity.Design.EntityStoreSchemaFilterEntry :
I'm looking for some detailed documentation on this class. The MSDN docs have nothing but an indication of what properties exist and their data types. I want to create a well-defined list of filters for
EntityStoreSchemaGenerator.GenerateStoreMetadata(
IEnumerable<EntityStoreSchemaFilterEntry> filters
)
Specifically:
Do we need to set all Excludes before the Allows so that Allow entries are the only ones that are returned?
What are the consequences of using null in any of the parameters? What about an empty string ""? Comments about this seem to be conflicting and don't match my experience with their usage.
Is the proper "all" wildcard a simple "%"?
My goal is to Exclude all Tables, Views, and Functions, then Allow just the ones that I want. If I try to do this, I get an edmx file with no entities; it seems my Exclude All takes precedence over all of the tables that I tried to include. If I don't try to exclude the tables I don't want, I get the tables I've Allowed plus all the other tables in the database, which sort of renders filtering useless.
For reference, the only info I can find about proper wildcard patterns for filters is here:
http://msdn.microsoft.com/en-us/library/ms710171(VS.85).aspx
Note that I've gone way beyond EdmGen, noted bugs and limitations in EdmGen2, and am now trying to accomplish what I need with a heavily extended EdmGen2 code base.
Thanks!
Related keywords to assist people searching on this topic:
AEF ADO.NET Entity Framework
Tables Views Functions
EntityStoreSchemaFilterObjectTypes EntityStoreSchemaFilterEffect
EntityStoreSchemaGenerator GenerateStoreMetadata
EntityModelSchemaGenerator
SSDL CSDL MSL EDMX
EdmGen EdmGen2
I found the following filters were sufficient to generate the SSDL for a single table.
List<EntityStoreSchemaFilterEntry> filters = new List<EntityStoreSchemaFilterEntry>();
// Just generate for the Document table.
filters.Add(new EntityStoreSchemaFilterEntry(null, "dbo", "TargetTableNameHere", EntityStoreSchemaFilterObjectTypes.Table, EntityStoreSchemaFilterEffect.Allow));
filters.Add(new EntityStoreSchemaFilterEntry(null, "dbo", "%", EntityStoreSchemaFilterObjectTypes.Function, EntityStoreSchemaFilterEffect.Exclude));
// generate the SSDL
string ssdlNamespace = modelName + "Model.Store";
EntityStoreSchemaGenerator essg = new EntityStoreSchemaGenerator(provider, connectionString, ssdlNamespace);
essg.GenerateForeignKeyProperties = includeForeignKeys;
IList<EdmSchemaError> ssdlErrors = essg.GenerateStoreMetadata(filters, version);
I only needed to explicitly exclude the functions.