When I select 'Update Model from Database', none of the system tables (the SYS schema) appear in the list of tables.
How can I add a system table to my EF model?
Sybase (ASA 12) is the database platform I am using.
As a workaround I created a view on the system table. It is then available and can be updated automatically by the EDMX generator.
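For example, a minimal sketch of that workaround (SYS.SYSTAB is one of the SQL Anywhere 12 catalog tables and is only an assumption here; substitute whichever system table you need, and a schema you own for dba):

-- Expose a system table through a view in a user schema so the
-- EF designer can see it (SYS.SYSTAB assumed for illustration)
CREATE VIEW dba.v_systab AS
SELECT * FROM SYS.SYSTAB;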
I created a script that recreates all the catalog views, i.e. sys.*, as views in a user schema:
Note: this is T-SQL with SQL Server object names, but I'm sure you can adapt the concept to Sybase.
-- Generate one CREATE VIEW statement per Microsoft-shipped catalog view
SELECT
    'CREATE VIEW dpc.' + name + ' AS SELECT * FROM sys.' + name
        + char(13) + char(10) + 'GO' + char(13) + char(10)
FROM
    sys.all_objects
WHERE
    type = 'V'                          -- views only
    AND is_ms_shipped = 1               -- shipped with the product
    AND schema_name(schema_id) = 'sys'
ORDER BY
    name
Then I ran the script output by the above query, which copied each sys.x view to a new dpc.x view, and added all the dpc.* views to my EDMX model.
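Each statement in the generated script looks like this (using sys.tables as an example):

CREATE VIEW dpc.tables AS SELECT * FROM sys.tables
GO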
I'm using SSIS 2008 and trying to work on a package for importing a specified file into a table created for its layout. It will take in the destination table & source file as package variables.
The main problem I'm running into is that the file layouts are subject to change; they're not consistent. The table I'd be importing into will match the file, though. I had initial success, but soon after changing the source file/destination it throws the VS_NEEDSNEWMETADATA error.
Are there any known workarounds that could be used here for files that don't fit the layout the package was designed with?
Edit: These are .txt files, tab-delimited.
Edit2: Tried fiddling with OPENROWSET as well, hit a security error on our server.
I am assuming here that said file is a CSV file.
I have just been faced with the exact same problem a couple of weeks ago. You need to use dynamic SQL to achieve this.
Create a stored procedure on your database with the code below (change the 2 "C:\Folder\" locations to the location of your file):
CREATE PROCEDURE [dbo].[CreateAndImportCSVs] (@FILENAME NVARCHAR(200))
AS
BEGIN
    SET NOCOUNT ON;

    -- Full path to the file, and the target table name (file name minus extension)
    DECLARE @PATH NVARCHAR(4000) = N'C:\Folder\' + @FILENAME
    DECLARE @TABLE NVARCHAR(50) = SUBSTRING(@FILENAME, 0, CHARINDEX('.', @FILENAME))

    -- Drop any existing table of the same name, then recreate it from the
    -- file's own structure via the Access Text Driver
    DECLARE @SQL NVARCHAR(4000) = N'IF OBJECT_ID(''dbo.' + @TABLE + ''', ''U'') IS NOT NULL DROP TABLE dbo.[' + @TABLE + ']
    SELECT * INTO [' + @TABLE + ']
    FROM OPENROWSET(''MSDASQL''
        ,''Driver={Microsoft Access Text Driver (*.txt, *.csv)};DefaultDir=C:\Folder;''
        ,''SELECT * FROM ' + @FILENAME + ''')'

    EXEC(@SQL)
END
You might need to download the Microsoft Access Database Engine from:
https://www.microsoft.com/en-gb/download/details.aspx?id=13255
and install it on your machine/server for the Microsoft Access Text Driver to work.
Then create an Execute SQL Task in SSIS with the relevant connection details to your SQL Server database, and pass the file name to the stored procedure you created:
EXEC dbo.CreateAndImportCSVs 'filename.csv'
It will then create the table based on the structure and data contained within the CSV, and it names the table after the CSV file.
This stored procedure can also be used to run through a list of files, as sketched below.
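A sketch of that loop, assuming the file names have been staged in a table (dbo.FileList and its FileName column are hypothetical names):

-- Call the procedure once per staged file name
DECLARE @file NVARCHAR(200);

DECLARE file_cursor CURSOR FOR
    SELECT FileName FROM dbo.FileList;

OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @file;

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC dbo.CreateAndImportCSVs @file;
    FETCH NEXT FROM file_cursor INTO @file;
END

CLOSE file_cursor;
DEALLOCATE file_cursor;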
Hope this helps!
My skills in SQL are limited:
I have a database (SQLBase in this case) that has a couple of LONGVAR columns.
I'm searching for the actual data length in all columns of a particular type.
SELECT tbname,name FROM sysadm.syscolumns where coltype='LONGVAR';
The above statement works. It gives me all tables and the respective column names that have a LONGVAR datatype. Now I would like to take this data and search through all the respective tables (the rows, so the actual data) and find the lengths of the respective LONGVAR columns (to find the maximum, for instance, or values above a certain limit).
I have the idea that it can be solved with a subquery or nested SELECT statement, but I have no idea how to formulate it.
I don't have any real knowledge of SQLBase, so I may be off base here, but if I were trying to do this on SQL Server, a simple approach would be something like the following:
SELECT
tbname,
name,
'SELECT ''' + tbname + ''' AS TableName, ''' + name + ''' AS ColumnName, MAX(LEN(' + name + ')) AS ColumnLength FROM ' + tbname + ' -- add a WHERE clause here if needed' AS Query
FROM sysadm.syscolumns
WHERE coltype='LONGVAR';
This will output a set of values, which you could then copy/paste into a new query editor window and examine before running.
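Each generated Query value looks something like this (table and column names are hypothetical):

SELECT 'DOCS' AS TableName, 'BODY' AS ColumnName, MAX(LEN(BODY)) AS ColumnLength FROM DOCS -- add a WHERE clause here if needed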
Other, more complex solutions would involve dynamic SQL that automatically executes each of these statements; but again, not knowing much about SQLbase, this is where I would start.
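For completeness, a rough sketch of that dynamic approach, in SQL Server syntax applied to the same catalog names as above (untested against SQLBase, which would need its own dialect):

-- Concatenate every per-column length query into one batch and run it
DECLARE @sql NVARCHAR(MAX) = N'';

SELECT @sql = @sql
    + N'SELECT ''' + tbname + N''' AS TableName, ''' + name + N''' AS ColumnName, '
    + N'MAX(LEN(' + name + N')) AS ColumnLength FROM ' + tbname + N'; '
FROM sysadm.syscolumns
WHERE coltype = 'LONGVAR';

EXEC (@sql);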
I always thought a synonym in T-SQL was just a convenient abbreviation. Yet when I do the following ...
create synonym BACKUP_TABLE for T_SHORT_NAMES_BACKUP
go
select *
into BACKUP_TABLE
from T_SHORT_NAMES
... I get the error that there is already an object called BACKUP_TABLE. Am I doing something wrong?
Synonyms are pointers to other database objects, most commonly tables. They are extremely useful depending on what you want to do: you can point them at a table in another database, or at a table on another server (through a linked server). We leverage them a lot in our ETLs.
The process I use to generate mine is a query that builds the CREATE SYNONYM statements dynamically:
SELECT
    'CREATE SYNONYM [dbo].[' + TABLE_NAME + '] FOR [' + 'Put database name here or remove' + '].[dbo].[' + TABLE_NAME + ']'
FROM
    INFORMATION_SCHEMA.TABLES
WHERE
    TABLE_TYPE = 'BASE TABLE'
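Each row of output is a statement like this (names hypothetical):

CREATE SYNONYM [dbo].[Customers] FOR [OtherDatabase].[dbo].[Customers]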
From there, you just SELECT * FROM TABLE_NAME
Now, to circle back to your question: you created a synonym BACKUP_TABLE that points to T_SHORT_NAMES_BACKUP.
Try: SELECT * FROM BACKUP_TABLE
To find out more about your synonyms: SELECT name, base_object_name FROM sys.synonyms
Since select ... into ... always creates a new table object with the given name, an object with that name must not already exist; here the synonym BACKUP_TABLE is itself already an object with that name, hence the error.
Simply use your select ... into ... standalone; there is no need for a synonym here.
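For example, drop the synonym and let select ... into create the backup table directly (a sketch using the names from your question):

drop synonym BACKUP_TABLE
go

select *
into T_SHORT_NAMES_BACKUP
from T_SHORT_NAMES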
But if you want to add additional rows to an existing T_SHORT_NAMES_BACKUP, use insert into ... select ... instead:
insert into
T_SHORT_NAMES_BACKUP
select
*
from
T_SHORT_NAMES
I want to create a T-SQL query that deletes all rows from the table Logins in ALL databases containing that exact table, so it can be run without any errors.
I want to reuse the code for other things as well, e.g. finding all active users in all databases containing the table Users. Therefore I think the best solution would be a pure T-SQL one. That way the query can even become an automated job run by SQL Server Agent.
Is it possible? And how?
Build some dynamic SQL:
declare @sql varchar(max) = ''

-- Concatenate one guarded delete per user database
select @sql = @sql +
    'use [' + name + '] ' +
    'if exists (select * from sys.tables where name = ''Logins'') ' +
    'delete from Logins '
from sys.databases where name not in ('master','model','msdb','tempdb')

print @sql
--exec (@sql)
Uncomment the exec line to actually run the code rather than just see what would be executed.
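The same pattern covers the "active users" case from your question; a sketch, where the Active column is an assumption to adjust to your schema:

declare @sql varchar(max) = ''

select @sql = @sql +
    'use [' + name + '] ' +
    'if exists (select * from sys.tables where name = ''Users'') ' +
    'select * from Users where Active = 1 ' -- hypothetical Active column
from sys.databases where name not in ('master','model','msdb','tempdb')

exec (@sql)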
We've got a system (MS SQL 2008 R2-based) that has a number of "input" databases and one "output" database. I'd like to write a query that will read from the output DB and JOIN it to data in one of the source DBs. However, the source table may be one of several individual tables :( The name of the source DB is included in the output DB; ideally, I'd like to do something like the following (pseudo-SQL ahoy):
select o.[UID]
,o.[description]
,i.[data]
from [output].dbo.[description] as o
left join (select [UID]
,[data]
from
[output.sourcedb].dbo.datatable
) as i
on i.[UID] = o.[UID];
Is there any way to do something like the above - "dynamically" specify the database and table to be joined on for each row in the query?
Try using EXEC, specifying the SELECT as a string and adding variables for database names and tables where appropriate. A simple example:
DECLARE @dbName VARCHAR(255), @tableName VARCHAR(255), @colName VARCHAR(255)
...
EXEC('SELECT * FROM ' + @dbName + '.dbo.' + @tableName + ' WHERE ' + @colName + ' = 1')
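If the names come from outside, a slightly safer sketch of the same idea wraps the identifiers in QUOTENAME and passes the comparison value as a real parameter via sp_executesql:

-- Quote the identifiers; keep the value out of the string
DECLARE @sql NVARCHAR(MAX) =
    N'SELECT * FROM ' + QUOTENAME(@dbName) + N'.dbo.' + QUOTENAME(@tableName) +
    N' WHERE ' + QUOTENAME(@colName) + N' = @val';

EXEC sp_executesql @sql, N'@val INT', @val = 1;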
No, the table must be known at the time you prepare the query. Otherwise, how would the query optimizer know what indexes it might be able to use, or whether the table you reference even has a UID column?
You'll have to do this in stages:
1. Fetch the sourcedb value from your output database in one query.
2. Build an SQL query string, interpolating the value you fetched in the first query into the FROM clause of the second query. Be careful to check that this value contains a legitimate database name: for instance, filter out non-alpha characters, apply a regular expression, or look it up in a whitelist. Otherwise you're exposing yourself to a SQL injection risk.
3. Execute the new SQL string you built with exec() as @user353852 suggests.
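Put together, a minimal sketch of those stages. The sourcedb column and datatable names are taken from the pseudo-SQL above and are assumptions about your schema; the sys.databases lookup stands in for the whitelist check:

DECLARE @srcdb SYSNAME, @sql NVARCHAR(MAX);

-- Stage 1: fetch the source database name from the output DB
SELECT TOP (1) @srcdb = sourcedb
FROM [output].dbo.[description];

-- Whitelist check: only proceed if it names a real database
IF EXISTS (SELECT 1 FROM sys.databases WHERE name = @srcdb)
BEGIN
    -- Stage 2: interpolate it into the FROM clause
    SET @sql = N'SELECT o.[UID], o.[description], i.[data]
                 FROM [output].dbo.[description] AS o
                 LEFT JOIN ' + QUOTENAME(@srcdb) + N'.dbo.datatable AS i
                   ON i.[UID] = o.[UID];';

    -- Stage 3: execute it
    EXEC sp_executesql @sql;
END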