Dapper QueryMultiple Stored Procedures w/o mapping to Objects - tsql

With dapper, I can do batch execute for Stored Procedures, something similar to:
connection.Execute(@"
    exec sp1 @i = @one, @y = @two
    exec sp2 @i = @three",
    new { one = 1, two = 2, three = 3 });
However, the only means of retrieving data that I have seen till now is by using
results.Read<Type>()
What if the results don't map to an object? For instance, I am writing "generic" code to execute any SP with variable in/out parameters & result sets.
Thanks

What API do you want? If you can process the grids separately, do that:
using(var multi = connection.QueryMultiple(...))
{
while(!multi.IsConsumed) {
// ...
}
}
where ... has access to:
Read() for dynamic rows - noting that each row also implements IDictionary<string,object> (see the sketch after this list)
Read<T>() for typed rows via generics
Read(Type) for typed rows without generics
Read<DapperRow>() (actually, this is just the T that Read<T>() uses to implement Read(), but perhaps more convenient), which provides slightly more access to metadata
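For example, here is a minimal sketch (reusing the batch from the question; connection is assumed to be an open SqlConnection) that walks every grid and every column generically via the dynamic rows:
// requires: using System; using System.Collections.Generic; using Dapper;
using (var multi = connection.QueryMultiple(@"
    exec sp1 @i = @one, @y = @two
    exec sp2 @i = @three",
    new { one = 1, two = 2, three = 3 }))
{
    while (!multi.IsConsumed)
    {
        // each dynamic row can be cast to IDictionary<string, object>
        foreach (IDictionary<string, object> row in multi.Read())
        {
            foreach (var cell in row)
            {
                Console.WriteLine($"{cell.Key} = {cell.Value}");
            }
        }
    }
}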
If you want to drop to a raw IDataReader, do that:
using(var reader = connection.ExecuteReader(...)) {
// whatever you want
}
With regards to parameters: the DynamicParameters class provides much richer access to parameter control, including parameter-direction etc.
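As a sketch of DynamicParameters with an output parameter (the procedure name sp1 and the parameter names here are placeholders, not taken from the question):
// requires: using System.Data; using Dapper;
var p = new DynamicParameters();
p.Add("@i", 1);
p.Add("@result", dbType: DbType.Int32, direction: ParameterDirection.Output);

connection.Execute("sp1", p, commandType: CommandType.StoredProcedure);

// read the output value after execution
int result = p.Get<int>("@result");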

Related

Print complete SQL for all queries made by objection.js

I'm looking for a way to capture the raw SQL for all the queries that the Objection.js library executes with the bindings interpolated into the SQL string.
I realize that there's a Knex event handler that I can take advantage of, but the second argument to the on('query', data) callback is an object containing an SQL template with the bindings kept separate.
e.g.
{
sql: "select \"accounts\".* from \"accounts\" where \"id\" = ?",
bindings: [1]
}
I'm wondering if the most elegant way to do this would be to use something like the .toString() method that exists on the QueryBuilder, but I don't think a specific instance of a QueryBuilder is available in the callback. Ideally I wouldn't reinvent the wheel and rewrite Knex's interpolation method.
Any pointers would be greatly appreciated.
Thank you!
You can use the .toKnexQuery() function to pull out the underlying knex query builder and gain access to .toSQL() and .toQuery().
I tested and verified the following example using version 2 of Objection. I couldn't find .toKnexQuery() in the version 1 docs and therefore can't verify it will work with earlier versions of Objection.
// Users.js
const { Model } = require('objection')
class Users extends Model {
static get tableName() { return 'users' }
// Insert jsonSchema, relationMappings, etc. here
}
module.exports = Users
const Users = require('./path/to/Users')
const builder = Users.query()
.findById(1)
.toKnexQuery()
console.log(builder.toQuery())
// "select `users`.* from `users` where `users`.`id` = 1"
console.log(builder.toSQL())
// {
// method: 'select',
// bindings: [ 1 ],
// sql: 'select `users`.* from `users` where `users`.`id` = ?'
// }
It should probably be reiterated that, in addition to .toString(), .toQuery() can also be vulnerable to SQL injection attacks.
A more "responsible" way to modify the query might be something like this (with MySQL):
const { sql, bindings } = Users.query()
.insert({ id: 1 })
.toKnexQuery()
.toSQL()
.toNative()
Users.knex().raw(`${sql} ON DUPLICATE KEY UPDATE foo = ?`, [...bindings, 'bar'])
Knex / Objection.js does not provide any method that securely does the interpolation. .toString() can produce invalid results in some cases, and its output can be vulnerable to SQL injection attacks.
If it is only for debugging purposes, looking at how .toQuery() is implemented helps: https://github.com/knex/knex/blob/e37aeaa31c8ef9c1b07d2e4d3ec6607e557d800d/lib/interface.js#L12
knex.client._formatQuery(sql, bindings, tz)
It is not a public API, though, so it is not guaranteed to stay the same even between patch versions of Knex.

How to enumerate over columns with tokio-postgres when the field types are unknown at compile-time?

I would like a generic function that converts the result of a SQL query to JSON. I would like to build a JSON string manually (or use an external library). For that to happen, I need to be able to enumerate the columns in a row dynamically.
let rows = client
.query("select * from ExampleTable;")
.await?;
// This is how you read a string if you know the first column is a string type.
let thisValue: &str = rows[0].get(0);
Dynamic types are possible in Rust, but not with the tokio-postgres library API.
The row.get function of tokio-postgres is designed to require generic type inference, according to the source code.
Without the right API, how can I enumerate rows and columns?
You need to enumerate the rows and columns. While enumerating, you can get the column reference, and from that the PostgreSQL type. With the type information it's possible to have conditional logic that chooses different sub-functions to both: i) get the strongly typed variable; and ii) convert it to a JSON value.
for (_row_index, row) in rows.iter().enumerate() {
    for (col_index, column) in row.columns().iter().enumerate() {
        let col_type: String = column.type_().to_string();
        let json_value: String = if col_type == "int4" { // i32
            let value: i32 = row.get(col_index);
            value.to_string()
        } else if col_type == "text" {
            let value: &str = row.get(col_index);
            value.to_string() // TODO: escape characters
        }
        // TODO: more type support
        else {
            // TODO: raise a proper error instead of panicking
            panic!("unsupported column type: {}", col_type)
        };
        // TODO: use column.name() and json_value to build the JSON output
    }
}
Bonus tips for tokio-postgres code maintainers
Ideally, tokio-postgres would include a direct API that returns a dyn Any type. The internals of row.rs already use the database column type information to confirm that the supplied generic type is valid. Ideally, a new API would use that internal column information quite directly with an improved FromSql API, but a simpler middle ground exists:
It would be possible to add an extra function layer in row.rs that uses the same column-type conditional logic as this answer and then leverages the existing get function. If a user such as myself needs to handle this kind of conditional logic, I also need to maintain that code whenever new types are handled by tokio-postgres; therefore, this kind of logic should be included inside the library, where such functionality can be better maintained.

How to do a nested WHERE in Zend Framework 1 using Zend_Db_Select with AND and OR operators?

So... I've inherited this rather large project based on Zend Framework 1.12 and a feature that I'm trying to add involves a more complex database operation than what I'm used to doing with the project. I'm extremely new to Zend Framework, or MVC for that matter. I found an answer for Zend Framework 3 which would have been perfect but I have to make do with this version.
The function basically builds a Zend_Db_Select based on various parameters and the feature I'm trying to add will involve joining two different tables and checking if a specific combination exists in one or the other.
Here's what I have so far:
//SQL that I'm trying to do. Assume table1 and table2 are already joined.
//Ignore the imperfect syntax. I'm trying to get the concept across.
//SELECT * FROM (table1 joined to table2 by a common key)
//WHERE ( (table1.column1 = myParam1) AND (table1.column2 = myParam2) )
//OR WHERE ( (table2.column1 = myParam1) AND (table2.column2 = myParam2) )
public function buildSelect($params){
//Zend code starts here
//This one starts the Zend_Db_Select
$select = $this->select();
$table1Name = get_table_name_from_object($table1);
//lots of preexisting code here
//my code starts here.
$table2Name = get_table_name_from_object($table2);
$select->join($table2Name, "$table1Name.key = $table2Name.key", array('column1', 'column2', 'key'));
//After I wrote this, I instantly realized why it won't work the way I intended, but I'm putting it here to show what I tried and where I got stuck.
$select->where("($table1Name.column1 = ?) OR ($table2Name.column1 = ?)", $params[1]);
$select->where("($table1Name.column2 = ?) OR ($table2Name.column2 = ?)", $params[2]);
//more preexisting code below.
return $select;
}
Obviously, if I tried this as is, the program would happily return results that mix the two tables, e.g. an entry where param1 matches table1.column1 and param2 matches table2.column2.
I received some feedback from a friend and posting here for posterity.
They noticed my code already contains parentheses and recommended that I simply take advantage of orWhere() and write it like this:
$select->where("($table1Name.column1 = ?", $params[1])
->where("$table1Name.column2 = ?)", $params[2]);
$select->orWhere("($table2Name.column1 = ?", $params[1])
->where("$table2Name.column2 = ?)", $params[2]);

EF builds EntityCollection, but I (think I) want IQueryable

I have an entity A with a simple navigation property B. For any given instance of A, we expect several thousand related instances of B.
There is no case where I call something like:
foreach(var x in A.B) { ... }
Instead, I'm only interested in doing aggregate operations such as
var statY = A.B.Where(o => o.Property == "Y");
var statZ = A.B.Where(o => o.CreateDate > DateTime.Now.AddDays(-1));
As far as I can tell, EF instantiates thousands of references to B and does these operations in memory. This is because navigation properties use EntityCollection. Instead, I'd like it to perform these queries at the SQL level if possible.
My current hunch is that Navigation Properties may not be the right way to go. I'm not attached to EF, so I am open to other approaches. But I'd be very interested to know the right way to do this under EF if possible.
(I'm using EF4.)
CreateSourceQuery seems to do the trick.
So my examples would now be:
var statY = A.B.CreateSourceQuery().Where(o => o.Property == "Y");
var statZ = A.B.CreateSourceQuery().Where(o => o.CreateDate > DateTime.Now.AddDays(-1));
There's one thing you should know: operators applied to an IQueryable<> are translated and executed on the server, not in memory, while operators applied to an IEnumerable<> are executed in memory.
For example:
var someEntities = db.SomeEntities; // returns an IQueryable<>; no data is fetched yet. SomeEntities may contain thousands of rows, but we are only building a query.
someEntities = someEntities.Where(s => s.Id > 100 && s.Id < 200); // adds the where filter to the expression tree. The query is still not executed and no data is fetched; this also returns an IQueryable<>.
var entities = someEntities.AsEnumerable(); // from this point on, any further LINQ operators run in memory; the query executes and the data is fetched when the sequence is enumerated.
You can fetch the data by enumerating with foreach, or by calling ToArray() or ToList().
Hope you understand what I mean, and sorry for my English :)

ADO.NET Mapping From SQLDataReader to Domain Object?

I have a very simple mapping function called "BuildEntity" that does the usual boring "left/right" coding required to dump my reader data into my domain object (shown below). My question is this: if I don't bring back every column in this mapping as is, I get a System.IndexOutOfRangeException, and I wanted to know if ADO.NET had anything to correct this so I don't need to bring back every column with each call into SQL ...
What I'm really looking for is something like "IsValidColumn" so I can keep this 1 mapping function throughout my DataAccess class with all the left/right mappings defined - and have it work even when a sproc doesn't return every column listed ...
Using reader As SqlDataReader = cmd.ExecuteReader()
Dim product As Product
While reader.Read()
product = New Product()
product.ID = Convert.ToInt32(reader("ProductID"))
product.SupplierID = Convert.ToInt32(reader("SupplierID"))
product.CategoryID = Convert.ToInt32(reader("CategoryID"))
product.ProductName = Convert.ToString(reader("ProductName"))
product.QuantityPerUnit = Convert.ToString(reader("QuantityPerUnit"))
product.UnitPrice = Convert.ToDouble(reader("UnitPrice"))
product.UnitsInStock = Convert.ToInt32(reader("UnitsInStock"))
product.UnitsOnOrder = Convert.ToInt32(reader("UnitsOnOrder"))
product.ReorderLevel = Convert.ToInt32(reader("ReorderLevel"))
productList.Add(product)
End While
Also check out this extension method I wrote for use on data commands:
public static void Fill<T>(this IDbCommand cmd,
IList<T> list, Func<IDataReader, T> rowConverter)
{
using (var rdr = cmd.ExecuteReader())
{
while (rdr.Read())
{
list.Add(rowConverter(rdr));
}
}
}
You can use it like this:
cmd.Fill(products, r => r.GetProduct());
Where "products" is the IList<Product> you want to populate, and "GetProduct" contains the logic to create a Product instance from a data reader. It won't help with this specific problem of not having all the fields present, but if you're doing a lot of old-fashioned ADO.NET like this it can be quite handy.
Although connection.GetSchema("Tables") does return metadata about the tables in your database, it won't return everything in your sproc if you define any custom columns.
For example, if you throw in some random ad-hoc column like "SELECT ProductName, 'Testing' AS ProductTestName FROM dbo.Products", you won't see 'ProductTestName' as a column because it's not in the schema of the Products table. To solve this, and ask for every column available in the returned data, leverage a method on the SqlDataReader object: GetSchemaTable().
If I add this to the existing code sample you listed in your original question, you will notice that just after the reader is declared I add a data table to capture the metadata from the reader itself. Next I loop through this metadata and add each column to another table that I use in the left/right code to check whether each column exists.
Updated Source Code
Using reader As SqlDataReader = cmd.ExecuteReader()
Dim table As DataTable = reader.GetSchemaTable()
Dim colNames As New DataTable()
For Each row As DataRow In table.Rows
colNames.Columns.Add(row.ItemArray(0).ToString())
Next
Dim product As Product
While reader.Read()
product = New Product()
If Not colNames.Columns("ProductID") Is Nothing Then
product.ID = Convert.ToInt32(reader("ProductID"))
End If
product.SupplierID = Convert.ToInt32(reader("SupplierID"))
product.CategoryID = Convert.ToInt32(reader("CategoryID"))
product.ProductName = Convert.ToString(reader("ProductName"))
product.QuantityPerUnit = Convert.ToString(reader("QuantityPerUnit"))
product.UnitPrice = Convert.ToDouble(reader("UnitPrice"))
product.UnitsInStock = Convert.ToInt32(reader("UnitsInStock"))
product.UnitsOnOrder = Convert.ToInt32(reader("UnitsOnOrder"))
product.ReorderLevel = Convert.ToInt32(reader("ReorderLevel"))
productList.Add(product)
End While
This is a hack to be honest, as you should return every column to hydrate your object correctly. But I thought to include this reader method as it would actually grab all the columns, even if they are not defined in your table schema.
This approach to mapping your relational data into your domain model might cause some issues when you get into a lazy loading scenario.
Why not just have each sproc return the complete column set, using NULL, -1, or other acceptable values where you don't have the data? That avoids having to catch IndexOutOfRangeException or rewrite everything in LINQ to SQL.
Use the GetSchemaTable() method to retrieve the metadata of the DataReader. The DataTable that is returned can be used to check if a specific column is present or not.
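As a rough sketch of that approach (in C#, with cmd assumed to be an existing SqlCommand, not code from the question), the ColumnName values in the schema table can be collected into a set and consulted before each read:
// requires: using System; using System.Collections.Generic; using System.Data; using System.Linq;
using (var reader = cmd.ExecuteReader())
{
    // schema table has one row per column in the result set
    var schema = reader.GetSchemaTable();
    var columnNames = new HashSet<string>(
        schema.Rows.Cast<DataRow>().Select(r => (string)r["ColumnName"]),
        StringComparer.OrdinalIgnoreCase);

    while (reader.Read())
    {
        if (columnNames.Contains("ProductID"))
        {
            // safe to read reader["ProductID"] here
        }
    }
}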
Why don't you use LINQ to SQL? Everything you need is done automatically. For the sake of being general, you can use any other ORM tool for .NET.
If you don't want to use an ORM you can also use reflection for things like this (though in this case because ProductID is not named the same on both sides, you couldn't do it in the simplistic fashion demonstrated here):
List Provider in C#
I would call reader.GetOrdinal for each field name before starting the while loop. Unfortunately, GetOrdinal throws an IndexOutOfRangeException if the field doesn't exist, so relying on catching that exception won't be very performant.
You could probably store the results in a Dictionary<string, int> and use its ContainsKey method to determine if the field was supplied.
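A minimal sketch of that idea (the helper name GetColumnMap is illustrative, not an existing API): build the name-to-ordinal map once per reader from reader.GetName instead of probing with GetOrdinal:
// requires: using System; using System.Collections.Generic; using System.Data;
// Hypothetical helper: maps each column name to its ordinal once per reader,
// so optional columns can be tested without catching exceptions.
static Dictionary<string, int> GetColumnMap(IDataReader reader)
{
    var map = new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase);
    for (int i = 0; i < reader.FieldCount; i++)
    {
        map[reader.GetName(i)] = i;
    }
    return map;
}

// inside the mapping loop:
// var columns = GetColumnMap(reader);
// if (columns.TryGetValue("ProductID", out int ordinal))
//     product.ID = Convert.ToInt32(reader.GetValue(ordinal));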
I ended up writing my own, but this mapper is pretty good (and simple): https://code.google.com/p/dapper-dot-net/