So I'm using Microsoft's Microsoft.Azure.DocumentDB.Core API to connect to my Cosmos DB from my .NET Core app.
Everything is working fine in that I can create, edit, and get documents, as well as create a new database or collection. If I create a database or collection using the DocumentDB.Core API, I can see them in my Azure portal. However, when I create documents, I cannot see them. Whenever I try to load my documents, I get this error a number of times equal to the number of documents I have:
Error while fetching page of documents {"code":400,"body":"Command find failed: Unknown server error occurred when processing this request.."}
I know I have existing records, though, because if I save the Id I get back on creation, I can then look it up using the DocumentDB.Core API.
Here is my model that I am passing:
public class Api
{
    public Api()
    {
        Client = new ExpandoObject();
    }

    [JsonProperty(PropertyName = "id")]
    public string Id { get; set; }

    public ExpandoObject Client { get; set; }
}
Now, before I switched to DocumentDB.Core, I was using MongoDB.Driver, and this was my model:
public class Api
{
    public Api()
    {
        Client = new ExpandoObject();
    }

    [BsonId]
    [BsonRepresentation(BsonType.ObjectId)]
    public string Id { get; set; }

    [BsonElement("client")]
    public ExpandoObject Client { get; set; }
}
Using MongoDB.Driver and the above model, I was able to see my data in the Azure portal.
Is there a reason why I cannot see data in the Azure Cosmos DB portal using Microsoft's own tool for it? Am I missing a property? The only thing I notice that is different is that using MongoDB.Driver, my id was _id in my created document, and using DocumentDB.Core my id is id in my created document. I'm not sure if that matters, though, or how to address it.
Here is a similar issue you can take a look at:
Importing Json using DocumentDB Data Migration tool gives "Error while fetching page of documents code: 400" in CosmosDB
To understand further, below are the common reasons for error code 400:
- The JSON, SQL, or JavaScript in the request body is invalid.
- A 400 can also be returned when the required properties of a resource are not present or set in the body of the POST or PUT on the resource.
- A 400 is also returned when the consistency level for a GET operation is overridden by a stronger consistency than the one set for the account.
- A 400 is also returned when a request that requires an x-ms-documentdb-partitionkey header does not include it.
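On that last point, the SDK lets you supply the partition key explicitly through RequestOptions. A rough sketch, where the account URI, key, database, and collection names are all placeholders of mine, not values from the question:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static class PartitionKeySketch
{
    // Sketch only: every name below is a placeholder for your own resources.
    public static async Task CreateWithPartitionKeyAsync()
    {
        var client = new DocumentClient(
            new Uri("https://<your-account>.documents.azure.com:443/"), "<your-auth-key>");

        var doc = new { id = Guid.NewGuid().ToString() };

        await client.CreateDocumentAsync(
            UriFactory.CreateDocumentCollectionUri("<your-db>", "<your-collection>"),
            doc,
            // Explicitly passing the partition key value avoids a 400 on partitioned collections.
            new RequestOptions { PartitionKey = new PartitionKey(doc.id) });
    }
}
```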
Hope it helps.
Related
We have a web application which connects to an Azure SQL DB. I have configured my application with an Application ID and certificate. We would like to make use of the access token approach for connecting to SQL Server. As per the link below, connecting to SQL Server through the token approach is not a reliable approach. Is there a recommended way of connecting instead of using a User ID and password?
Connect to Azure SQL using Azure Active Directory from an Azure Website?
Can anyone let me know if they have implemented SQL Azure DB AAD token-based authentication using Entity Framework, and whether it is the right way of connecting?
According to your description, I followed the tutorial about using AAD Authentication for Azure SQL Database.
As this tutorial mentioned about Azure AD token authentication:
This authentication method allows middle-tier services to connect to Azure SQL Database or Azure SQL Data Warehouse by obtaining a token from Azure Active Directory (AAD). It enables sophisticated scenarios including certificate-based authentication. You must complete four basic steps to use Azure AD token authentication:
Register your application with Azure Active Directory and get the client id for your code.
Create a database user representing the application. (Completed earlier in step 6.)
Create a certificate on the client computer that runs the application.
Add the certificate as a key for your application.
Then I followed the code sample in this blog for getting started with this feature, and it works as expected.
Can anyone let me know if they have implemented SQL Azure DB AAD token-based authentication using Entity Framework, and whether it is the right way of connecting?
Based on the above code sample, I added EntityFramework 6.1.3 for implementing SQL Azure DB AAD token based authentication using entity framework. After some trials, I could make it work as expected. Here are some details, you could refer to them.
DbContext
public class BruceDbContext : DbContext
{
    public BruceDbContext()
        : base("name=defaultConnectionString")
    { }

    public BruceDbContext(SqlConnection con) : base(con, true)
    {
        Database.SetInitializer<BruceDbContext>(null);
    }

    public virtual DbSet<User> Users { get; set; }
}
DataModel
[Table("Users")]
public class User
{
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public long Id { get; set; }

    [StringLength(50)]
    public string UserName { get; set; }

    public DateTime CreateTime { get; set; }
}
Program.cs
class Program
{
    static void Main()
    {
        SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder();
        builder["Data Source"] = "brucesqlserver.database.windows.net";
        builder["Initial Catalog"] = "brucedb";
        builder["Connect Timeout"] = 30;

        string accessToken = TokenFactory.GetAccessToken();
        if (accessToken == null)
        {
            Console.WriteLine("Failed to acquire the token to the database.");
        }

        using (SqlConnection connection = new SqlConnection(builder.ConnectionString))
        {
            try
            {
                connection.AccessToken = accessToken;
                // Working with EF
                using (var model = new BruceDbContext(connection))
                {
                    var users = model.Users.ToList();
                    Console.WriteLine($"Results:{Environment.NewLine}{JsonConvert.SerializeObject(users)}");
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }

        Console.WriteLine("Please press any key to stop");
        Console.ReadKey();
    }
}
Result
Note: The contained database user for your application principal created via CREATE USER [mytokentest] FROM EXTERNAL PROVIDER does not have any permissions to access your database. You need to grant privileges to this user; for more details you could refer to this issue.
Additionally, when you construct the DbContext instance, you need to supply the SqlConnection instance with a valid AccessToken. AFAIK, you need to handle refreshing the token when it expires.
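The TokenFactory.GetAccessToken() helper used in Program.cs comes from the linked blog and isn't shown above. A minimal sketch of the certificate-based ADAL flow it wraps might look like this; the tenant ID, client ID, and certificate thumbprint are placeholders of mine:

```csharp
using System.Linq;
using System.Security.Cryptography.X509Certificates;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

public static class TokenFactory
{
    // All identifiers below are placeholders for your own AAD registration.
    private const string Authority = "https://login.microsoftonline.com/<your-tenant-id>";
    private const string Resource = "https://database.windows.net/";
    private const string ClientId = "<your-application-client-id>";
    private const string CertThumbprint = "<your-certificate-thumbprint>";

    public static string GetAccessToken()
    {
        // Load the certificate that was registered as a key for the application.
        using (var store = new X509Store(StoreName.My, StoreLocation.CurrentUser))
        {
            store.Open(OpenFlags.ReadOnly);
            var cert = store.Certificates
                .Find(X509FindType.FindByThumbprint, CertThumbprint, validOnly: false)
                .OfType<X509Certificate2>()
                .FirstOrDefault();
            if (cert == null) return null;

            var authContext = new AuthenticationContext(Authority);
            var assertion = new ClientAssertionCertificate(ClientId, cert);
            var result = authContext.AcquireTokenAsync(Resource, assertion).Result;
            return result?.AccessToken;
        }
    }
}
```

Note that this sketch does not handle the token-refresh concern mentioned above; in a real app you would check the token's ExpiresOn and re-acquire as needed.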
I'm hoping this is a simple question. I've created an Azure Mobile Apps project based upon the sample ToDo project, adding my own tables/data objects. The problem I'm having is adding/POSTing records to a table that has a foreign key relationship to another. The following is my Employee table data object:
public class Employee : EntityData
{
public string Name { get; set; }
public string EmailAddress { get; set; }
public bool IsActive { get; set; }
public string EmployeeTypeId { get; set; }
public virtual EmployeeType EmployeeType { get; set; }
}
...and this is my EmployeeType data object:
public class EmployeeType : EntityData
{
public string EmpType { get; set; }
public bool IsActive { get; set; }
}
The virtual EmployeeType property in the Employee class was necessary, I believe, to create the relationship with the EmployeeType table when using EF Code First to create the tables in the database. (At least, that's what I understand, and it worked) I am able to insert records from my Xamarin client app into the EmployeeType table using the InsertAsync method, but I receive a "Bad Request" 400 error when trying to insert into the Employee table.
I've looked around quite a bit for solutions, but everything refers to Azure Mobile Services and not Apps. If need be, I can update this question with my client side model classes (I'm on my PC now and don't have access to the Xamarin Studio project on my Mac). For reference, these classes are pretty much the same as the data objects - just each property is decorated with the JsonProperty attribute, except the virtual property outlined in the service. And for completeness, I did try adding that property to the client object and it still threw the "Bad Request" 400 error.
Thanks for any direction you can offer me.
Most likely, the problem is happening when trying to map the foreign key. Are you specifying all of the fields for employee type? I recommend that you do the following:
Use Fiddler or attach a delegating handler to your client to see what the outgoing request looks like. Update your comment with the JSON body. See https://github.com/Azure/azure-mobile-apps/wiki/Help,-my-app-isn't-working!#log-outgoing-requests-in-managed-client-xamarin-windows.
Attach a debugger to your server project. You can do this while running locally or after your solution is deployed to Azure, but you'll have better performance if you run locally. See https://github.com/Azure/azure-mobile-apps/wiki/Help,-my-app-isn't-working!#remote-debugging-net-server-sdk.
I suspect that the problem is that EmployeeType ends up being null in your deserialized object, and then Entity Framework rejects the DB insert.
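For the first suggestion, a hedged sketch of the kind of logging DelegatingHandler you could attach to the managed client (the service URL is hypothetical):

```csharp
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Sketch only: logs each outgoing request/response so you can inspect the JSON body.
public class LoggingHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        Console.WriteLine($"[Request] {request.Method} {request.RequestUri}");
        if (request.Content != null)
            Console.WriteLine(await request.Content.ReadAsStringAsync());

        var response = await base.SendAsync(request, cancellationToken);

        Console.WriteLine($"[Response] {(int)response.StatusCode} {response.ReasonPhrase}");
        if (response.Content != null)
            Console.WriteLine(await response.Content.ReadAsStringAsync());

        return response;
    }
}

// Usage (hypothetical site URL):
// var client = new MobileServiceClient("https://yoursite.azurewebsites.net", new LoggingHandler());
```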
Could you get more information from the bad request? Try adding this to the table controller
protected override void Initialize(HttpControllerContext controllerContext)
{
    controllerContext.Configuration.IncludeErrorDetailPolicy = IncludeErrorDetailPolicy.Always;
}
I'm currently developing a side-project, which will consist of a Database, Web API, and then different Apps on top which consume the restful API.
I've started thinking about User accounts and how to make these secure. Currently, as a standard, I have the following model in the Data Layer:
public int Id { get; set; }
public string Username { get; set; }
public string Full_Name { get; set; }
public string Password { get; internal set; }
public string Salt { get; internal set; }
Now, obviously, when someone makes a request for /Users/{id}, the User associated with that Id is returned. However, I don't want to return the Password or Salt, so really don't want those to be part of the User model.
I have toyed with the idea of creating a different, internal-only model for UserDetails, and shipping the Password/Salt, etc off into that. However, I hit the snag of, when signing up to the service, how do I get a desired password from the user to the API?
There's probably a really simple implementation of what I want to do, but I can't think of one right now. Any help would be greatly appreciated!
You should use a special view model without the password property for this purpose. Then inside your API you will map between your data model and the view model and return the view model out from your Web API method. Same stands true for the Salt property as well.
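For example, a minimal sketch; the view model and mapping names here are my own, and User refers to the data-layer model from the question:

```csharp
// Public-facing view model: no Password or Salt, safe to return from /Users/{id}.
public class UserViewModel
{
    public int Id { get; set; }
    public string Username { get; set; }
    public string Full_Name { get; set; }
}

// Separate inbound model for sign-up: carries the plain-text password once,
// which the API hashes with the salt and never returns.
public class CreateUserModel
{
    public string Username { get; set; }
    public string Full_Name { get; set; }
    public string Password { get; set; }
}

public static class UserMappings
{
    // Map the data model to the outward-facing shape before returning it.
    public static UserViewModel ToViewModel(User user)
    {
        return new UserViewModel
        {
            Id = user.Id,
            Username = user.Username,
            Full_Name = user.Full_Name
        };
    }
}
```

This also answers the sign-up snag: the client POSTs a CreateUserModel, the API derives Password/Salt internally, and only UserViewModel ever goes back out.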
I have an application that I developed standalone and now am trying to integrate into a much larger model. Currently, on the server side, there are 11 tables and an average of three navigation properties per table. This is working well and stable.
The larger model has 55 entities and 180+ relationships and includes most of my model (less the relationships to tables in the larger model). Once integrated, a very strange thing happens: the server sends the same data, the same number of entities are returned, but the exportEntities function returns a string of about 150KB (rather than the 1.48 MB it was returning before) and all queries show a tenth of the data they were showing before.
I followed the troubleshooting information on the Breeze website. I looked through the Breeze metadata and the entities and relationships seem defined correctly. I looked at the data that was returned and 9 out of ten entities did not appear as an object, but as a function: function (){return e.refMap[t]} which, when I expand it, has an 'arguments' property: Exception: TypeError: 'caller', 'callee', and 'arguments' properties may not be accessed on strict mode functions or the arguments objects for calls to them.
For reference, here are the two entities involved in the breaking change.
The Repayments Entity
public class Repayment
{
    [Key, Column(Order = 0)]
    public int DistrictId { get; set; }

    [Key, Column(Order = 1)]
    public int RepaymentId { get; set; }

    public int ClientId { get; set; }
    public int SeasonId { get; set; }
    ...

    #region Navigation Properties
    [InverseProperty("Repayments")]
    [ForeignKey("DistrictId")]
    public virtual District District { get; set; }

    // The three lines below are the lines I added to break the results
    // If I remove them again, the results are correct again
    [InverseProperty("Repayments")]
    [ForeignKey("DistrictId,ClientId")]
    public virtual Client Client { get; set; }

    [InverseProperty("Repayments")]
    [ForeignKey("DistrictId,SeasonId,ClientId")]
    public virtual SeasonClient SeasonClient { get; set; }
    #endregion
}
The Client Entity
public class Client : IClient
{
    [Key, Column(Order = 0)]
    public int DistrictId { get; set; }

    [Key, Column(Order = 1)]
    public int ClientId { get; set; }
    ....

    // These lines were in the original (working) model
    [InverseProperty("Client")]
    public virtual ICollection<Repayment> Repayments { get; set; }
    ....
}
The relationship that I restored was simply the inverse of a relationship that was already there, which is one of the really weird things about it. I'm sure I'm doing something terribly wrong, but I'm not even sure at this point what information might be helpful in debugging this.
For defining foreign keys and inverse properties, I assume I must use either data annotations or the FluentAPI even if the tables follow all the EF conventions. Is either one better than the other? Is it necessary to consistently choose one approach and stay with it? Does the error above provide any insight as to what I might be doing wrong? Is there any other information I could post that might be helpful?
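For reference, my understanding of the Fluent API equivalent of the annotations above (a sketch, untested against the larger model) would be something like:

```csharp
// EF6 Fluent API mapping of the composite relationships shown above,
// placed in the DbContext's OnModelCreating override.
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<Repayment>()
        .HasRequired(r => r.District)
        .WithMany(d => d.Repayments)
        .HasForeignKey(r => r.DistrictId);

    // Composite foreign key matching [ForeignKey("DistrictId,ClientId")]
    modelBuilder.Entity<Repayment>()
        .HasRequired(r => r.Client)
        .WithMany(c => c.Repayments)
        .HasForeignKey(r => new { r.DistrictId, r.ClientId });
}
```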
Breeze is an excellent framework and has the potential to really increase our reach providing assistance to small farmers in rural East Africa, and I'd love to get this prototype working.
Thanks
Ok, some of what you are describing can be explained by breeze's default behavior of compressing the payload of any query results that return multiple instances of the same entity. If you are using something like the default 'json.net' assembly for serialization, then each entity is sent with an extra '$id' property and if the same entity is seen again it gets serialized via a simple '$ref' property with the value of the previously mentioned '$id'.
On the breeze client during deserialization these '$refs' get resolved back into full entities. However, because the order in which deserialization is performed may not be the same as the order that serialization might have been performed, breeze internally creates deferred closure functions ( with no arguments) that allow for the deferred resolution of the compressed results regardless of the order of serialization. This is the
function (){return e.refMap[t]}
that you are seeing.
If you are seeing this value as part of the actual top level query result, then we have a bug, but if you are seeing this value while debugging the results returned from your server, before they have been returned to the calling function, then this is completely expected ( especially if you are viewing the contents of the closure before it should be executed.)
So, a couple of questions and suggestions:
Are you actually seeing an error processing the result of your query, or are you simply surprised that the results are so small? If it's just a size issue, check and see if you can identify data that should have been sent to the client and is missing. It is possible that the reference compression is simply very effective in your case.
Take a look at the 'raw' data returned from your web service. It should look something like this, with '$id' and '$ref' properties:
[{
    '$id': '1',
    'Name': 'James',
    'BirthDate': '1983-03-08T00:00Z'
},
{
    '$ref': '1'
}]
If so, then look at the data and make sure that an '$id' exists that corresponds to each of your '$refs'. If not, something is wrong with your server-side serialization code. If the data does not look like this, then please post back with a small example of what the 'raw' data does look like.
After looking at your Gist, I think I see the issue. Your metadata is out of sync with the actual results returned by your query. In particular, if you look for the '$id' value of "17" in your actual results, you'll notice that it is first found in the 'Client' property of the 'Repayment' type, but your metadata doesn't have a 'Client' navigation property defined for the 'Repayment' type (there is a 'ClientId'). My guess is that you are reusing an 'older' version of your metadata.
The reason that this results in incomplete results is that once breeze determines that it is deserializing an 'entity' ( i.e. a json object that has $type property that maps to an actual entityType), it only attempts to deserialize the 'known' properties of this type, i.e. those found in the metadata. In your case, the 'Client' navigation property on the 'Repayment' type was never being deserialized, and any refs to the '$id' defined there are therefore not available.
Say you have the following Contact DTO. Address/PhoneNumber/EmailAddress/WebSiteAddress classes are simple DTOs as well (just data no behavior)
public class Contact
{
    public Address[] Addresses { get; set; }
    public PhoneNumber[] PhoneNumbers { get; set; }
    public EmailAddress[] EmailAddresses { get; set; }
    public WebSiteAddress[] WebSiteAddresses { get; set; }
}
How should I model DTOs to allow implementing the following behavior?
The client can submit a request that will:
- add a phone number, update two phone numbers, and delete two
- add two email addresses, update one email address, and delete three
- add three website addresses, update two website addresses, and delete two
You get the idea.
One option is to add an Action attribute to each Address / PhoneNumber / EmailAddress / WebSiteAddress.
Then the code to update addresses would look like this:
var addressesToUpdate = serviceContact.Addresses.Where(x => x.AddressAction.ToUpper() == "UPDATE");
var addressesToAdd = serviceContact.Addresses.Where(x => x.AddressAction.ToUpper() == "ADD");
var addressesToDelete = serviceContact.Addresses.Where(x => x.AddressAction.ToUpper() == "DELETE").Select(x => x.AddressId);
Repeating this for all other lists will probably create duplication.
My question is:
How should I model service DTOs with updatable lists while avoiding duplication?
Generally I'll try to keep my writes idempotent, which means calling the service should have the same side-effect (i.e. end result) whether you start with no records or all records (i.e. Store or Update).
Basically this means that the client sends the complete state, i.e.:
- entries that don't exist => get created,
- entities that already exist => get updated,
- whilst entities that aren't in the request DTO => get deleted.
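Under that approach the server diffs the submitted collection against what is stored, and one generic helper covers phone numbers, email addresses, and website addresses alike, which is what removes the duplication. A sketch, with a key-selector delegate of my own devising:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class ListReconciler
{
    // Given the full desired state from the client and the currently stored rows,
    // work out what to insert, update, and delete. Works for any DTO type with an id.
    public static (List<T> ToAdd, List<T> ToUpdate, List<int> ToDelete)
        Reconcile<T>(IEnumerable<T> desired, IEnumerable<T> existing, Func<T, int> key)
    {
        var desiredList = desired.ToList();
        var existingIds = new HashSet<int>(existing.Select(key));
        var desiredIds = new HashSet<int>(desiredList.Select(key));

        var toAdd = desiredList.Where(x => !existingIds.Contains(key(x))).ToList();
        var toUpdate = desiredList.Where(x => existingIds.Contains(key(x))).ToList();
        var toDelete = existingIds.Where(id => !desiredIds.Contains(id)).ToList();
        return (toAdd, toUpdate, toDelete);
    }
}

// Usage, assuming each DTO exposes an int id property:
// var (add, update, delete) =
//     ListReconciler.Reconcile(request.PhoneNumbers, storedPhoneNumbers, p => p.PhoneNumberId);
```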
OrmLite's db.Save() command has nice support for this where it detects if a record(s) already exist and will issue an UPDATE otherwise will INSERT.
You can use ETags with conditional requests instead of providing the complete state. Use the ETag as a version of the list and change it each time the list changes. On the client side, send the ETag with the update request in the If-Match HTTP header, and be prepared to receive a 412 Precondition Failed status if the list changed while the request was in flight.
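A conditional update is conventionally sent with the If-Match header and rejected with 412 Precondition Failed when the version no longer matches. A hedged client-side sketch, with a hypothetical endpoint:

```csharp
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class ConditionalUpdateSketch
{
    // Sketch only: the endpoint and payload shape are hypothetical.
    public static async Task UpdatePhoneNumbersAsync(HttpClient http, string etag, string json)
    {
        var request = new HttpRequestMessage(HttpMethod.Put, "/contacts/1/phonenumbers")
        {
            Content = new StringContent(json, Encoding.UTF8, "application/json")
        };
        // Only apply the update if the server-side list still matches this version.
        request.Headers.IfMatch.Add(new EntityTagHeaderValue(etag));

        var response = await http.SendAsync(request);
        if (response.StatusCode == HttpStatusCode.PreconditionFailed)
        {
            // The list changed since we fetched it: re-GET, merge, and retry.
        }
    }
}
```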