I'm refactoring a production database and need to do some renaming. The MongoDB version is 1.8.0, and I'm using the C# driver for the refactoring. I've run into a problem when trying to rename a field of a complex type that is located inside an array.
For example, I have a document like this:
FoobarCollection:
{
Field1: "",
Field2: [
{ NestedField1: "", NestedField2: "" },
{ NestedField1: "", NestedField2: "" },
...
]
}
I need to rename NestedField2 to NestedField3, for example.
The MongoDB documentation says:
$rename
Version 1.7.2+ only.
{ $rename : { old_field_name : new_field_name } }
Renames the field with name 'old_field_name' to 'new_field_name'. Does not expand arrays to find a match for 'old_field_name'.
As I understand it, simply using Update.Rename() won't do the job because, as the documentation says, $rename does not expand arrays to find a match for the old field name.
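For illustration, here is roughly what that naive attempt would look like (a sketch against the legacy 1.x Builders API; the surrounding variables are assumptions):

// Sketch: a plain $rename via the legacy Builders API. Per the docs
// above it does not expand arrays, so it will NOT reach NestedField2
// inside the Field2 array - it only works for non-array paths.
// `database` is assumed to be your MongoDatabase instance.
var collection = database.GetCollection("FoobarCollection");
collection.Update(
    Query.Null, // match all documents
    Update.Rename("Field2.NestedField2", "Field2.NestedField3"),
    UpdateFlags.Multi);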
What C# code should I write to rename NestedField2 to NestedField3?
I have implemented a special type for renaming arbitrary fields in MongoDB. Here it is:
using System.Linq;
using MongoDB.Bson;
using MongoDB.Driver;
namespace DatabaseManagementTools
{
public class MongoDbRefactorer
{
protected MongoDatabase MongoDatabase { get; set; }
public MongoDbRefactorer(MongoDatabase mongoDatabase)
{
MongoDatabase = mongoDatabase;
}
/// <summary>
/// Renames field
/// </summary>
/// <param name="collectionName"></param>
/// <param name="oldFieldNamePath">Supports nested types, even in array. Separate nest level with '$': "FooField1$FooFieldNested$FooFieldNestedNested"</param>
/// <param name="newFieldName">Specify only field name without path to it: "NewFieldName", but not "FooField1$NewFieldName"</param>
public void RenameField(string collectionName, string oldFieldNamePath, string newFieldName)
{
MongoCollection<BsonDocument> mongoCollection = MongoDatabase.GetCollection(collectionName);
MongoCursor<BsonDocument> collectionCursor = mongoCollection.FindAll();
PathSegments pathSegments = new PathSegments(oldFieldNamePath);
// Rename field in each document of collection
foreach (BsonDocument document in collectionCursor)
{
int currentSegmentIndex = 0;
RenameField(document, pathSegments, currentSegmentIndex, newFieldName);
// Now document is modified in memory - replace old document with new in mongo:
mongoCollection.Save(document);
}
}
private void RenameField(BsonValue bsonValue, PathSegments pathSegments, int currentSegmentIndex, string newFieldName)
{
string currentSegmentName = pathSegments[currentSegmentIndex];
if (bsonValue.IsBsonArray)
{
var array = bsonValue.AsBsonArray;
foreach (var arrayElement in array)
{
RenameField(arrayElement.AsBsonDocument, pathSegments, currentSegmentIndex, newFieldName);
}
return;
}
bool isLastNameSegment = pathSegments.Count() == currentSegmentIndex + 1;
if (isLastNameSegment)
{
RenameDirect(bsonValue, currentSegmentName, newFieldName);
return;
}
var innerDocument = bsonValue.AsBsonDocument[currentSegmentName];
RenameField(innerDocument, pathSegments, currentSegmentIndex + 1, newFieldName);
}
private void RenameDirect(BsonValue document, string from, string to)
{
BsonElement bsonValue;
bool elementFound = document.AsBsonDocument.TryGetElement(from, out bsonValue);
if (elementFound)
{
document.AsBsonDocument.Add(to, bsonValue.Value);
document.AsBsonDocument.Remove(from);
}
else
{
// todo: log missing elements
}
}
}
}
And a helper type to hold the path segments:
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
namespace DatabaseManagementTools
{
public class PathSegments : IEnumerable<string>
{
private List<string> Segments { get; set; }
/// <summary>
/// Split segment levels with '$'. For example: "School$CustomCodes"
/// </summary>
/// <param name="pathToParse"></param>
public PathSegments(string pathToParse)
{
Segments = ParseSegments(pathToParse);
}
private static List<string> ParseSegments(string oldFieldNamePath)
{
string[] pathSegments = oldFieldNamePath.Trim(new []{'$', ' '})
.Split(new [] {'$'}, StringSplitOptions.RemoveEmptyEntries);
return pathSegments.ToList();
}
public IEnumerator<string> GetEnumerator()
{
return Segments.GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
public string this[int index]
{
get { return Segments[index]; }
}
}
}
To separate nesting levels I use the '$' sign - the only character that is forbidden in Mongo collection names.
Usage looks something like this:
MongoDbRefactorer mongoDbRefactorer = new MongoDbRefactorer(Mongo.Database);
mongoDbRefactorer.RenameField("schools", "FoobarTypesCustom$FoobarDefaultName", "FoobarName");
This code finds the FoobarTypesCustom property in the schools collection (it can be either a complex type or an array), then finds all FoobarDefaultName properties (if FoobarTypesCustom is an array, it iterates through it) and renames them to FoobarName. The nesting level and the number of nested arrays don't matter.
I'm reverse engineering a file format that stores each field as TLV blocks (type, length, value).
The fields do not have to be in order, or even present at all. Their presence is denoted by a sentinel, which is a 16-bit type identifier and a 32-bit end offset. There are hundreds of unique identifiers, but a decent chunk of those are just single primitive values. Aside from denoting the type, they also identify which field the data should be stored in.
It is also worth noting that there will never be a duplicate id on a parent structure. The only time it can occur is if there are multiple objects of the same type in an array/list.
I have successfully written a Kaitai definition for one of them:
meta:
  id: struct_02ea
  endian: le
seq:
  - id: unk_00
    type: s4
  - id: fields
    type: field_block
    repeat: eos
types:
  sentinel:
    seq:
      - id: id
        type: u2
      - id: end_offset
        type: u4
  field_block:
    seq:
      - id: sentinel
        type: sentinel
      - id: value
        type:
          switch-on: sentinel.id
          cases:
            0xF0: u1
            0xF1: u1
            0xF2: u1
            0xF3: u1
            0xF4: u4
            0xF5: u4
        size: sentinel.end_offset - _root._io.pos
Handling things this way does work, and I could likely map out the entire format like this. However, when it comes time to compile this definition into another language, things get nasty.
Since I am wrapping each field in a field_block, the generated code stores these values in that type of object. This is incredibly inefficient when half of the generated field_block objects store a single integer. It would also require the consuming code to iterate through a list of each field block in order to get the actual field's value.
Ideally, I would like to define this structure so that the sentinels are only parsed while Kaitai is reading the data, and each value would be mapped to a field on the parent structure.
Is this possible? This technology is really cool, and I'd love to use it in my project, but I feel like the overhead that this is generating is a lot more trouble than it's worth.
Here's an example of the definition when compiled into C#:
using System.Collections.Generic;
namespace Kaitai
{
public partial class Struct02ea : KaitaiStruct
{
public static Struct02ea FromFile(string fileName)
{
return new Struct02ea(new KaitaiStream(fileName));
}
public Struct02ea(KaitaiStream p__io, KaitaiStruct p__parent = null, Struct02ea p__root = null) : base(p__io)
{
m_parent = p__parent;
m_root = p__root ?? this;
_read();
}
private void _read()
{
_unk00 = m_io.ReadS4le();
_fields = new List<FieldBlock>();
{
var i = 0;
while (!m_io.IsEof) {
_fields.Add(new FieldBlock(m_io, this, m_root));
i++;
}
}
}
public partial class Sentinel : KaitaiStruct
{
public static Sentinel FromFile(string fileName)
{
return new Sentinel(new KaitaiStream(fileName));
}
public Sentinel(KaitaiStream p__io, Struct02ea.FieldBlock p__parent = null, Struct02ea p__root = null) : base(p__io)
{
m_parent = p__parent;
m_root = p__root;
_read();
}
private void _read()
{
_id = m_io.ReadU2le();
_endOffset = m_io.ReadU4le();
}
private ushort _id;
private uint _endOffset;
private Struct02ea m_root;
private Struct02ea.FieldBlock m_parent;
public ushort Id { get { return _id; } }
public uint EndOffset { get { return _endOffset; } }
public Struct02ea M_Root { get { return m_root; } }
public Struct02ea.FieldBlock M_Parent { get { return m_parent; } }
}
public partial class FieldBlock : KaitaiStruct
{
public static FieldBlock FromFile(string fileName)
{
return new FieldBlock(new KaitaiStream(fileName));
}
public FieldBlock(KaitaiStream p__io, Struct02ea p__parent = null, Struct02ea p__root = null) : base(p__io)
{
m_parent = p__parent;
m_root = p__root;
_read();
}
private void _read()
{
_sentinel = new Sentinel(m_io, this, m_root);
switch (Sentinel.Id) {
case 243: {
_value = m_io.ReadU1();
break;
}
case 244: {
_value = m_io.ReadU4le();
break;
}
case 245: {
_value = m_io.ReadU4le();
break;
}
case 241: {
_value = m_io.ReadU1();
break;
}
case 240: {
_value = m_io.ReadU1();
break;
}
case 242: {
_value = m_io.ReadU1();
break;
}
default: {
_value = m_io.ReadBytes((Sentinel.EndOffset - M_Root.M_Io.Pos));
break;
}
}
}
private Sentinel _sentinel;
private object _value;
private Struct02ea m_root;
private Struct02ea m_parent;
public Sentinel Sentinel { get { return _sentinel; } }
public object Value { get { return _value; } }
public Struct02ea M_Root { get { return m_root; } }
public Struct02ea M_Parent { get { return m_parent; } }
}
private int _unk00;
private List<FieldBlock> _fields;
private Struct02ea m_root;
private KaitaiStruct m_parent;
public int Unk00 { get { return _unk00; } }
public List<FieldBlock> Fields { get { return _fields; } }
public Struct02ea M_Root { get { return m_root; } }
public KaitaiStruct M_Parent { get { return m_parent; } }
}
}
Affiliation disclaimer: I'm a Kaitai Struct maintainer (see my GitHub profile).
Since I am wrapping each field in a field_block, the generated code stores these values in that type of object. This is incredibly inefficient when half of the generated field_block objects store a single integer. It would also require the consuming code to iterate through a list of each field block in order to get the actual field's value.
I think that rather than trying to describe the entire format with one ultimate Kaitai Struct specification, it's better not to let the generated code parse all the fields automatically. Move the parsing control to your application code, where you use the type Struct02ea.FieldBlock that represents an individual field, and basically replicate the "repeat until end of stream" loop that the generated code you posted was doing:
_fields = new List<FieldBlock>();
{
var i = 0;
while (!m_io.IsEof) {
_fields.Add(new FieldBlock(m_io, this, m_root));
i++;
}
}
The advantage of doing so is that you can adjust the loop to fit your needs. To avoid the overhead you describe, you'll probably want to keep the Struct02ea.FieldBlock object in a local variable inside the loop body, pull only the values you care about (save them in your compact, consumer-friendly output structures) and let it go out of scope when the loop iteration ends. This allows each original FieldBlock object to be garbage-collected once you've processed it, so its overhead is limited to a single instance rather than multiplied by the number of fields in the file.
The most straightforward and seamless way to prevent the Kaitai Struct-generated code from parsing the fields (but otherwise keep everything the same) is to add if: false in the KSY specification, as @webbnh suggested in a GitHub issue:
seq:
  - id: unk_00
    type: s4
  - id: fields
    type: field_block
    repeat: eos
    if: false # add this
The if: false works better than omitting it from seq entirely, because the kaitai-struct-compiler has occasional troubles with unused types (when compiling the KSY spec with unused types, you may get an error "Unable to derive _parent type in ..." due to a compiler bug). But with this if: false trick, you can't run into them because the field_block type is no longer unused.
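Putting the two pieces together, the application-side loop might look roughly like this in C# (a sketch under the assumptions above; MyCompactRecord and its properties are hypothetical stand-ins for your consumer-friendly output type):

// Sketch: with `if: false` in place, new Struct02ea(io) reads only
// unk_00 and leaves the stream positioned at the first field block.
var io = new KaitaiStream("data.bin");
var root = new Struct02ea(io);
var record = new MyCompactRecord();

while (!io.IsEof)
{
    // the FieldBlock is a local, so each wrapper becomes eligible for
    // garbage collection as soon as the iteration that read it ends
    var block = new Struct02ea.FieldBlock(io, root, root);
    switch (block.Sentinel.Id)
    {
        case 0xF0:
            record.SomeByteField = (byte)block.Value;
            break;
        case 0xF4:
            record.SomeUintField = (uint)block.Value;
            break;
        // ids you don't care about are simply skipped
    }
}

// hypothetical compact output type - one plain field per id you keep
public class MyCompactRecord
{
    public byte SomeByteField { get; set; }
    public uint SomeUintField { get; set; }
}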
I'm learning how to use CustomScalar in graphql-dotnet.
I have a tinyint column in my table, and from what I have read I'm supposed to use byte for this column in C#. After some research I found out that I need to create a ByteGraphType, but I'm having trouble doing that.
I got the ByteGraphType example from this link https://github.com/graphql-dotnet/graphql-dotnet/issues/458, so I think it should work.
With this code I can query the table; however, my mutation is not working. I didn't find an example demonstrating what a mutation would look like with a byte column. I tried what is shown in my code example, but on the line var avaliacao = context.GetArgument<Avaliacao>("avaliacao");, my argument's avaliacao.Nota comes back null and I'm not sure how to proceed.
Can someone help me?
Thank you
HERE'S MY CODE:
//Model
[Column("nota")]
public byte Nota { get; set; }
//Type
Field<ByteGraphType>("Nota", resolve: context => context.Source.Nota);
//InputType
Field<ByteGraphType>("nota");
//Query
Field<ListGraphType<AvaliacaoType>>(
"avaliacoes",
resolve: context => contextServiceLocator.AvaliacaoRepository.All());
//Mutation
Field<AvaliacaoType>(
"createAvaliacao",
arguments: new QueryArguments(
new QueryArgument<NonNullGraphType<AvaliacaoInputType>> { Name = "avaliacao" }
),
resolve: context =>
{
var schema = new Schema();
schema.RegisterValueConverter(new ByteValueConverter());
var avaliacao = context.GetArgument<Avaliacao>("avaliacao");
avaliacao.Nota.AstFromValue(schema, new ByteGraphType());
return contextServiceLocator.AvaliacaoRepository.Add(avaliacao);
});
//ByteGraphType
using GraphQL.Language.AST;
using GraphQL.Types;
using System;
namespace Api.Helpers
{
public class ByteGraphType : ScalarGraphType
{
public ByteGraphType()
{
Name = "Byte";
}
public override object ParseLiteral(IValue value)
{
var byteVal = value as ByteValue;
return byteVal?.Value;
}
public override object ParseValue(object value)
{
if (value == null)
return null;
try
{
var result = Convert.ToByte(value);
return result;
}
catch (FormatException)
{
return null;
}
}
public override object Serialize(object value)
{
return ParseValue(value).ToString();
}
public class ByteValueConverter : IAstFromValueConverter
{
public bool Matches(object value, IGraphType type)
{
return value is byte;
}
public IValue Convert(object value, IGraphType type)
{
return new ByteValue((byte)value);
}
}
public class ByteValue : ValueNode<byte>
{
public ByteValue(byte value)
{
Value = value;
}
protected override bool Equals(ValueNode<byte> node)
{
return Value == node.Value;
}
}
}
}
What I need is to be able to save a record in a table that has a tinyint column. If I change the type in my code to int, I can mutate but can't query.
I changed my CustomScalar and it worked:
using GraphQL.Language.AST;
using GraphQL.Types;
using System;
namespace Api.Helpers
{
public class ByteGraphType : ScalarGraphType
{
public ByteGraphType()
{
Name = "Byte";
Description = "ByteGraphType";
}
/// <inheritdoc />
public override object Serialize(object value)
{
return ParseValue(value).ToString();
}
/// <inheritdoc />
public override object ParseValue(object value)
{
byte result;
if (byte.TryParse(value?.ToString() ?? string.Empty, out result))
{
return result;
}
return null;
}
/// <inheritdoc />
public override object ParseLiteral(IValue value)
{
if (value is StringValue)
{
return ParseValue(((StringValue)value).Value);
}
return null;
}
}
}
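To see why the original mutation came back with a null Nota, note the two entry points: variables and runtime values go through ParseValue, while inline literals in the query document go through ParseLiteral, which here only accepts string literals. A quick sketch (illustrative values only; uses the GraphQL.Language.AST types already imported above):

var scalar = new ByteGraphType();

// runtime values and variables go through ParseValue:
var a = scalar.ParseValue("42"); // (byte)42
var b = scalar.ParseValue(42);   // (byte)42

// inline literals go through ParseLiteral; only string literals are
// handled, so nota: "42" parses but nota: 42 comes back null
var c = scalar.ParseLiteral(new StringValue("42")); // (byte)42
var d = scalar.ParseLiteral(new IntValue(42));      // null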
The SDK 7.x is not working on Unity 5.1.0f3; I always get a "version not found" error.
Has anyone seen this error?
Actually, it's just a warning. But you can fix it.
There are several places where the Facebook plugin calls
FBBuildVersionAttribute.GetVersionAttributeOfType(typeof(AbstractFacebook));
So first, you need to modify FBBuildVersionAttribute to this:
// we are going to apply this attribute to Class
// instead of Assembly
// also make it inheritable for all implementations
[AttributeUsage(AttributeTargets.Class, Inherited = true)]
public class FBBuildVersionAttribute : Attribute
{
private DateTime buildDate;
private string buildHash;
private string buildVersion;
private string sdkVersion;
public DateTime Date { get { return buildDate; } }
public string Hash { get { return buildHash; } }
public string SdkVersion { get { return sdkVersion; } }
public string BuildVersion { get { return buildVersion; } }
public FBBuildVersionAttribute(string sdkVersion, string buildVersion)
{
this.buildVersion = buildVersion;
var parts = buildVersion.Split('.');
buildDate = DateTime.ParseExact(parts[0], "yyMMdd", System.Globalization.CultureInfo.InvariantCulture);
buildHash = parts[1];
this.sdkVersion = sdkVersion;
}
public override string ToString()
{
return buildVersion;
}
public static FBBuildVersionAttribute GetVersionAttributeOfType(Type type)
{
foreach (FBBuildVersionAttribute attribute in getAttributes(type))
{
return attribute;
}
return null;
}
private static FBBuildVersionAttribute[] getAttributes(Type type)
{
if (type == null)
throw new ArgumentNullException("type");
// we want to get attributes from the type instead of the assembly
return (FBBuildVersionAttribute[])(type.GetCustomAttributes(typeof(FBBuildVersionAttribute), false));
}
}
Now you just need to add this attribute to AbstractFacebook:
[FBBuildVersionAttribute("7.0.1", "150604.98558e55096475c")]
public abstract class AbstractFacebook : MonoBehaviour
{
// ...
}
Note that the 98558e55096475c part is a placeholder string. It's not the actual build hash, because I don't have one.
Get the latest version of the FB Unity SDK. The change log says it's fixed now.
https://developers.facebook.com/docs/unity/change-log
Like several other people, I'm having problems serializing Entity Framework objects, so that I can send the data over AJAX in a JSON format.
I've got the following server-side method, which I'm attempting to call using AJAX through jQuery
[WebMethod]
public static IEnumerable<Message> GetAllMessages(int officerId)
{
SIBSv2Entities db = new SIBSv2Entities();
return (from m in db.MessageRecipients
where m.OfficerId == officerId
select m.Message).AsEnumerable<Message>();
}
Calling this via AJAX results in this error:
A circular reference was detected while serializing an object of type \u0027System.Data.Metadata.Edm.AssociationType
This is because of the way Entity Framework creates circular references to keep all the objects related and accessible server-side.
I came across the following code from http://hellowebapps.com/2010-09-26/producing-json-from-entity-framework-4-0-generated-classes/ which claims to get around this problem by capping the maximum depth for references. I've added the code below, because I had to tweak it slightly to get it to work (all angle brackets are missing from the code on the website):
using System.Web.Script.Serialization;
using System.Collections.Generic;
using System.Collections;
using System.Linq;
using System;
public class EFObjectConverter : JavaScriptConverter
{
private int _currentDepth = 1;
private readonly int _maxDepth = 2;
private readonly List<int> _processedObjects = new List<int>();
private readonly Type[] _builtInTypes = new[]{
typeof(bool),
typeof(byte),
typeof(sbyte),
typeof(char),
typeof(decimal),
typeof(double),
typeof(float),
typeof(int),
typeof(uint),
typeof(long),
typeof(ulong),
typeof(short),
typeof(ushort),
typeof(string),
typeof(DateTime),
typeof(Guid)
};
public EFObjectConverter( int maxDepth = 2,
EFObjectConverter parent = null)
{
_maxDepth = maxDepth;
if (parent != null)
{
_currentDepth += parent._currentDepth;
}
}
public override object Deserialize( IDictionary<string,object> dictionary, Type type, JavaScriptSerializer serializer)
{
return null;
}
public override IDictionary<string,object> Serialize(object obj, JavaScriptSerializer serializer)
{
_processedObjects.Add(obj.GetHashCode());
Type type = obj.GetType();
var properties = from p in type.GetProperties()
where p.CanWrite &&
p.CanRead &&
_builtInTypes.Contains(p.PropertyType)
select p;
var result = properties.ToDictionary(
property => property.Name,
property => (Object)(property.GetValue(obj, null)
== null
? ""
: property.GetValue(obj, null).ToString().Trim())
);
if (_maxDepth >= _currentDepth)
{
var complexProperties = from p in type.GetProperties()
where p.CanWrite &&
p.CanRead &&
!_builtInTypes.Contains(p.PropertyType) &&
!_processedObjects.Contains(p.GetValue(obj, null)
== null
? 0
: p.GetValue(obj, null).GetHashCode())
select p;
foreach (var property in complexProperties)
{
var js = new JavaScriptSerializer();
js.RegisterConverters(new List<JavaScriptConverter> { new EFObjectConverter(_maxDepth - _currentDepth, this) });
result.Add(property.Name, js.Serialize(property.GetValue(obj, null)));
}
}
return result;
}
public override IEnumerable<System.Type> SupportedTypes
{
get
{
return GetType().Assembly.GetTypes();
}
}
}
However even when using that code, in the following way:
var js = new System.Web.Script.Serialization.JavaScriptSerializer();
js.RegisterConverters(new List<System.Web.Script.Serialization.JavaScriptConverter> { new EFObjectConverter(2) });
return js.Serialize(messages);
I'm still seeing the A circular reference was detected... exception being thrown!
I solved these issues with the following classes:
using System.Collections.Generic;
using System.Web.Script.Serialization;

public class EFJavaScriptSerializer : JavaScriptSerializer
{
public EFJavaScriptSerializer()
{
RegisterConverters(new List<JavaScriptConverter>{new EFJavaScriptConverter()});
}
}
and
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using System.Reflection;
using System.Web.Script.Serialization;

public class EFJavaScriptConverter : JavaScriptConverter
{
private int _currentDepth = 1;
private readonly int _maxDepth = 1;
private readonly List<object> _processedObjects = new List<object>();
private readonly Type[] _builtInTypes = new[]
{
typeof(int?),
typeof(double?),
typeof(bool?),
typeof(bool),
typeof(byte),
typeof(sbyte),
typeof(char),
typeof(decimal),
typeof(double),
typeof(float),
typeof(int),
typeof(uint),
typeof(long),
typeof(ulong),
typeof(short),
typeof(ushort),
typeof(string),
typeof(DateTime),
typeof(DateTime?),
typeof(Guid)
};
public EFJavaScriptConverter() : this(1, null) { }
public EFJavaScriptConverter(int maxDepth = 1, EFJavaScriptConverter parent = null)
{
_maxDepth = maxDepth;
if (parent != null)
{
_currentDepth += parent._currentDepth;
}
}
public override object Deserialize(IDictionary<string, object> dictionary, Type type, JavaScriptSerializer serializer)
{
return null;
}
public override IDictionary<string, object> Serialize(object obj, JavaScriptSerializer serializer)
{
_processedObjects.Add(obj.GetHashCode());
var type = obj.GetType();
var properties = from p in type.GetProperties()
where p.CanRead && p.GetIndexParameters().Count() == 0 &&
_builtInTypes.Contains(p.PropertyType)
select p;
var result = properties.ToDictionary(
p => p.Name,
p => (Object)TryGetStringValue(p, obj));
if (_maxDepth >= _currentDepth)
{
var complexProperties = from p in type.GetProperties()
where p.CanRead &&
p.GetIndexParameters().Count() == 0 &&
!_builtInTypes.Contains(p.PropertyType) &&
p.Name != "RelationshipManager" &&
!AlreadyAdded(p, obj)
select p;
foreach (var property in complexProperties)
{
var complexValue = TryGetValue(property, obj);
if(complexValue != null)
{
var js = new EFJavaScriptConverter(_maxDepth - _currentDepth, this);
result.Add(property.Name, js.Serialize(complexValue, new EFJavaScriptSerializer()));
}
}
}
return result;
}
private bool AlreadyAdded(PropertyInfo p, object obj)
{
var val = TryGetValue(p, obj);
return _processedObjects.Contains(val == null ? 0 : val.GetHashCode());
}
private static object TryGetValue(PropertyInfo p, object obj)
{
var parameters = p.GetIndexParameters();
if (parameters.Length == 0)
{
return p.GetValue(obj, null);
}
else
{
//cant serialize these
return null;
}
}
private static object TryGetStringValue(PropertyInfo p, object obj)
{
if (p.GetIndexParameters().Length == 0)
{
var val = p.GetValue(obj, null);
return val;
}
else
{
return string.Empty;
}
}
public override IEnumerable<Type> SupportedTypes
{
get
{
var types = new List<Type>();
//ef types
types.AddRange(Assembly.GetAssembly(typeof(DbContext)).GetTypes());
//model types
types.AddRange(Assembly.GetAssembly(typeof(BaseViewModel)).GetTypes());
return types;
}
}
}
You can now safely make a call like new EFJavaScriptSerializer().Serialize(obj)
Update: since Telerik v1.3+ you can override the GridActionAttribute.CreateActionResult method, and hence you can easily integrate this serializer into specific controller methods by applying your custom [Grid] attribute:
[Grid]
public ActionResult _GetOrders(int id)
{
return new GridModel(Service.GetOrders(id));
}
and
public class GridAttribute : GridActionAttribute, IActionFilter
{
/// <summary>
/// Determines the depth that the serializer will traverse
/// </summary>
public int SerializationDepth { get; set; }
/// <summary>
/// Initializes a new instance of the <see cref="GridActionAttribute"/> class.
/// </summary>
public GridAttribute()
: base()
{
ActionParameterName = "command";
SerializationDepth = 1;
}
protected override ActionResult CreateActionResult(object model)
{
return new EFJsonResult
{
Data = model,
JsonRequestBehavior = JsonRequestBehavior.AllowGet,
MaxSerializationDepth = SerializationDepth
};
}
}
and finally..
public class EFJsonResult : JsonResult
{
const string JsonRequest_GetNotAllowed = "This request has been blocked because sensitive information could be disclosed to third party web sites when this is used in a GET request. To allow GET requests, set JsonRequestBehavior to AllowGet.";
public EFJsonResult()
{
MaxJsonLength = 1024000000;
RecursionLimit = 10;
MaxSerializationDepth = 1;
}
public int MaxJsonLength { get; set; }
public int RecursionLimit { get; set; }
public int MaxSerializationDepth { get; set; }
public override void ExecuteResult(ControllerContext context)
{
if (context == null)
{
throw new ArgumentNullException("context");
}
if (JsonRequestBehavior == JsonRequestBehavior.DenyGet &&
String.Equals(context.HttpContext.Request.HttpMethod, "GET", StringComparison.OrdinalIgnoreCase))
{
throw new InvalidOperationException(JsonRequest_GetNotAllowed);
}
var response = context.HttpContext.Response;
if (!String.IsNullOrEmpty(ContentType))
{
response.ContentType = ContentType;
}
else
{
response.ContentType = "application/json";
}
if (ContentEncoding != null)
{
response.ContentEncoding = ContentEncoding;
}
if (Data != null)
{
var serializer = new JavaScriptSerializer
{
MaxJsonLength = MaxJsonLength,
RecursionLimit = RecursionLimit
};
serializer.RegisterConverters(new List<JavaScriptConverter> { new EFJavaScriptConverter(MaxSerializationDepth) });
response.Write(serializer.Serialize(Data));
}
}
}
You can also detach the object from the context, which removes the navigation properties so that it can be serialized. For my data repository classes that are used with JSON, I use something like this:
public DataModel.Page GetPage(Guid idPage, bool detach = false)
{
var results = from p in DataContext.Pages
where p.idPage == idPage
select p;
if (results.Count() == 0)
return null;
else
{
var result = results.First();
if (detach)
DataContext.Detach(result);
return result;
}
}
By default the returned object will have all of the complex/navigation properties, but by setting detach = true it will remove those properties and return the base object only. For a list of objects the implementation looks like this
public List<DataModel.Page> GetPageList(Guid idSite, bool detach = false)
{
var results = from p in DataContext.Pages
where p.idSite == idSite
select p;
if (results.Count() > 0)
{
if (detach)
{
List<DataModel.Page> retValue = new List<DataModel.Page>();
foreach (var result in results)
{
DataContext.Detach(result);
retValue.Add(result);
}
return retValue;
}
else
return results.ToList();
}
else
return new List<DataModel.Page>();
}
I have just successfully tested this code.
It may be that in your case your Message object is in a different assembly? The overridden SupportedTypes property returns only the types in the converter's own assembly, so when Serialize is called the JavaScriptSerializer falls back to the standard JavaScriptConverter.
You should be able to verify this by debugging.
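If that's the cause, one possible fix is to widen SupportedTypes so it also claims the assembly that contains your entities. A sketch (Message stands in for any entity type from the other assembly; needs using System.Linq;):

public override IEnumerable<Type> SupportedTypes
{
    get
    {
        // cover the converter's own assembly plus the entities' assembly
        return GetType().Assembly.GetTypes()
            .Concat(typeof(Message).Assembly.GetTypes())
            .Distinct();
    }
}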
Your error occurred because of some "Reference" classes generated by EF for entities with 1:1 relations, which the JavaScriptSerializer fails to serialize.
I've used a workaround by adding a new condition:
!p.Name.EndsWith("Reference")
The code to get the complex properties looks like this:
var complexProperties = from p in type.GetProperties()
where p.CanWrite &&
p.CanRead &&
!p.Name.EndsWith("Reference") &&
!_builtInTypes.Contains(p.PropertyType) &&
!_processedObjects.Contains(p.GetValue(obj, null)
== null
? 0
: p.GetValue(obj, null).GetHashCode())
select p;
Hope this helps you.
I had a similar problem with pushing my view via Ajax to UI components.
I also found and tried to use that code sample you provided. Some problems I had with that code:
SupportedTypes wasn't grabbing the types I needed, so the converter wasn't being called
If the maximum depth is hit, the serialization would be truncated
It threw out any other converters I had on the existing serializer by creating its own new JavaScriptSerializer
Here are the fixes I implemented for those issues:
Reusing the same serializer
I simply reused the existing serializer that is passed into Serialize to solve this problem. This broke the depth hack though.
Truncating on already-visited, rather than on depth
Instead of truncating on depth, I created a HashSet<object> of already-seen instances (with a custom IEqualityComparer that checked reference equality). I simply didn't recurse if I found an instance I'd already seen. This is the same detection mechanism built into the JavaScriptSerializer itself, so it worked quite well.
The only problem with this solution is that the serialization output isn't very deterministic. The order of truncation depends strongly on the order in which reflection finds the properties. You could solve this (with a perf hit) by sorting before recursing.
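For reference, the already-visited check can be built on a small reference-equality comparer, roughly like this (written out by hand because older .NET versions have no built-in one):

using System.Collections.Generic;
using System.Runtime.CompilerServices;

sealed class ReferenceEqualityComparer : IEqualityComparer<object>
{
    bool IEqualityComparer<object>.Equals(object x, object y)
    {
        return ReferenceEquals(x, y);
    }

    int IEqualityComparer<object>.GetHashCode(object obj)
    {
        // hash by identity, ignoring any overridden GetHashCode
        return RuntimeHelpers.GetHashCode(obj);
    }
}

// inside the converter:
// var seen = new HashSet<object>(new ReferenceEqualityComparer());
// if (!seen.Add(value)) return result; // already visited - truncate here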
SupportedTypes needed the right types
My JavaScriptConverter couldn't live in the same assembly as my model. If you plan to reuse this converter code, you'll probably run into the same problem.
To solve this I had to pre-traverse the object tree, keeping a HashSet<Type> of already seen types (to avoid my own infinite recursion), and pass that to the JavaScriptConverter before registering it.
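That pre-traversal can be as small as a recursive property walk, roughly like this (a sketch; generic collection types would need their type arguments unwrapped as well):

static void CollectTypes(Type type, HashSet<Type> seen)
{
    // stop on null or already-seen types to avoid infinite recursion
    if (type == null || !seen.Add(type))
        return;

    foreach (var property in type.GetProperties())
        CollectTypes(property.PropertyType, seen);
}

// usage: collect from each root entity type, then hand the set to the
// converter before registering it with the serializer
// var types = new HashSet<Type>();
// CollectTypes(typeof(Message), types);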
Looking back on my solution, I would now use code generation templates to create a list of the entity types. This would be much more foolproof (it uses simple iteration), and have much better perf since it would produce a list at compile time. I'd still pass this to the converter so it could be reused between models.
My final solution
I threw out that code and tried again :)
I simply wrote code to project onto new types ("ViewModel" types - in your case, service contract types) before doing my serialization. The intention of my code was made more explicit, it allowed me to serialize just the data I wanted, and it didn't risk slipping in queries by accident (e.g. serializing my whole DB).
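In the question's terms, that projection boils down to something like this (a sketch; MessageViewModel and its Id/Text properties are hypothetical):

[WebMethod]
public static List<MessageViewModel> GetAllMessages(int officerId)
{
    var db = new SIBSv2Entities();
    return (from m in db.MessageRecipients
            where m.OfficerId == officerId
            select new MessageViewModel
            {
                Id = m.Message.Id,
                Text = m.Message.Text
            }).ToList();
}

// a flat DTO: no navigation properties, no cycles, so the default
// serializer handles it without any custom converters
public class MessageViewModel
{
    public int Id { get; set; }
    public string Text { get; set; }
}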
My types were fairly simple, and I didn't need most of them for my view. I might look into AutoMapper to do some of this projection in the future.