Is there a high performance way to replace the BinaryFormatter in .NET 5?

Before .NET 5, we serialized and deserialized between objects and bytes with code like this:
private static byte[] StructToBytes<T>(T t)
{
    using (var ms = new MemoryStream())
    {
        var bf = new BinaryFormatter();
        bf.Serialize(ms, t);
        return ms.ToArray();
    }
}

private static T BytesToStruct<T>(byte[] bytes)
{
    using (var memStream = new MemoryStream())
    {
        var binForm = new BinaryFormatter();
        memStream.Write(bytes, 0, bytes.Length);
        memStream.Seek(0, SeekOrigin.Begin);
        var obj = binForm.Deserialize(memStream);
        return (T)obj;
    }
}
But BinaryFormatter is being removed for security reasons:
https://learn.microsoft.com/en-us/dotnet/standard/serialization/binaryformatter-security-guide
So is there a simple but high-performance way to replace BinaryFormatter?

In my project, which we recently migrated from .NET Core 3.1 to .NET 5, I swapped out our BinaryFormatter code for protobuf-net: https://github.com/protobuf-net/protobuf-net
The code was almost exactly the same, and the project is very reputable, with (currently) 22 million downloads and 3.2k stars on GitHub. It is very fast and has none of the security baggage surrounding BinaryFormatter.
Here's my class for byte[] serialization:
public static class Binary
{
    /// <summary>
    /// Convert an object to a Byte Array, using Protobuf.
    /// </summary>
    public static byte[] ObjectToByteArray(object obj)
    {
        if (obj == null)
            return null;

        using var stream = new MemoryStream();
        Serializer.Serialize(stream, obj);
        return stream.ToArray();
    }

    /// <summary>
    /// Convert a byte array to an Object of T, using Protobuf.
    /// </summary>
    public static T ByteArrayToObject<T>(byte[] arrBytes)
    {
        using var stream = new MemoryStream();
        // Ensure that our stream is at the beginning.
        stream.Write(arrBytes, 0, arrBytes.Length);
        stream.Seek(0, SeekOrigin.Begin);
        return Serializer.Deserialize<T>(stream);
    }
}
I did have to add attributes to the class I serialized. It was decorated with [Serializable] only, and although I understand protobuf-net can work with a number of common attributes, that one alone didn't work. From the example on GitHub:
[ProtoContract]
class Person {
    [ProtoMember(1)]
    public int Id { get; set; }
    [ProtoMember(2)]
    public string Name { get; set; }
    [ProtoMember(3)]
    public Address Address { get; set; }
}

[ProtoContract]
class Address {
    [ProtoMember(1)]
    public string Line1 { get; set; }
    [ProtoMember(2)]
    public string Line2 { get; set; }
}
In my case I am caching things in Redis, and it worked great.
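For illustration, here is a minimal round-trip sketch using the helper class above together with the Person contract from the GitHub example (the values are made up):
// Hypothetical round trip through the protobuf-net helpers shown above.
var person = new Person
{
    Id = 1,
    Name = "Ada",
    Address = new Address { Line1 = "1 Main St", Line2 = "Springfield" }
};

byte[] payload = Binary.ObjectToByteArray(person);        // serialize to bytes
var restored = Binary.ByteArrayToObject<Person>(payload); // deserialize back to a Person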
It is also possible to re-enable BinaryFormatter in your .csproj file:
<PropertyGroup>
  <TargetFramework>net5.0</TargetFramework>
  <EnableUnsafeBinaryFormatterSerialization>true</EnableUnsafeBinaryFormatterSerialization>
</PropertyGroup>
...But it's a bad idea. BinaryFormatter is responsible for many of .NET's historical vulnerabilities, and it can't be fixed. It will likely become completely unavailable in future versions of .NET, so replacing it is the right move.

If you are using .NET 5 or greater, you can use the new System.Text.Json.JsonSerializer.Serialize and System.Text.Json.JsonSerializer.Deserialize methods, like so:
public static class Binary
{
    /// <summary>
    /// Convert an object to a Byte Array.
    /// </summary>
    public static byte[] ObjectToByteArray(object objData)
    {
        if (objData == null)
            return default;

        return Encoding.UTF8.GetBytes(JsonSerializer.Serialize(objData, GetJsonSerializerOptions()));
    }

    /// <summary>
    /// Convert a byte array to an Object of T.
    /// </summary>
    public static T ByteArrayToObject<T>(byte[] byteArray)
    {
        if (byteArray == null || !byteArray.Any())
            return default;

        return JsonSerializer.Deserialize<T>(byteArray, GetJsonSerializerOptions());
    }

    private static JsonSerializerOptions GetJsonSerializerOptions()
    {
        return new JsonSerializerOptions()
        {
            PropertyNamingPolicy = null,
            WriteIndented = true,
            AllowTrailingCommas = true,
            DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
        };
    }
}
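As a small variant (a sketch on my part, not something the original answer relied on), System.Text.Json can also produce UTF-8 bytes directly, which skips the intermediate string. Inside the helpers above this would look roughly like:
// Serialize straight to UTF-8 bytes instead of string -> Encoding.UTF8.GetBytes.
byte[] bytes = JsonSerializer.SerializeToUtf8Bytes(objData, GetJsonSerializerOptions());
// Deserialize accepts the UTF-8 bytes directly (as a ReadOnlySpan<byte>).
T value = JsonSerializer.Deserialize<T>(bytes, GetJsonSerializerOptions());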

While this is an old thread, it's still relevant, especially if you find yourself dealing with code that stores .NET data in Memcached, for example (or Redis, or secondary storage on-premises or in the cloud). BinaryFormatter has the security problems mentioned in the OP, and it also has performance and size issues.
A great alternative is the MessagePack format, and more specifically the MessagePack NuGet package for .NET solutions.
It's secure, maintained, faster, and smaller all around. See the benchmarks for details.
ZeroFormatter also appears to be a great alternative.
In today's cloud-centric solutions where sizing and capacity are important for lowering costs, these are extremely helpful.
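To give a flavor of the API, here is a minimal MessagePack-C# sketch; this is my own illustration rather than something from the original post, and the Person type is hypothetical:
using MessagePack;

[MessagePackObject]
public class Person
{
    [Key(0)] public int Id { get; set; }
    [Key(1)] public string Name { get; set; }
}

public static class MessagePackDemo
{
    // Round trip to and from a byte[] (e.g. for a Redis or Memcached value).
    public static Person RoundTrip()
    {
        byte[] bytes = MessagePackSerializer.Serialize(new Person { Id = 1, Name = "Ada" });
        return MessagePackSerializer.Deserialize<Person>(bytes);
    }
}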

There is an option to keep using it in .NET 5:
Just add
<EnableUnsafeBinaryFormatterSerialization>true</EnableUnsafeBinaryFormatterSerialization>
to the project file, like this:
<PropertyGroup>
  <TargetFramework>net5.0</TargetFramework>
  <EnableUnsafeBinaryFormatterSerialization>true</EnableUnsafeBinaryFormatterSerialization>
</PropertyGroup>
I believe it will work.

Related

Use EF Core to get a list of Longs

I have a stored procedure that returns me a list of IDs. (I then use this list of IDs as keys for objects.)
I am migrating this from .NET to .NET Core. In normal .NET I could use an extension library to get the numbers out like this:
var getOrderDetailIdsStoredProc = new GetOrderDetailIdsStoredProc()
{
NumberOfOrderDetailIdsNeeded = numberOfOrderDetailIdsNeeded
};
var orderDetailIds = contextProvider.Context.Database
.ExecuteStoredProcedure<long>(getOrderDetailIdsStoredProc);
But that library (EntityFrameworkExtras) is not working with EF Core (I found a version for EF Core, but it doesn't work.)
So I have been looking for other solutions:
DbContext.Database.ExecuteSqlCommand: Cannot return records, only output variables
DbSet.FromSQL: Can only be run on a DbSet<T> (basically it needs an entity type)
Right now, all I can think of is to make an entity called Number:
public class Number
{
public long Value;
}
public DbSet<Number> Numbers;
And then do something like this:
Numbers.FromSql("exec GenerateOrderDetailSequencedIds #numberNeeded", numberNeeded)
Aside from the fact that this is very ugly (making an entity out of a native type), I have no table to hook it up to, so I worry it will not work.
Is there any way in EF Core to run a stored procedure and get back a list of numbers?
NOTE: This worked, but was not compatible with BreezeJs (it could not deal with a DbQuery). See my other answer for what I ended up doing.
OrderDetailIdHolder.cs
public class OrderDetailIdHolder
{
public long NewId { get; set; }
}
MyEntitiesContext
public DbQuery<OrderDetailIdHolder> OrderDetailIdHolders { get; set; }
internal List<long> GetOrderDetailIds(int numberOfIdsNeeded)
{
var result = OrderDetailIdHolders.FromSql($"exec Sales.GenerateOrderDetailIds {numberOfIdsNeeded}").ToList();
return result.Select(x=>x.NewId).ToList();
}
This adds a bit of extra complexity just to get a list of longs, but it works.
It is important to note that the property (NewId in this case) must match what is returned from the sproc. Also, the type is not a DbSet. It is a DbQuery.
It is also important to note that this approach is only for EF Core 2.2. EF Core 3 has a different way to do this (keyless entity types).
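For EF Core 3 and later, a rough sketch of the keyless-entity equivalent (my own illustration, not part of the original answer) could look like this inside the same DbContext:
using System.Collections.Generic;
using System.Linq;
using Microsoft.EntityFrameworkCore;

public class MyEntitiesContext : DbContext
{
    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Keyless entity: no primary key, never tracked, used only for query results.
        modelBuilder.Entity<OrderDetailIdHolder>().HasNoKey();
    }

    internal List<long> GetOrderDetailIds(int numberOfIdsNeeded)
    {
        // "exec ..." is non-composable SQL, so materialize before projecting.
        return Set<OrderDetailIdHolder>()
            .FromSqlInterpolated($"exec Sales.GenerateOrderDetailIds {numberOfIdsNeeded}")
            .AsEnumerable()
            .Select(x => x.NewId)
            .ToList();
    }
}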
This is what I ended up using:
public static List<T> SqlQueryList<T>(this DatabaseFacade database, string query, params SqlParameter[] sqlParameters)
{
    // The connection belongs to the DbContext, so we only open/close it here rather than dispose it.
    var conn = database.GetDbConnection();
    conn.Open();
    try
    {
        using (var command = conn.CreateCommand())
        {
            command.CommandText = query;
            command.Parameters.AddRange(sqlParameters);

            var result = new List<T>();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    T typedRow;
                    var row = reader.GetValue(0);
                    if (typeof(T).IsValueType)
                    {
                        typedRow = (T)row;
                    }
                    else
                    {
                        // Convert the column value (not the list) to T.
                        typedRow = (T)Convert.ChangeType(row, typeof(T));
                    }
                    result.Add(typedRow);
                }
            }
            return result;
        }
    }
    finally
    {
        conn.Close();
    }
}
Called like this:
var numberOfOrderDetailIdsNeededParam = new SqlParameter
{
    ParameterName = "@numberOfOrderDetailIdsNeeded",
    SqlDbType = SqlDbType.Int,
    Direction = ParameterDirection.Input
};
numberOfOrderDetailIdsNeededParam.Value = numberOfOrderDetailIdsNeeded;

var result = contextProvider.Context.Database.SqlQueryList<long>("exec Sales.GenerateOrderDetailIds @numberOfOrderDetailIdsNeeded", numberOfOrderDetailIdsNeededParam);
I did it this way because it was compatible with BreezeJs for .NET Core. Note that I only really tested this with Value types.

Is there a way to deserialize or load the last saved scene state, including GameObjects and other classes, instead of only storing simple fields via binary formatting?

I humbly ask for help solving this problem. I have worked through a quick guide to learn more about saving and loading game state. I already know how to use PlayerPrefs to store basic string, int, and float values.
Now I'm looking for a more effective way to store saved data via serialization. After my first few searches online, I watched some video tutorials, but they only cover storing basic fields (int, string, bool, float, etc.) in a created file. I tried the same approach on my own classes, but it didn't work unless I marked them as [Serializable].
Next, trying to save created GameObjects (prefabs or not) didn't work either, because it would require serializing the GameObject class itself. My first attempt followed this guide from Stack Overflow with the accepted answer, and I understand that saving a GameObject or other custom classes requires storing them converted into an .xml file.
Here are my two main problems. The first is that at runtime I get a NullPointerException even though I made sure all of the necessary objects were created. The error occurs at this line (ask for more source code if you need it):
DataContractSerializer ds = new DataContractSerializer (data2.GetType ()); // --> Serialize to .xml file.
MemoryStream stream = new MemoryStream();
ds.WriteObject (stream, data2); // --> The error occurs here.
stream.Seek (0, SeekOrigin.Begin);
file.Write (stream.GetBuffer (), 0, stream.GetBuffer ().Length);
file.Close();
As you can see, this is the part of the code that saves classes such as GameObject, List, and/or other custom classes into a file created in the persistent data directory.
The second problem can be tackled once the first is resolved: loading the last saved state. For normal fields such as integers or strings, using BinaryFormatter and FileStream works well to load values stored in a previously created file. I tried the same approach on stored custom classes such as GameObject, but it seems to require a different method (like this one), which is a bit hard to translate into Unity. I'm still looking for the best way to load stored classes from a file.
Here is the class I'm trying to deserialize; it contains the following fields.
[DataContract]
public class TreeData2 {
// - - - Spouse - - -
[DataMember] private List<GameObject> _masters;
public List<GameObject> masters {
get { return _masters; }
set { _masters = value; }
}
[DataMember] private List<GameObject> _targets;
public List<GameObject> targets {
get { return _targets; }
set { _targets = value; }
}
[DataMember] private List<FamilyDatabase> _familyGroup;
public List<FamilyDatabase> familyGroup {
get { return _familyGroup; }
set { _familyGroup = value; }
}
[DataMember] private GameObject _node;
public GameObject node {
get { return _node; }
set { _node = value; }
}
[DataMember] private List<string> _mothers;
public List<string> mothers {
get { return _mothers; }
set { _mothers = value; }
}
[DataMember] private List<string> _fathers;
public List<string> fathers {
get { return _fathers; }
set { _fathers = value; }
}
[DataMember] private List<GenerationDatabase> _genDb;
public List<GenerationDatabase> genDb {
get { return _genDb; }
set { _genDb = value; }
}
// - - - Root Action - - -
[DataMember] private List<GameObject> _child;
public List<GameObject> child {
get { return _child; }
set { _child = value; }
}
// Gen Database (Main)
[DataMember] private List<string> _mothersDB;
public List<string> mothersDB {
get { return _mothersDB; }
set { _mothersDB = value; }
}
[DataMember] private List<string> _fathersDB;
public List<string> fathersDB {
get { return _fathersDB; }
set { _fathersDB = value; }
}
[DataMember] private List<GameObject> _mastersDB;
public List<GameObject> mastersDB {
get { return _mastersDB; }
set { _mastersDB = value; }
}
[DataMember] private List<GameObject> _targetsDB;
public List<GameObject> targetsDB {
get { return _targetsDB; }
set { _targetsDB = value; }
}
[DataMember] private List<string> _mothersT;
public List<string> mothersT {
get { return _mothersT; }
set { _mothersT = value; }
}
[DataMember] private List<string> _fathersT;
public List<string> fathersT {
get { return _fathersT; }
set { _fathersT = value; }
}
[DataMember] private List<GameObject> _mastersT;
public List<GameObject> mastersT {
get { return _mastersT; }
set { _mastersT = value; }
}
[DataMember] private List<GameObject> _targetsT;
public List<GameObject> targetsT {
get { return _targetsT; }
set { _targetsT = value; }
}
}
data2 is the variable holding the TreeData2 instance, and yes, I'm building a family-tree-like structure in Unity for a game that shows unlock progress and stores lists of the branches. Here's a recap using that variable name while serializing the GameObject and List members:
FileStream file = File.Create (Application.persistentDataPath + "/check/treeData2.dat");
TreeData2 data2 = new TreeData2();
. . .
DataContractSerializer ds = new DataContractSerializer (data2.GetType ());
MemoryStream stream = new MemoryStream();
ds.WriteObject (stream, data2); // --> The error occurs here: a NullPointerException is thrown because the GameObject and List<T> members fail to be written to the .xml file.
stream.Seek (0, SeekOrigin.Begin);
file.Write (stream.GetBuffer (), 0, stream.GetBuffer ().Length);
file.Close();
string result = XElement.Parse(Encoding.ASCII.GetString(stream.GetBuffer()).Replace("\0", "")).ToString();
print ("SAVE TREE COMPLETE");
print ("Result: " + result);
Redirect from this original question from Game Development.
Let me start by clearing up a confusion I see often. In Unity, 'Serializable' serializes a class for display in the Inspector only; you could say it serializes your class into a format the Inspector can represent. Serializing often means converting into byte arrays, but it doesn't have to be a byte array; it can be any other format, e.g. strings.
Unity does not allow you to save your GameObjects directly; however, it doesn't stop you from recreating them to reflect some previous state. A GameObject may have many components, e.g. MeshRenderer, Camera, etc., and it could get rough serializing all of that down; imagine some big hierarchy!
Generally, you want to separate your behaviors from your data, i.e. your MonoBehaviour from your model/data (not a 3D model); it's a common OOP practice. You can then decide how to save and retrieve that data from disk. I used Json.NET to serialize my data models into JSON. That gave me the flexibility to save the data locally as well as send it over the network, but it's just an example.
For an example, if I had to serialize a user's inventory, I would make a class like this:
[Serializable]
public class Inventory
{
public List<InventoryItem> _items;
// Some serializable and transient variables
}
[Serializable]
public class InventoryItem
{
// Some serializable and transient variables
}
Then my Monobehaviour that would show inventory would look like this:
public class InventoryView : MonoBehaviour
{
public Inventory _inventory; //Now the inventory items show in the inspector also, because it is serialized.
void createViewFromInventory()
{ //... }
}
And now, anywhere I have reference to an Inventory object, I can serialize it however I wish to do so. With Json.Net, all I had to do looked something like this:
Inventory inv = getInventoryRef(); // get inventory here
string serializedInv = JsonConvert.SerializeObject(inv); // converts it into JSON
PlayerPrefs.SetString("inventory", serializedInv);
PlayerPrefs.Save();
And in order to retrieve the saved inventory, I would do something like this:
void loadInventory()
{
    string invStr = PlayerPrefs.GetString("inventory");
    Inventory inv = JsonConvert.DeserializeObject<Inventory>(invStr);
}
This way, the MonoBehaviour view classes only need to ACT upon the inventory state, and you don't need to save the other components on the GameObject. Hope it helps.
Unfortunately, after several experiments looking for an easy way to store GameObjects without hassle, I found there is no shortcut for preserving files plus game objects when saving and loading state. I gave up on the expectation of saving a scene in one go; there are many steps involved.
Right now, storing info about each game object's relevant components requires complex serializable classes and a meticulous way of splitting out and retrieving each piece of data. The explanation at this link also shows a more effective way of preserving data (though still not GameObjects themselves) by formatting it into a JSON file.
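For the plain-data part, here is a minimal Unity sketch of my own (not from the linked explanation) that saves a serializable data class as JSON under the persistent data path; the SaveData fields are hypothetical:
using System.IO;
using UnityEngine;

[System.Serializable]
public class SaveData
{
    public int unlockedBranches;   // example field, hypothetical
    public string[] motherNames;   // example field, hypothetical
}

public static class SaveSystem
{
    static string PathFor(string name) => Path.Combine(Application.persistentDataPath, name + ".json");

    public static void Save(SaveData data, string name)
    {
        // JsonUtility only handles plain serializable data, not GameObjects.
        File.WriteAllText(PathFor(name), JsonUtility.ToJson(data));
    }

    public static SaveData Load(string name)
    {
        string path = PathFor(name);
        return File.Exists(path) ? JsonUtility.FromJson<SaveData>(File.ReadAllText(path)) : null;
    }
}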
It is wise to plan a proper roadmap for tracking the last state from scratch if you want to build your own save/load system covering common variables, serializable classes, and game objects.
However, there is another way to save and load GameObjects using a solution found in this Unity Q&A section. It discusses a "SerializerHelper" that lets you serialize not only serializable classes but also game objects, scenes, and other non-serializable classes. Check this forum thread as well to understand how it works. You can also download a Unity package here to try out saving/loading GameObjects. (Registering a Dropbox account may be required.)

Are there any patterns for component versioning and backwards-compatibility using Windsor?

I have to support a new input file format in a system which uses Windsor. I also need to support the old version of the input file during a transition phase.
This will probably be repeated in future, and we'll again need to support the new and the next most recent format.
The import processing is handled by a component, and the new version includes significant code improvements that make it much more efficient than the old version. So what I'd like to do is have both the new and the old component in the system, and dynamically use one or the other based on the file metadata.
Is there a pattern for this type of scenario anyone can suggest?
The fact that you're using Windsor is pretty much irrelevant here. Always strive to find a container-independent solution. Here's one:
interface IImportProcessor {
bool CanHandleVersion(int version);
Stream Import(Stream input);
}
class ImportProcessorVersion1 : IImportProcessor {
public bool CanHandleVersion(int version) {
return version == 1;
}
public Stream Import(Stream input) {
// do stuff
return input;
}
}
class ImportProcessorVersion2 : IImportProcessor {
public bool CanHandleVersion(int version) {
return version == 2;
}
public Stream Import(Stream input) {
// do stuff
return input;
}
}
class MainImportProcessor: IImportProcessor {
private readonly IImportProcessor[] versionSpecificProcessors;
public MainImportProcessor(IImportProcessor[] versionSpecificProcessors) {
this.versionSpecificProcessors = versionSpecificProcessors;
}
public bool CanHandleVersion(int version) {
return versionSpecificProcessors.Any(p => p.CanHandleVersion(version));
}
private int FetchVersion(Stream input) {
// do stuff
return 1;
}
public Stream Import(Stream input) {
int version = FetchVersion(input);
var processor = versionSpecificProcessors.FirstOrDefault(p => p.CanHandleVersion(version));
if (processor == null)
throw new Exception("Unsupported version " + version);
return processor.Import(input);
}
}
Your app would take a dependency on IImportProcessor. The container is wired so that the default implementation of this interface is MainImportProcessor. The container is also wired so that MainImportProcessor gets all other implementations of IImportProcessor.
This way you can add implementations of IImportProcessor and each will be selected when appropriate.
It might be easier to wire things up if MainImportProcessor implements an interface different from IImportProcessor.
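For completeness, here is one possible Windsor wiring. This is my own sketch rather than part of the answer above, and as just noted it may be cleaner to register the version-specific processors under a separate interface so that MainImportProcessor does not end up in its own processor array:
using Castle.MicroKernel.Registration;
using Castle.MicroKernel.Resolvers.SpecializedResolvers;
using Castle.Windsor;

public static class ContainerBootstrapper
{
    public static IWindsorContainer Configure()
    {
        var container = new WindsorContainer();

        // Lets Windsor inject all IImportProcessor implementations into the array dependency.
        container.Kernel.Resolver.AddSubResolver(new CollectionResolver(container.Kernel));

        container.Register(
            // Registered first, so it becomes the default IImportProcessor for consumers.
            Component.For<IImportProcessor>().ImplementedBy<MainImportProcessor>(),
            Component.For<IImportProcessor>().ImplementedBy<ImportProcessorVersion1>(),
            Component.For<IImportProcessor>().ImplementedBy<ImportProcessorVersion2>());

        return container;
    }
}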
Another possibility could be implementing a chain of responsibility.

Need help loading XML data into XNA 4.0 project

I'd like to do this the right way if possible. I have XML data as follows:
<?xml version="1.0" encoding="utf-8"?>
<XnaContent>
<Asset Type="PG2.Dictionary">
<Letters TotalInstances="460100">
<Letter Count="34481">a</Letter>
...
<Letter Count="1361">z</Letter>
</Letters>
<Words Count="60516">
<Word>aardvark</Word>
...
<Word>zebra</Word>
</Words>
</Asset>
</XnaContent>
and I'd like to load this in (using Content.Load<Dictionary>) into one of these:
namespace PG2
{
    public class Dictionary
    {
        public class Letters
        {
            public int totalInstances;
            public List<Character> characters;

            public class Character
            {
                public int count;
                public char character;
            }
        }

        public class Words
        {
            public int count;
            public HashSet<string> words;
        }

        Letters letters;
        Words words;
    }
}
Can anyone help with either instructions or pointers to tutorials? I've found a few which come close but things seem to have changed slightly between 3.1 and 4.0 in ways which I don't understand and a lot of the documentation assumes knowledge I don't have. My understanding so far is that I need to make the Dictionary class Serializable but I can't seem to make that happen. I've added the XML file to the content project but how do I get it to create the correct XNB file?
Thanks!
Charlie.
This link may help. I found it useful to work the other way round to check that my XML data was correctly defined: instantiate your Dictionary class, set all the fields, then serialize it to XML using an XmlSerializer to check the output.
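As a rough sketch of that check (my own illustration; it assumes the letters/words fields are made public so XmlSerializer can see them, and you may need to swap HashSet<string> for a List<string> just for this check if XmlSerializer complains):
using System.Xml;
using System.Xml.Serialization;

var dictionary = new PG2.Dictionary();
// ... populate letters and words here ...

var settings = new XmlWriterSettings { Indent = true };
using (var writer = XmlWriter.Create("dictionary-check.xml", settings))
{
    // Serialize the instance so you can compare the generated XML shape
    // against the XML you intend to load through the content pipeline.
    new XmlSerializer(typeof(PG2.Dictionary)).Serialize(writer, dictionary);
}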
You need to implement a ContentTypeSerializer for your Dictionary class. Put this in a content extension library and add a reference to the content extension library to your content project. Put your Dictionary class into a game library that is referenced by both your game and the content extension project.
See:
http://blogs.msdn.com/b/shawnhar/archive/2008/08/26/customizing-intermediateserializer-part-2.aspx
Here is a quick ContentTypeSerializer I wrote that will deserialize your Dictionary class. It could use better error handling.
using System;
using System.Collections.Generic;
using System.Xml;
using Microsoft.Xna.Framework.Content.Pipeline.Serialization.Intermediate;
namespace PG2
{
[ContentTypeSerializer]
class DictionaryXmlSerializer : ContentTypeSerializer<Dictionary>
{
private void ReadToNextElement(XmlReader reader)
{
reader.Read();
while (reader.NodeType != System.Xml.XmlNodeType.Element)
{
if (!reader.Read())
{
return;
}
}
}
private void ReadToEndElement(XmlReader reader)
{
reader.Read();
while (reader.NodeType != System.Xml.XmlNodeType.EndElement)
{
reader.Read();
}
}
private int ReadAttributeInt(XmlReader reader, string attributeName)
{
reader.MoveToAttribute(attributeName);
return int.Parse(reader.Value);
}
protected override Dictionary Deserialize(IntermediateReader input, Microsoft.Xna.Framework.Content.ContentSerializerAttribute format, Dictionary existingInstance)
{
Dictionary dictionary = new Dictionary();
dictionary.letters = new Dictionary.Letters();
dictionary.letters.characters = new List<Dictionary.Letters.Character>();
dictionary.words = new Dictionary.Words();
dictionary.words.words = new HashSet<string>();
ReadToNextElement(input.Xml);
dictionary.letters.totalInstances = ReadAttributeInt(input.Xml, "TotalInstances");
ReadToNextElement(input.Xml);
while (input.Xml.Name == "Letter")
{
Dictionary.Letters.Character character = new Dictionary.Letters.Character();
character.count = ReadAttributeInt(input.Xml, "Count");
input.Xml.Read();
character.character = input.Xml.Value[0];
dictionary.letters.characters.Add(character);
ReadToNextElement(input.Xml);
}
dictionary.words.count = ReadAttributeInt(input.Xml, "Count");
for (int i = 0; i < dictionary.words.count; i++)
{
ReadToNextElement(input.Xml);
input.Xml.Read();
dictionary.words.words.Add(input.Xml.Value);
ReadToEndElement(input.Xml);
}
ReadToEndElement(input.Xml); // read to the end of words
ReadToEndElement(input.Xml); // read to the end of asset
return dictionary;
}
protected override void Serialize(IntermediateWriter output, Dictionary value, Microsoft.Xna.Framework.Content.ContentSerializerAttribute format)
{
throw new NotImplementedException();
}
}
}
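With the serializer in place and the XML file added to the content project, loading at runtime is then an ordinary content load; the asset name "Dictionary" below is hypothetical and should match your XML asset's name:
// In your Game subclass, e.g. in LoadContent():
PG2.Dictionary dictionary = Content.Load<PG2.Dictionary>("Dictionary");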

MVC 2, IModelBinder & ValueProvider changes

I'm trying to migrate to ASP.NET MVC 2 and I'm hitting some issues.
Here is one:
I needed to bind a Dictionary directly as the result of a view post.
In ASP.NET MVC 1 it worked perfectly using a custom IModelBinder:
/// <summary>
/// Bind Dictionary<int, int>
///
/// convention : <elm name="modelName_key" value="value"></elm>
/// </summary>
public class DictionaryModelBinder : IModelBinder
{
#region IModelBinder Members
/// <summary>
/// Mandatory
/// </summary>
public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
{
IDictionary<int, int> retour = new Dictionary<int, int>();
// get the values
var values = bindingContext.ValueProvider;
// get the model name
string modelname = bindingContext.ModelName + '_';
int skip = modelname.Length;
// loop on the keys
foreach(string keyStr in values.Keys)
{
// if an element has been identified
if(keyStr.StartsWith(modelname))
{
// get that key
int key;
if(Int32.TryParse(keyStr.Substring(skip), out key))
{
int value;
if(Int32.TryParse(values[keyStr].AttemptedValue, out value))
retour.Add(key, value);
}
}
}
return retour;
}
#endregion
}
It worked in tandem with a smart HtmlBuilder that displayed a dictionary of data.
The problem I have now is that ValueProvider is no longer a Dictionary<>; it's an IValueProvider that only allows getting values whose names are already known:
public interface IValueProvider
{
bool ContainsPrefix(string prefix);
ValueProviderResult GetValue(string key);
}
This is really not cool, as I can no longer perform my smart parsing...
Questions:
Is there another way to get all the keys?
Do you know another way to bind a collection of HTML elements to a Dictionary?
Thanks for your suggestions,
O.
Though this question has been marked 'answered' I think the following may be helpful.
I had the same problem and had a look at the source code of the System.Web.Mvc.DefaultValueProvider. It gets its values from the RouteData, the query string or from a request form submission (in that exact order). To collect all the keys (which is what you ask for in your first question) I wrote the following helper method.
private static IEnumerable<string> GetKeys(ControllerContext context)
{
    List<string> keys = new List<string>();
    HttpRequestBase request = context.HttpContext.Request;

    keys.AddRange(((IDictionary<string, object>)context.RouteData.Values).Keys.Cast<string>());
    keys.AddRange(request.QueryString.Keys.Cast<string>());
    keys.AddRange(request.Form.Keys.Cast<string>());

    return keys;
}
You can use this method to enumerate over the keys:
foreach (string key in GetKeys(controllerContext))
{
// Do something with the key value.
}
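To tie this back to the original binder, here is a sketch (my own, untested) of how BindModel might be adapted for MVC 2 using the GetKeys helper above, assuming the helper is accessible to the binder:
public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
{
    var retour = new Dictionary<int, int>();
    string prefix = bindingContext.ModelName + '_';

    foreach (string keyStr in GetKeys(controllerContext))
    {
        if (keyStr == null || !keyStr.StartsWith(prefix))
            continue;

        int key;
        if (!Int32.TryParse(keyStr.Substring(prefix.Length), out key))
            continue;

        // Look the value up through the MVC 2 IValueProvider by its full key.
        ValueProviderResult valueResult = bindingContext.ValueProvider.GetValue(keyStr);
        int value;
        if (valueResult != null && Int32.TryParse(valueResult.AttemptedValue, out value))
            retour.Add(key, value);
    }

    return retour;
}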
I don't think you'll be able to do it this way anymore in MVC 2.
Alternatively, you could extend DefaultModelBinder and override one of its virtual methods like GetModelProperties and then change the ModelName inside the ModelBindingContext. Another option would be to implement a custom MetadataProvider for your Dictionary type, you can change the model name there as well.