So I started to wonder if it was possible to create a class that implements ITableEntity and offers the dynamic features of an ExpandoObject. After a bit of hacking around in LinqPad I came up with this solution.
In this snippet I also implemented ICustomMemberProvider, which is part of the LinqPad extensions API for queries (more on this here). In Visual Studio we'll need to remove that code.
Please note that you need to use the dynamic keyword to be able to define properties dynamically. You can also use the entity indexer like I did with the LastName property.
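To make things concrete, here is a minimal sketch of the idea (not the exact LinqPad snippet, and with the ICustomMemberProvider part left out): a DynamicObject that also implements ITableEntity and backs every dynamic property with a dictionary. It assumes the Microsoft.WindowsAzure.Storage.Table client library of that era; the conversions rely on EntityProperty.PropertyAsObject and EntityProperty.CreateEntityPropertyFromObject, and the actual post may do them differently.

// A sketch of a dynamic table entity: dynamic members are stored in a dictionary
// and converted to/from EntityProperty values by the ITableEntity hooks.
using System;
using System.Collections.Generic;
using System.Dynamic;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public class ElasticTableEntity : DynamicObject, ITableEntity
{
    public ElasticTableEntity()
    {
        this.Properties = new Dictionary<string, object>();
    }

    public IDictionary<string, object> Properties { get; private set; }

    // ITableEntity members
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public DateTimeOffset Timestamp { get; set; }
    public string ETag { get; set; }

    // Indexer, so entity["LastName"] = "Laurin" also works
    public object this[string key]
    {
        get { return this.Properties[key]; }
        set { this.Properties[key] = value; }
    }

    // Dynamic member access reads and writes the dictionary
    public override bool TryGetMember(GetMemberBinder binder, out object result)
    {
        return this.Properties.TryGetValue(binder.Name, out result);
    }

    public override bool TrySetMember(SetMemberBinder binder, object value)
    {
        this[binder.Name] = value;
        return true;
    }

    // Hooks used by the storage client when reading and writing the table
    public void ReadEntity(IDictionary<string, EntityProperty> properties,
        OperationContext operationContext)
    {
        this.Properties = new Dictionary<string, object>();
        foreach (var pair in properties)
            this.Properties[pair.Key] = pair.Value.PropertyAsObject;
    }

    public IDictionary<string, EntityProperty> WriteEntity(OperationContext operationContext)
    {
        var result = new Dictionary<string, EntityProperty>();
        foreach (var pair in this.Properties)
            result[pair.Key] = EntityProperty.CreateEntityPropertyFromObject(pair.Value);
        return result;
    }
}

Inserting an entity then looks roughly like this (the table name and the values are only examples, chosen to match the dump below; the ticks subtraction is one way to get a reverse-chronological RowKey):

dynamic entity = new ElasticTableEntity();
entity.PartitionKey = "Partition123";
entity.RowKey = (DateTime.MaxValue.Ticks - DateTime.UtcNow.Ticks).ToString();
entity.FirstName = "Pascal";                    // defined at run time thanks to dynamic
entity.Number = 34;
entity.Bool = false;
entity.Date = new DateTime(1912, 3, 4, 0, 0, 0, DateTimeKind.Utc);
entity.TokenId = Guid.NewGuid();
entity["LastName"] = "Laurin";                  // or through the indexer

var table = CloudStorageAccount.DevelopmentStorageAccount
    .CreateCloudTableClient().GetTableReference("Contact");
table.CreateIfNotExists();
table.Execute(TableOperation.Insert((ITableEntity)entity));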
Result
List<ElasticTableEntity> (1 item)

    PartitionKey: Partition123
    RowKey:       2520391787589766073
    Timestamp:    2013-03-13 1:00:40 AM +00:00
    ETag:         W/"datetime'2013-03-13T01%3A00%3A40.619873Z'"
    FirstName:    Pascal
    Number:       34
    Bool:         False
    Date:         1912-03-04 12:00:00 AM +00:00
    TokenId:      50604c02-f01c-48fc-862e-7ea66153f434
    LastName:     Laurin
Result with projection
List<ElasticTableEntity> (1 item)

    PartitionKey: Partition123
    RowKey:       2520391787589766073
    Timestamp:    2013-03-13 1:00:40 AM +00:00
    ETag:         W/"datetime'2013-03-13T01%3A00%3A40.619873Z'"
    Date:         1912-03-04 12:00:00 AM +00:00
    FirstName:    Pascal
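The projection above comes from a query that asks only for the wanted columns. A rough sketch with the same storage client (table is the CloudTable from the insert example; the filter and column names are just the ones used here):

// Select only the Date and FirstName columns; the system properties come back anyway
var query = new TableQuery<ElasticTableEntity>()
    .Where(TableQuery.GenerateFilterCondition("PartitionKey",
        QueryComparisons.Equal, "Partition123"))
    .Select(new List<string> { "Date", "FirstName" });

var results = table.ExecuteQuery(query).ToList();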
The ElasticTableEntity allows us to define properties at run time, and they will be added to the table when inserting the entities. Tables in Azure Table Storage have a flexible schema, so we are free to store entities with different properties as long as we respect some limitations:
An entity can have no more than 252 custom properties (on top of PartitionKey, RowKey and Timestamp)
An Entity's data can be up to 1 MB in size
A property must be one of the following types: byte[], bool, DateTime, double, Guid, int, long or string
A property value can be up to 64 KB in size (for string and byte array)
A property name is case sensitive and can be no more than 255 characters in length
You can store just about any kind of data as long as it is one of the supported data types. You could also encode other kinds of data in a byte array or a string (like a JSON document). Just be careful to always stick to one data type per property (yes, we could store an int, a bool and a string in the same column using different entities!)
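For example, a small document could be serialized into a string property and parsed back when reading. A sketch assuming Json.NET is available; the Address property and its content are purely illustrative:

using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

dynamic entity = new ElasticTableEntity();
entity.PartitionKey = "Partition123";
entity.RowKey = Guid.NewGuid().ToString();

// Store a complex value as a JSON string in a single column
entity.Address = JsonConvert.SerializeObject(new { Street = "2 Main St.", City = "Montreal" });

// And read it back later
var address = JObject.Parse((string)entity.Address);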
14 comments:
Great work!
Great work!
Just to let you know, I've implemented this on an Azure solution I've created (internal data tracking, nothing glamorous), and it works great. Something I added to the getter & setter is a cleaninput(key) method to strip out bad characters - Azure doesn't like Spaces, etc in Table column names. Maybe not the best place for it, but I don't have the opportunity to police the input all the time.
Thanks for the code share, though - helped a lot!
-Dan
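Something like the sanitizer Dan describes could be as simple as this (purely illustrative; CleanInput is his name for it, and the allowed character set is an assumption), called from the indexer and from TrySetMember/TryGetMember before touching the dictionary:

using System.Text.RegularExpressions;

// Strip characters that are not safe in a table property name before using the key
private static string CleanInput(string key)
{
    return Regex.Replace(key, @"[^A-Za-z0-9_]", string.Empty);
}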
Any ideas on Serialization? I added this:
[KnownType(typeof(ElasticTableEntity))]
[DataContract]
public class Root
{
    [DataMember]
    public string Name { get; set; }

    [DataMember]
    public dynamic Properties { get; set; }
}

[Serializable]
[DataServiceKey("PartitionKey", "RowKey")]
public class ElasticTableEntity : DynamicObject, ITableEntity, ISerializable
{
    public static Boolean debug = false;

    public ElasticTableEntity()
    {
        this.Properties = new Dictionary<string, object>();
    }

    public IDictionary<string, object> Properties { get; private set; }

    ...

    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        foreach (var kvp in Properties)
        {
            info.AddValue(kvp.Key, kvp.Value);
        }
    }
}
But to no avail, still failing. Ideas?
@Dan are you trying to use ElasticTableEntity in a web service (or data service)? I usually don't re-use my *TableEntities for anything else than persistence in Table Storage. I never tried to serialize ElasticTableEntity, but I suggest you handle serialization manually with attributes like [NonSerialized] and explicit backing fields instead of auto-properties (especially for dictionaries, which don't serialize well on their own IIRC).
I fought it for a few hours last night - the regular XML serializer won't work with IDictionaries, then I went the [DataContract] route with WCF serialization, and that crashed and burned, so I gave up - it wasn't a requirement, but a "want" for my project. I was trying to serialize an object to put it on the Message Queue for easy handing off to a WorkerRole. Blech. :-) Basically, dynamic entities don't serialize, and I don't really need it to work.
I do, however, still like the concept for Table storage. It's nice. :-)
I am working on a NLog plugin for Azure Table Storage and was hoping to use your ElasticTableEntity. I'd add attribution at the top of the file for you. Let me know if you have any issues or want to handle things differently.
@Dan go ahead, I don't mind. Just mention this post in the code, especially if it's open source.
Very useful. Thanks man!
Removed the last post, clearly insufficient coffee and not enough sleep.
What I have been playing around with is extending this example to allow for it to support reading and writing simple POCO classes (with some additional mappings for things like enums). If anyone wants to assist, I can spin up a GitHub project and we can get cracking.
.//Adam
Adam, I just finished adding enum support (stored as string values in table storage), setting all ElasticTableEntity properties from the public properties of an object, and creating a new object by setting its public props from an ElasticTableEntity.
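For reference, that kind of POCO mapping can be sketched with reflection along these lines (illustrative only; the helper names are mine, and enums are stored as their string names as described above):

using System;

public static class ElasticTableEntityMapper
{
    // Copy an object's public properties into an ElasticTableEntity,
    // converting enum values to strings.
    public static ElasticTableEntity ToEntity(object source, string partitionKey, string rowKey)
    {
        var entity = new ElasticTableEntity { PartitionKey = partitionKey, RowKey = rowKey };
        foreach (var prop in source.GetType().GetProperties())
        {
            var value = prop.GetValue(source, null);
            entity[prop.Name] = prop.PropertyType.IsEnum && value != null
                ? value.ToString() : value;
        }
        return entity;
    }

    // Rebuild a POCO from an ElasticTableEntity, parsing enums back from strings.
    public static T FromEntity<T>(ElasticTableEntity entity) where T : new()
    {
        var result = new T();
        foreach (var prop in typeof(T).GetProperties())
        {
            object value;
            if (!entity.Properties.TryGetValue(prop.Name, out value) || value == null)
                continue;
            if (prop.PropertyType.IsEnum)
                value = Enum.Parse(prop.PropertyType, (string)value);
            prop.SetValue(result, value, null);
        }
        return result;
    }
}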
Did you ever create that github?
Azure Storage SDK > v8.0.0 solved the problem for writing complex objects into Table Storage. Please have a look at:
https://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storage.table.tableentity.flatten.aspx
https://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storage.table.tableentity.convertback.aspx
And the article I wrote about this before I checked the APIs into the SDK:
https://doguarslan.wordpress.com/2016/02/03/writing-complex-objects-to-azure-table-storage/
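Based on those pages, usage looks roughly like this (myComplexObject, MyComplexType and the table variable are placeholders; requires Microsoft.WindowsAzure.Storage and Microsoft.WindowsAzure.Storage.Table):

// Flatten a complex object into flat EntityProperty values and store it
var flat = TableEntity.Flatten(myComplexObject, new OperationContext());
var row = new DynamicTableEntity("Partition123", "Row1") { Properties = flat };
table.Execute(TableOperation.InsertOrReplace(row));

// Read it back and rebuild the original object graph
var stored = (DynamicTableEntity)table.Execute(
    TableOperation.Retrieve("Partition123", "Row1")).Result;
var restored = TableEntity.ConvertBack<MyComplexType>(stored.Properties, new OperationContext());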