Rationally splitting the Entity structure into multiple tables

I am using Entity Framework 6 Code First to store a large POCO model in a database. The model has 1000 properties (don't ask lol **), which means it needs to be split into multiple SQL tables (column limit is 1024). I know this is usually done by specifying individual columns like this:

modelBuilder.Entity<HugeEntity>()
    .Map(m =>
    {
        m.Properties(e => new { e.Prop1, e.Prop2 });
        m.ToTable("HugeEntity1");
    })
    .Map(m =>
    {
        m.Properties(e => new { e.Prop3, e.Prop4 });
        m.ToTable("HugeEntity2");
    });


I'm wondering if there is a way to do this without specifying the properties individually. Ideally, it would automatically split the entity based on a given column limit (e.g. 1000).

Even if there is no standard way, what is the easiest hack to get this working? The properties on the model are subject to change, so I would really rather not list them exhaustively in several places.

Any advice is appreciated!

** CONTEXT: It is a domain model representing user-entered data captured on a single web page. It is also exposed through Web API. My team considered a key/value pair approach, but decided it would make it harder for future BI applications to consume the data through the Web API.



2 answers


Figured out a way to do it. I had to use LINQ expressions and the "dynamic" keyword:



    private static void SplitIntoTables<T>(DbModelBuilder modelBuilder, IReadOnlyCollection<PropertyInfo> properties, int columnLimit) where T : class
    {
        // Leave headroom in each table: the hash-based grouping below does not
        // produce perfectly even buckets, so aim well under the hard limit.
        var numberOfTables = (int)Math.Ceiling((properties.Count + (double)columnLimit / 2) / columnLimit);
        var paramExp = Expression.Parameter(typeof(T));

        var tableIndex = 0;
        // GetHashCode can be negative, so mask off the sign bit to keep the
        // group key in the range [0, numberOfTables).
        foreach (var tableGroup in properties.GroupBy(p => (p.Name.GetHashCode() & 0x7FFFFFFF) % numberOfTables))
        {
            // Build a strongly typed property selector (Func<T, TProperty>) for each property.
            var expressions = tableGroup.Select(p => Expression.Lambda(
                typeof(Func<,>).MakeGenericType(typeof(T), p.PropertyType),
                Expression.Property(paramExp, p), paramExp)).ToList();

            var tableName = $"{typeof(T).Name}_{++tableIndex}";
            modelBuilder.Entity<T>().Map(m =>
            {
                foreach (var exp in expressions)
                {
                    // "dynamic" dispatch picks the correct generic Property() overload at runtime.
                    m.Property((dynamic)exp);
                }
                m.ToTable(tableName);
            });
        }
    }
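For reference, a call site might look like the following sketch. The context name is a placeholder, and the key property is assumed to be named "Id"; the key is excluded from the list because EF maps it into every split table automatically:

    using System.Data.Entity;
    using System.Linq;
    using System.Reflection;

    public class AppContext : DbContext
    {
        public DbSet<HugeEntity> HugeEntities { get; set; }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            // Take the property list from the CLR type itself, so nothing needs
            // to be listed by hand when the model changes.
            var properties = typeof(HugeEntity)
                .GetProperties(BindingFlags.Public | BindingFlags.Instance)
                .Where(p => p.Name != "Id")
                .ToList();

            SplitIntoTables<HugeEntity>(modelBuilder, properties, columnLimit: 1000);
        }
    }

Note that because the grouping is keyed on property-name hashes, adding or removing a property can shuffle other properties between tables, so this approach assumes the schema is regenerated (or migrated) when the model changes.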

      



You say "don't ask", but your biggest problem is that you are saying that to yourself. If your model grows past 50 fields, you need to ask what is going on. It may be worth taking a breath and revisiting some of the more abstract concepts in computing. I would start with Database Normalization; 1,000 dynamic properties tell me you desperately need it.

And by the way, the concepts of database normalization are not specific to SQL databases per se. You should normalize your POCO models as much as you can. Of course, there are some non-relational concepts in OO languages, but there is no excuse for the extremes you describe.
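To make that concrete, here is a hypothetical sketch (all type and property names invented for illustration) of how a wide, form-style entity tends to decompose once cohesive groups of fields are pulled into their own tables:

    // Instead of one 1,000-column entity, cohesive groups of fields become
    // related entities with their own tables and foreign keys.
    public class Applicant
    {
        public int Id { get; set; }
        public string FullName { get; set; }
        public virtual Address HomeAddress { get; set; }
        public virtual ICollection<EmploymentRecord> EmploymentHistory { get; set; }
    }

    public class Address
    {
        public int Id { get; set; }
        public string Street { get; set; }
        public string City { get; set; }
    }

    public class EmploymentRecord
    {
        public int Id { get; set; }
        public int ApplicantId { get; set; }
        public string Employer { get; set; }
        public int YearsEmployed { get; set; }
    }

Repeating groups (like employment history) become one row per item rather than one column per slot, which is usually where most of the column count goes.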



Domain-Driven Design could be another paradigm to look at. I am less fluent in it, so I am not saying you should start there; but from what I have done with it in practice, I would say the learning curve is worth it.

I should be careful not to condescend: I certainly don't have all of my tables in their highest normal forms. But I will tell you that in the areas where I don't, the headaches are worse.


