Alternative Approach for Updating Entities

Jul 19, 2009 at 12:57 PM

I was curious if anyone had thought of an alternative approach for updating entities?  The examples in the book and the source code show that updating the entities of the aggregate root involves deleting all of the entities and then inserting them again.  For reference, look at the CompanyRepository implementation.

While this approach does work, there are two problems with it.  First and foremost, the size of the IList<> holding a collection of entities could start impacting performance; the churn involved in removing and then re-inserting what is likely 99% the same set of entities seems wasteful.  Second, looking at it from a generalist's standpoint, it just feels awkward to not actually know what changes are pending.

I have been scratching my head over what approach I might take to work around this without violating the base principles of DDD.  All of the solutions that I have come up with effectively bypass accessing the items through the aggregate root in the first place, which is definitely not what I want to do.  I could, for example, put an "UpdateAddress" method in the repository, similar to what the InsertAddress() method is doing but of a more liberal scope, but then I lose the enforcement of having to access this through the Company itself.

Another approach that I have thought of would be to have a specific implementation of IEntity for all entities that are not the aggregate root, containing simple boolean properties called PendingDelete and PendingUpdate.  This way, as updates are applied to the Company, for example, I could enumerate the Address entities and execute any inserts, updates, and deletes as needed.  Once .Commit() is fired, they should all be applied in one pass.  You will notice that there is no "PendingInsert", because it is not needed: an insert can easily be determined by the lack of a key on the Address.  Unless someone else has an idea, this is probably the approach that I will take.  The only thing that I do not like is that deciding which action to take is manual and is not "detected" by the repository automatically.
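To make it concrete, here is a rough sketch of what I have in mind.  The ITrackedEntity interface and the Classify helper are just illustrative names I made up for this post, not types from the book's source code:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: state flags on non-root entities so the repository
// can issue targeted SQL instead of delete-all/insert-all.
public interface ITrackedEntity
{
    Guid? Key { get; set; }          // null => never persisted => insert
    bool PendingDelete { get; set; }
    bool PendingUpdate { get; set; }
}

public class Address : ITrackedEntity
{
    public Guid? Key { get; set; }
    public bool PendingDelete { get; set; }
    public bool PendingUpdate { get; set; }
    public string Street { get; set; }
}

public static class RepositorySketch
{
    // Partition the aggregate's children into the operations the
    // repository would queue up and run when .Commit() fires.
    public static (List<ITrackedEntity> Inserts,
                   List<ITrackedEntity> Updates,
                   List<ITrackedEntity> Deletes) Classify(IEnumerable<ITrackedEntity> children)
    {
        var inserts = new List<ITrackedEntity>();
        var updates = new List<ITrackedEntity>();
        var deletes = new List<ITrackedEntity>();

        foreach (var e in children)
        {
            if (e.PendingDelete)      deletes.Add(e);
            else if (e.Key == null)   inserts.Add(e); // no key yet, so it must be new
            else if (e.PendingUpdate) updates.Add(e);
            // key present and no flags set => unchanged, nothing to do
        }

        return (inserts, updates, deletes);
    }
}
```

The weak point, as I said, is that somebody still has to set PendingUpdate by hand when an Address changes; the repository only reads the flags, it does not detect anything.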

Any ideas or suggestions would be helpful!

p.s.  I love the book, Tim.  Don't take this as complaining - I am just trying to rationalize a more concise way of doing something.  ;-)

Oct 24, 2009 at 6:46 PM

What exactly do you mean by "waste"?  If you delete your value objects and insert them again on every update, you reuse the code you already need for inserts and deletes.  So you are not wasting code.

Could it be that you are concerned about execution speed?  Is this a real problem, or a potential one?  Do you really want to trade simple code for complex code unless you need to?


Jan 15, 2010 at 7:58 AM
Edited Jan 15, 2010 at 7:59 AM

I think that by "wasteful" I meant the additional overhead of deleting and re-inserting extant entities versus simply updating them in place.  I did not mean to imply anything negative. :-)

The problem could be two-fold, actually.  First, there is the potential for churn, as mentioned: frequently updated entities are continually being deleted and re-inserted.  In a scenario where only one property has changed, it is significantly more efficient to perform the smaller update.  Second, in a high-usage application there is an additional concern about network bandwidth, specifically for users distributed across widely dispersed geographic locations.  Granted, this is not an issue for everyone, but serializing large, complex entities for use through a WCF service could be a pain point.  Grabbing deltas for heavily edited data and passing those along instead would go a long way toward addressing that.

Since I originally wrote the first post in this thread, I have actually implemented something that meets my needs and does not add much complexity.  My EntityBase implements INotifyPropertyChanged and exposes a PropertyChangedEventHandler.  It is very similar in implementation to the following example:  http://monotorrent.blogspot.com/2009/12/yet-another-inotifypropertychanged-with_06.html  By doing this, I map the property declarations in my entities to a change notifier that maintains a list of change events (name of property, previous value, new value).

With that in place, I can easily identify what needs to be done in my repository base class.  If an entity has no identifier, it is an insert: the repository creates the identifier, assigns it to the entity, and queues it for insert.  If an entity has an identifier and one or more queued changes, it is an update.  Deletes still operate as they originally did.

The entities are only marginally more involved to set up now (nothing that could not be eased by a simple Visual Studio plugin).  In fact, it is not any more complex to implement, just slightly more typing.  Since I have to set up the change notifiers for each property, these take the place of the standard member declarations used for backing storage (and they do preclude using automatic property declarations).  For serialization purposes, the signatures of the properties remain unchanged.  It is in the constructor that I map the physical properties to the change notifier for tracking changes.
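A condensed sketch of the idea follows.  For brevity I am showing a SetProperty helper in the setter rather than the constructor-mapped lambda notifiers from the linked blog post, but the bookkeeping is the same; the member names (Key, ChangeLog, AcceptChanges, ClassifyOperation) are illustrative only:

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel;

// Base class that records every pending change as
// (property name, previous value, new value).
public abstract class EntityBase : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    public Guid? Key { get; set; }   // assigned by the repository on insert

    public List<(string Property, object OldValue, object NewValue)> ChangeLog { get; }
        = new List<(string, object, object)>();

    protected void SetProperty<T>(ref T backingField, T value, string propertyName)
    {
        if (EqualityComparer<T>.Default.Equals(backingField, value)) return;
        ChangeLog.Add((propertyName, backingField, value));
        backingField = value;
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(propertyName));
    }

    // Called by the repository after it has flushed the queued changes.
    public void AcceptChanges() => ChangeLog.Clear();
}

// Property signatures stay plain, so serialization is unaffected;
// only the backing storage routes through the notifier.
public class Address : EntityBase
{
    private string _street;
    public string Street
    {
        get => _street;
        set => SetProperty(ref _street, value, nameof(Street));
    }
}

// The repository-side decision described above.
public static class RepositoryBaseSketch
{
    public static string ClassifyOperation(EntityBase e) =>
        e.Key == null         ? "insert"   // no identifier yet
      : e.ChangeLog.Count > 0 ? "update"   // identifier plus queued changes
      :                         "none";    // unchanged
}
```

The nice part is that the insert/update determination is now automatic: the repository inspects the key and the change log instead of relying on flags someone remembered to set.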

My goal was definitely not to make the solution more complex than it needed to be, but to provide lighter-weight operations.  I think that I have accomplished this, and it meets my needs.  Personally, I have no problem adding complexity to the application architecture itself (i.e. the base classes) if the more visible code artifacts (i.e. the entities) do not need to be routinely modified.  The impact on daily operations is limited to adding a line with a lambda to map each property to a member declaration, which is a fair trade.