Add validation of domain objects #18
Comments
Hey, I watched your Golang UK talk on YouTube, it was great 👍 Validation is an interesting topic... I'm running into some issues applying DDD concepts to my own little Go DDD experiment, and validation is one of them. I'm curious if you have any thoughts about the whole "always-valid" domain entity debate. Personally I'm in the always-valid camp, however the more I try putting this into practice in Go, the more resistance I seem to run into. I initially had code very similar to what you have here, where domain entities are allowed to be in some invalid state and you have to run validation on them manually.

If we forsake the use of DTOs and unmarshal JSON directly into domain models, then we've already instantiated a domain object that fails to enforce its own invariants... which I think is very bad, but Go seems to encourage this. On the other hand, introducing DTOs will lead to a lot of code duplication and copying of DTO fields to and from domain objects, making the code very brittle and (for large objects) slow. It's an interesting discussion...
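A minimal sketch of the trade-off described above, using hypothetical type and field names: unmarshalling into a DTO first and funnelling it through a constructor lets the domain type enforce its own invariants, at the cost of the extra mapping hop.

```go
package booking

import (
	"encoding/json"
	"errors"
)

// cargoDTO is the transport-level shape; field names and tags are made up
// for illustration.
type cargoDTO struct {
	TrackingID  string `json:"tracking_id"`
	Destination string `json:"destination"`
}

// Cargo is the domain object; its fields stay unexported so an invalid
// instance can only come from within this package.
type Cargo struct {
	trackingID  string
	destination string
}

// NewCargo enforces the invariants at construction time.
func NewCargo(trackingID, destination string) (*Cargo, error) {
	if trackingID == "" {
		return nil, errors.New("tracking ID must not be empty")
	}
	if destination == "" {
		return nil, errors.New("destination must not be empty")
	}
	return &Cargo{trackingID: trackingID, destination: destination}, nil
}

// decodeCargo shows the extra hop: JSON -> DTO -> validated domain object.
func decodeCargo(data []byte) (*Cargo, error) {
	var dto cargoDTO
	if err := json.Unmarshal(data, &dto); err != nil {
		return nil, err
	}
	return NewCargo(dto.TrackingID, dto.Destination)
}
```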
Thank you for watching my talk, I'm glad you liked it! I would also put myself in the same camp as you. Ideally, domain objects should not be able to exist in a state where their invariants are violated. There's nothing in Go, to the best of my knowledge, that can completely prevent you from putting your domain object in an inconsistent state*. While you could go to great lengths to achieve this - building small packages with interfaces for your domain objects to limit access to internal state - it would most likely not be very idiomatic. For goddd, I've tried to be pragmatic rather than clinging to DDD ideals that wouldn't make sense, i.e. when in Rome, do as the Romans do.
Not sure I understand what you mean here. What validation logic are you referring to? I'm leaning more towards the …

It's indeed an interesting discussion, and I'm still not sure what the optimal solution would be. If you have any suggestions, I'd love to hear them.

* Really looking forward to seeing how dependently typed languages like Idris could let you catch violated invariants at compile time.
It’s a very good point. It’s always a struggle striking the right balance between pragmatism and idealism.
What I mean is checking that some data complies with some structural rules (e.g. min/max string length, UTF-8 encoding, alphanumeric only… all of the things which the …

Once again, this is from an idealist's perspective, but I think that in addition to domain validation, there needs to be validation logic embedded in other layers as well. Checking that incoming data is syntactically valid should be an application service concern and should be executed before enforcing invariants at the domain level. As another example, where would you validate a collection query with pagination parameters? It's most likely not an application service concern and most definitely not a domain concern, so you'd probably need presentation-level validation to take care of that, since pagination is really just another way of presenting domain data.

As for domain validation, the only problem I see with extracting all behavioural validation into a …

Idris looks fascinating, thanks for the link.
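A minimal sketch of that layering, with hypothetical names and rules: the structural checks sit at the application-service boundary, the pagination check at the presentation level, and neither involves the domain model.

```go
package booking

import (
	"errors"
	"unicode/utf8"
)

// BookCargoRequest carries raw, untrusted input into the application service.
type BookCargoRequest struct {
	Origin      string
	Destination string
}

// validateSyntax checks purely structural rules (encoding, length) before
// any domain invariant is consulted.
func validateSyntax(r BookCargoRequest) error {
	for _, loc := range []string{r.Origin, r.Destination} {
		if !utf8.ValidString(loc) {
			return errors.New("locations must be valid UTF-8")
		}
		if len(loc) != 5 {
			return errors.New("locations must be 5-character UN/LOCODEs")
		}
	}
	return nil
}

// page is a presentation-level concern: it shapes how domain data is shown,
// so its check lives outside both the application service and the domain.
type page struct {
	Number int
	Size   int
}

func (p page) validate() error {
	if p.Number < 1 || p.Size < 1 || p.Size > 100 {
		return errors.New("page number must be >= 1 and size between 1 and 100")
	}
	return nil
}
```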
Excellent observation that I don't think is communicated clearly enough in the current state of the project. Some syntactical validation is currently done in the application services, but there are few examples of domain validation. I think you might be right about …
Using interfaces for entities and VOs isn't necessary. Using getters and setters, you can achieve fairly robust encapsulation for your domain objects without straying too far from idiomatic Go. A simple VO might look like this:
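The original snippet isn't reproduced in this thread; a minimal sketch of such a value object, with hypothetical names, could look like this:

```go
package cargo

import "errors"

// TrackingID is a simple value object: the field is unexported, so a value
// can only be created through the constructor and read through the getter.
type TrackingID struct {
	value string
}

// NewTrackingID validates at construction time, so an invalid TrackingID
// never exists.
func NewTrackingID(v string) (TrackingID, error) {
	if v == "" {
		return TrackingID{}, errors.New("tracking ID must not be empty")
	}
	return TrackingID{value: v}, nil
}

// Value is the getter; there is deliberately no setter, which also
// communicates immutability.
func (id TrackingID) Value() string {
	return id.value
}
```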
Just for clarity, I'm not endorsing using interfaces for this purpose. I think using accessor methods is a good compromise that prevents other packages from mucking up your invariants. Still, it won't protect you from doing that from within your own package. One thing I like about it, though, is that it at least communicates immutability. I'd be open to reviewing a PR if anyone's interested in making the change.
This is a very interesting topic. I went the non-pragmatic way when I started the project: now I have a transport layer with gRPC DTOs, and my domain objects have all fields unexported, with exported constructors and getters/idiomatic setters containing business logic, which is really good. But the really, really painful part: a lot of DTOs with exported fields and bson tags to be able to store all this information in the database.

The domain layer works pretty well, but I have a lot of transformations between gRPC and MongoDB DTOs, and it's really, really painful. Tbh, I think it's better to export all fields on the domain objects, add the needed tags to them, and use the Validation method you are using. Much more pragmatic approach. The only problem is that anyone developing around your domain objects has to be warned to use the exported methods instead of accessing fields directly, and to run Validation manually. But I'm 100% sure it's much better than having 400 lines of DTOs and transformers. The transport layer DTOs are fine, but every time I'm writing a repository implementation I feel stupid, really, and I don't know how to manage this with unexported fields.
@hectorgimenez We used to have a lot of internal debate about this very topic, so I feel your pain. In our case, we decided to use unexported fields and fully encapsulate the domain model in its own package (infrastructure, app, and UI code in other packages), and accept the fact that we'd have to explicitly write code to map from our domain model to our persistence model and to view models. If your models (domain, persistence, view) are closely aligned, the mappings shouldn't be too painful. In our case, however, our domain model and persistence model don't align well, and the mapping code is significant. Tbh this is still a point of contention for us. Our domain model is complex, so having a correctly encapsulated model is worth the extra trouble. If your microservice is mainly CRUD and you have few or no domain invariants to enforce, I would definitely skip DDD and design your app as database-centric.
@eriklott thanks, at least I'm not alone on this 😂 The main problem is not the time investment of writing the code that maps the domain model to the persistence and transport models. For me the main problem is that each transformation is a big point of failure (a forgotten field, a wrongly mapped one...), so we have to test the transformations (or at least try to cover them in an acceptance test that passes through all the layers). And I'm not totally sure this extra complexity is worth it in any way.

But the most noticeable thing for me is that I feel like I'm fighting against the language; it seems I'm forcing it to do things it wasn't designed for, or at least it isn't giving me any help. We have around 20 services that are working fine and pretty fast, but everyone on the team feels the same, and it seems the Go community is focused on other kinds of things; it's difficult to find good enterprise software architectures in Go.
@hectorgimenez, 100%. This pretty much summarizes our experience as well.
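A minimal sketch of the mapping being discussed, with hypothetical names (in a real codebase the domain type would live in its own package, separate from the repository implementation):

```go
package persistence

import "errors"

// Cargo stands in for the encapsulated domain object: unexported fields,
// invariants enforced by the constructor.
type Cargo struct {
	trackingID  string
	destination string
}

func NewCargo(trackingID, destination string) (Cargo, error) {
	if trackingID == "" || destination == "" {
		return Cargo{}, errors.New("cargo fields must not be empty")
	}
	return Cargo{trackingID: trackingID, destination: destination}, nil
}

func (c Cargo) TrackingID() string  { return c.trackingID }
func (c Cargo) Destination() string { return c.destination }

// cargoRecord is the persistence model: exported fields with bson tags,
// kept next to the repository rather than in the domain.
type cargoRecord struct {
	TrackingID  string `bson:"tracking_id"`
	Destination string `bson:"destination"`
}

// toRecord and fromRecord are the explicit mapping code in question; they
// are tedious to write and easy to get subtly wrong, which is why they
// deserve their own tests.
func toRecord(c Cargo) cargoRecord {
	return cargoRecord{TrackingID: c.TrackingID(), Destination: c.Destination()}
}

func fromRecord(r cargoRecord) (Cargo, error) {
	return NewCargo(r.TrackingID, r.Destination)
}
```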
The current domain validation is a bit lacking. Let's discuss some alternatives in this issue:
The simplest way would be to go all stdlib and just extract the validation into its own function:
```go
func Validate(c Cargo) error
```
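A minimal sketch of what such a stdlib-only function could look like; the Cargo fields and rules shown here are hypothetical:

```go
package cargo

import "errors"

// Cargo is shown with exported fields purely for illustration.
type Cargo struct {
	TrackingID  string
	Origin      string
	Destination string
}

// Validate gathers the structural checks in one stdlib-only function, to be
// called explicitly after constructing or unmarshalling a Cargo.
func Validate(c Cargo) error {
	if c.TrackingID == "" {
		return errors.New("cargo: tracking ID must not be empty")
	}
	if c.Origin == "" || c.Destination == "" {
		return errors.New("cargo: origin and destination must not be empty")
	}
	if c.Origin == c.Destination {
		return errors.New("cargo: origin and destination must differ")
	}
	return nil
}
```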
It could be interesting to take a closer look at validator: https://github.com/go-playground/validator
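For comparison, a small sketch of how go-playground/validator can drive the same kind of structural checks from struct tags (the field names and rules here are made up, not taken from the project):

```go
package main

import (
	"fmt"

	"github.com/go-playground/validator/v10"
)

// The validate tags replace hand-written structural checks.
type Cargo struct {
	TrackingID  string `validate:"required,alphanum"`
	Origin      string `validate:"required,len=5"`
	Destination string `validate:"required,len=5,nefield=Origin"`
}

func main() {
	validate := validator.New()

	c := Cargo{TrackingID: "ABC123", Origin: "SESTO", Destination: "SESTO"}
	if err := validate.Struct(c); err != nil {
		// ValidationErrors lists every field that failed and the rule it broke.
		for _, fe := range err.(validator.ValidationErrors) {
			fmt.Printf("%s failed on the %q rule\n", fe.Field(), fe.Tag())
		}
	}
}
```

Note that this only covers syntactic rules; behavioural invariants would still need to live in the domain model itself.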