feat: add DataAddressValidatorRegistry #3570
Conversation
Force-pushed from b3586e3 to 019ed67
Before going further, let's discuss the comment inline
import static org.eclipse.edc.spi.dataaddress.KafkaDataAddressSchema.TOPIC;
import static org.eclipse.edc.validator.spi.Violation.violation;

public class KafkaDataAddressValidator implements Validator<DataAddress> {
I don't think Kafka-specific classes should go into a core module. Can we either move this out or make it a generic messaging validator that can be configured? For example, I don't think this class is necessarily specific to Kafka since it could be used with any messaging platform that supports topics (pub/sub) or queues.
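As a sketch of what such a generic, non-Kafka-specific messaging validator could look like: the stand-in types below (`ValidationResult`, `Validator`, `DataAddress`, and the `MessagingDataAddressValidator` name) are simplified hypotheticals for illustration, not the project's actual SPI.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Minimal stand-ins for the EDC SPI types (hypothetical, for illustration only).
record ValidationResult(List<String> violations) {
    boolean succeeded() { return violations.isEmpty(); }
}

interface Validator<T> {
    ValidationResult validate(T input);
}

// A DataAddress reduced to its property map for this sketch.
record DataAddress(Map<String, String> properties) {
    String getStringProperty(String key) { return properties.get(key); }
}

// Generic messaging validator: it checks a configurable list of required
// properties (e.g. a topic/queue name and a broker endpoint), so it is not
// tied to Kafka specifically and works for any pub/sub or queueing platform.
class MessagingDataAddressValidator implements Validator<DataAddress> {
    private final List<String> requiredProperties;

    MessagingDataAddressValidator(List<String> requiredProperties) {
        this.requiredProperties = requiredProperties;
    }

    @Override
    public ValidationResult validate(DataAddress address) {
        var violations = new ArrayList<String>();
        for (var key : requiredProperties) {
            var value = address.getStringProperty(key);
            if (value == null || value.isBlank()) {
                violations.add("'%s' is a mandatory attribute".formatted(key));
            }
        }
        return new ValidationResult(violations);
    }
}
```

A Kafka flavor would then just be a particular configuration of the required-property list rather than a dedicated class in a core module.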
I can agree with this; I wanted to understand how deep to go with the refactoring.
Likely, every data type should have its own `Schema` interface (for now they are both in the `core-spi` modules) and a `Validator` implementation (currently I put them in the `validator-core` module).
Splitting this vertically, we could end up having, for every `DataAddress` type, an `spi` and a `validator` extension, like:
- `dataaddress-httpdata-spi`
- `validator-dataaddress-httpdata`
and
- `dataaddress-kafka-spi`
- `validator-dataaddress-kafka`
with the `spi` containing only the `Schema` interface and the `validator` containing the `Validator` implementation and the registration extension.
wdyt @jimmarino ?
That works for me. One really minor nit would be to use "data-address" as opposed to "dataaddress" since it is easier to read, but that is only a minor suggestion.
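Under that split (modulo the "data-address" spelling nit), the resulting module tree would look roughly like this; the paths are illustrative, not the repository's actual layout:

```
dataaddress-httpdata-spi/          # HttpDataAddressSchema interface only
validator-dataaddress-httpdata/    # Validator implementation + registration extension
dataaddress-kafka-spi/             # KafkaDataAddressSchema interface only
validator-dataaddress-kafka/       # Validator implementation + registration extension
```

This keeps transport-specific code out of the core modules: consumers that only need the schema constants depend on the small `spi` module, while the `validator` module wires the implementation into the registry.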
Force-pushed from 019ed67 to ff2639d
Codecov Report
Attention: ❗ Your organization needs to install the Codecov GitHub app to enable full functionality.
Additional details and impacted files:
@@ Coverage Diff @@
## main #3570 +/- ##
==========================================
- Coverage 72.35% 72.19% -0.17%
==========================================
Files 889 889
Lines 17781 17814 +33
Branches 1016 1015 -1
==========================================
- Hits 12866 12860 -6
- Misses 4479 4516 +37
- Partials 436 438 +2
☔ View full report in Codecov by Sentry.
Force-pushed from ff2639d to fe71c0d
Two small doc requests, if you have time to put them in before merging:
...fka/src/main/java/org/eclipse/edc/validator/dataaddress/kafka/KafkaDataAddressValidator.java
...p-data-spi/src/main/java/org/eclipse/edc/dataaddress/httpdata/spi/HttpDataAddressSchema.java
LGTM!
Force-pushed from fe71c0d to 46474a1
What this PR changes/adds
This PR seems larger than it is, just because of some file movements and renaming.
Implements `DataAddressValidatorRegistry`, which is used by the aggregated services to ensure that source/destination `DataAddress`es are valid.

Why it does that
Assets (provider side) or TransferProcesses (consumer side) should not be created if their addresses are not valid.

Further notes
- … `warning` message, because validation definitely needs to be in place but it should not block already working flows.
- … `HttpData`, `HttpProxy` and `Kafka` …
- `DataAddressValidatorRegistry` as a collaborator of the `DataPlaneManager`, so every request would be validated with the same validator (leaving only custom validations to the sink/source factories' `validate` method).
- `HttpDataAddress` has been moved to `data-plane-http-spi`, as it is a type used specifically in the data-plane, with no real use in the control plane.

Linked Issue(s)
Closes #3557
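For readers unfamiliar with the registry approach, the core idea can be sketched as follows. This is a simplified stand-in, not the actual EDC implementation: the class name, the `registerSourceValidator`/`validateSource` method names, and the use of plain functions over property maps are all assumptions made for illustration.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Sketch: a registry that dispatches DataAddress validation by address type.
// Validators here are plain functions from a property map to a list of
// violation messages (empty list == valid).
class DataAddressValidatorRegistrySketch {
    private final Map<String, Function<Map<String, String>, List<String>>> sourceValidators = new ConcurrentHashMap<>();
    private final Map<String, Function<Map<String, String>, List<String>>> destinationValidators = new ConcurrentHashMap<>();

    void registerSourceValidator(String type, Function<Map<String, String>, List<String>> validator) {
        sourceValidators.put(type, validator);
    }

    void registerDestinationValidator(String type, Function<Map<String, String>, List<String>> validator) {
        destinationValidators.put(type, validator);
    }

    // Unknown types produce no violations here, in the spirit of the
    // "warn, don't block already working flows" note above.
    List<String> validateSource(String type, Map<String, String> properties) {
        return sourceValidators.getOrDefault(type, p -> List.of()).apply(properties);
    }

    List<String> validateDestination(String type, Map<String, String> properties) {
        return destinationValidators.getOrDefault(type, p -> List.of()).apply(properties);
    }
}
```

With a single registry injected as a collaborator (e.g. into the `DataPlaneManager`), every request goes through the same validation path, and per-type extensions only contribute their validator at registration time.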