Change subclass determiner to assignability determiner #7246
Conversation

Commits `4204af0` to `f6c5db6` compared
📝 Walkthrough

The pull request introduces significant changes to the type handling mechanisms within the Nussknacker codebase. A new object, `AssignabilityDeterminer`, replaces the former `CanBeSubclassDeterminer`, and several existing methods across various files were updated accordingly.
Actionable comments posted: 6
🧹 Outside diff range and nitpick comments (40)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/TypeConversionHandler.scala (1)

25-26: Correct grammatical error in the code comment for clarity. "Given to this we cannot be sure..." should read "Given this, we cannot be sure..." for better clarity.

Apply this diff to correct the comment:

- * Given to this we cannot be sure if conversion is safe or not based on type (without scale knowledge).
+ * Given this, we cannot be sure if conversion is safe or not based on type (without scale knowledge).

components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala (5)
39: Typo in method name `isNullAsignableTo`. The method name `isNullAsignableTo` appears to have a typographical error. It should be `isNullAssignableTo` to match standard spelling.

Apply this diff to correct the call site:

- case (TypedNull, other) => isNullAsignableTo(other)
+ case (TypedNull, other) => isNullAssignableTo(other)

And update the method definition:

- private def isNullAsignableTo(to: TypingResult): ValidatedNel[String, Unit] = to match {
+ private def isNullAssignableTo(to: TypingResult): ValidatedNel[String, Unit] = to match {
53-54: Address the TODO regarding `TypedNull` subclassing. There is a TODO comment indicating that `TypedNull` should not be a subclass of a typed map with all values assigned:

// TODO: Null should not be subclass of typed map that has all values assigned.

Consider implementing the logic to enforce this constraint to enhance type safety and prevent potential runtime errors.

Would you like assistance in implementing this functionality or creating a GitHub issue to track this task?
135: Ensure correct combination of validations using `andThen` and `combine`. In combining validations within `isSingleAssignableToSingle`, the use of `andThen` and `combine` should be carefully evaluated:

objTypeRestriction andThen (typedObjectRestrictions combine dictRestriction combine taggedValueRestriction combine dataValueRestriction)

Ensure that this combination correctly accumulates validation errors and that the sequencing aligns with the intended logic. If all validations are independent, you might consider using `product` or reordering to make the flow clearer.
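The behavioural difference between the two operators can be shown outside the codebase. The sketch below models cats' `ValidatedNel[String, Unit]` with a plain `Either[List[String], Unit]`; the `combine` and `andThen` helpers mirror the semantics of the cats operators and everything here is illustrative, not Nussknacker code:

```scala
// Minimal stand-in for cats' ValidatedNel[String, Unit]
type V = Either[List[String], Unit]
def valid: V = Right(())
def invalid(msg: String): V = Left(List(msg))

// combine: evaluate both sides and accumulate all errors
def combine(a: V, b: V): V = (a, b) match {
  case (Right(_), Right(_)) => Right(())
  case (Left(e1), Left(e2)) => Left(e1 ++ e2)
  case (Left(e), _)         => Left(e)
  case (_, Left(e))         => Left(e)
}

// andThen: short-circuit; the second check runs only if the first succeeded
def andThen(a: V, b: => V): V = a.flatMap(_ => b)

// combine reports both failures; andThen reports only the first
assert(combine(invalid("bad object type"), invalid("bad dict")) == Left(List("bad object type", "bad dict")))
assert(andThen(invalid("bad object type"), invalid("bad dict")) == Left(List("bad object type")))
assert(andThen(valid, invalid("bad dict")) == Left(List("bad dict")))
```

So in the reviewed expression, errors from the checks chained with `combine` accumulate, while a failure in the leading `objTypeRestriction` suppresses all of them.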
217-224: Add documentation to the `ConversionChecker` trait. The `ConversionChecker` trait plays a key role in type conversion logic. Adding ScalaDoc comments to the trait and its methods would improve code readability and help others understand its purpose and usage.

For example:

/**
 * Trait for checking if a type can be converted to another type according to specific conversion rules.
 */
private sealed trait ConversionChecker {

  /**
   * Determines if a type can be converted to another type.
   *
   * @param from The source type.
   * @param to The target `TypedClass`.
   * @return Validation result.
   */
  def isConvertable(
      from: SingleTypingResult,
      to: TypedClass
  ): ValidatedNel[String, Unit]
}
151-176: Simplify the logic in the `typeParametersMatches` method. The `typeParametersMatches` method has complex pattern matching and logic for different collection types. Refactoring this method or adding inline comments would enhance readability and maintainability.

Consider breaking down the method into smaller helper functions for each case (e.g., handling lists, maps, arrays) or adding comments explaining the purpose of each block.
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala (2)

31-32: Clarify the example in the documentation comment. The example "e.g. long to int" may cause confusion since converting from a larger type to a smaller one (long to int) can lead to data loss. Consider using an example like "e.g. int to long" or "double to int" to better illustrate conversions with possible loss of precision.

472-474: Simplify the use of `Option` if `typingResult` is non-nullable. If `typingResult` is guaranteed to be non-null, wrapping it with `Option` is unnecessary. You can simplify the code by replacing `Option(typingResult)` with `Some(typingResult)`.

Apply this diff to simplify the code:

- Option(typingResult)
+ Some(typingResult)
    .filter(_.canBeConvertedTo(Typed.fromDetailedType[T]))
    .map(new TypingResultTypedValue(_))

components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/NumberTypeUtils.scala (1)
18-21: Consider documenting the type conversion behavior. The fallback logic for handling union types and type conversions is non-trivial. Consider adding comments to explain:

- When Integer conversion is preferred over Double
- The implications of using Double as a fallback for Number types
- Potential precision considerations

  else if (typ == Typed[java.math.BigDecimal]) java.math.BigDecimal.ZERO
+ // Handle union types that can be safely converted to Integer
  else if (typ.canBeConvertedTo(Typed[java.lang.Integer])) java.lang.Integer.valueOf(0)
+ // Use Double as a safe fallback for all Number types, noting potential precision loss
  else if (typ.canBeConvertedTo(Typed[Number])) java.lang.Double.valueOf(0)
  else throw new IllegalArgumentException(s"Not expected type: ${typ.display}, should be Number")

engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/sink/TableTypeOutputValidator.scala (1)
Line range hint 23-25: Consider documenting the error message format decision. The comment suggests uncertainty about using `LogicalType.toString`. Consider:

- Documenting the rationale for using `expectedTypingResult.display`
- Adding a comparison of both formats to help future maintainers

Would you like me to help create documentation that explains the differences between these formats and the reasoning behind the current choice?
engine/flink/components/base/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformer.scala (1)

48-50: LGTM! Consider adding a code comment for clarity. The change from subclass checking to conversion checking aligns well with the PR objectives and should help prevent incorrect type returns. The additional `params.nonEmpty` check provides extra type safety.

Consider adding a brief comment explaining the type checking logic:

  private def returnType(elements: LazyParameter[util.Collection[AnyRef]]): typing.TypingResult =
    elements.returnType match {
      case tc: SingleTypingResult
+         // Ensure we have a Collection type with defined element type parameter
          if tc.runtimeObjType.canBeConvertedTo(
            Typed[util.Collection[_]]
          ) && tc.runtimeObjType.params.nonEmpty =>
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala (2)

13-35: Consider adding more widening conversion test cases. The existing test cases are well-structured and cover basic scenarios. Consider enhancing coverage with:

- Additional primitive type conversions (e.g., Byte → Short → Int)
- Nullable type conversions
- Union type conversions

Example additions:

  val wideningConversionCases = Table(
    ("sourceType", "targetType", "expectedStrict", "expectedLoose"),
    (Typed[Int], Typed[Int], Valid(()), Valid(())),
    (Typed[Int], Typed[Double], Valid(()), Valid(())),
+   (Typed[Byte], Typed[Short], Valid(()), Valid(())),
+   (Typed[Option[Int]], Typed[Option[Double]], Valid(()), Valid(())),
+   (Typed[Either[String, Int]], Typed[Either[String, Double]], Valid(()), Valid(())),
    // ... existing cases ...
  )
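The widening/narrowing distinction behind the strict-versus-loose columns can be checked directly on JVM primitives. This is a standalone illustration of the underlying JVM behaviour, not a statement about the `Typed` API itself:

```scala
// Widening Int -> Long is lossless for every value
val i: Int = Int.MaxValue
assert(i.toLong.toInt == i)

// Narrowing Long -> Int silently wraps around
val big: Long = Int.MaxValue.toLong + 1
assert(big.toInt == Int.MinValue)
```

This is why a strict checker can accept Int → Long unconditionally but must reject Long → Int.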
45-65: Enhance the test implementation for better error reporting and readability. The test implementation could be improved in several ways:

- Make test names more descriptive of the scenarios being tested
- Add more specific assertions for error messages
- Use more explicit success/failure conditions

Example improvements:

- test("isAssignableStrict should fail for narrowing numerical cases") {
+ test("isAssignableStrict should fail with appropriate error messages for narrowing numerical conversions") {
    forAll(narrowingConversionCases) { (sourceType, targetType, expectedStrict, _) =>
      val result = AssignabilityDeterminer.isAssignableStrict(sourceType, targetType)
      result match {
-       case Valid(_) if expectedStrict.isValid => succeed
-       case Invalid(_) if expectedStrict.isInvalid => succeed
+       case Valid(_) if expectedStrict.isValid =>
+         succeed
+       case Invalid(errors) if expectedStrict.isInvalid =>
+         errors.head should not be empty
+         errors.head should include("Cannot strictly convert")
        case _ => fail(s"Unexpected result: $result for types $sourceType -> $targetType")
      }
    }
  }

components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultErrorMessagesSpec.scala (2)
17-17: Consider removing the redundant import. The import alias `isAssignable` appears redundant since `isAssignableLoose` is already specifically imported. Having both imports might cause confusion about which method is being used in the tests.

-import AssignabilityDeterminer.isAssignable
Line range hint 21-80: LGTM! Comprehensive test coverage for type assignability. The test cases effectively validate the new assignability logic across various scenarios:

- Record field type compatibility
- Nested type conversions
- Collection type compatibility
- Tagged type handling
- Value-specific type checks
- Null type scenarios

The error messages are clear and descriptive, which will help developers identify type-related issues quickly.

Consider adding test cases for:

- Generic type variance (covariance/contravariance)
- Type parameter bounds
- Complex type hierarchies

These would further strengthen the type system's robustness.

designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala (1)
59-67: Consider documenting type conversion rules and the migration path. The introduction of strict type conversion is a significant architectural change that may affect existing dictionary definitions. Consider:
- Documenting the new type conversion rules for maintainers
- Providing migration guidelines for existing dictionary definitions
- Monitoring performance impact of stricter type checking in production
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala (2)
Line range hint 15-19: Update error messages and class names to reflect the new assignability semantics. While the change from `canBeSubclassOf` to `canBeConvertedTo` aligns with the PR objectives, the error messages and class names still reference "subclass" terminology. Consider updating:

- The object name from `MethodTypeInfoSubclassChecker` to `MethodTypeInfoAssignabilityChecker`
- Error case classes like `NotSubclassVarArgument` to `InvalidVarArgAssignment`
- Error messages to use "assignable to" instead of "subclass of"

-object MethodTypeInfoSubclassChecker {
+object MethodTypeInfoAssignabilityChecker {
Line range hint 1-91: Consider complete terminology alignment in a follow-up PR. While the functional changes from subclass checking to assignability checking are well implemented, there is an architectural inconsistency in the codebase terminology. Consider creating a follow-up PR to:

- Rename the file to `MethodTypeInfoAssignabilityChecker.scala`
- Update all error case classes to use assignability terminology
- Update test files and documentation to reflect the new terminology
- Add migration notes for any public API changes

This would ensure consistent terminology throughout the codebase and improve maintainability.
designer/server/src/main/scala/pl/touk/nussknacker/ui/api/description/DictApiEndpoints.scala (2)
98-100: LGTM: Clean schema definitions. The schema definitions are well-structured and use appropriate derivation. Consider adding ScalaDoc comments to document the expected JSON structure, especially for `dictListRequestDtoSchema` since it handles raw JSON input.

+ /** Schema for DictListRequestDto
+  * Expected JSON structure:
+  * {
+  *   "expectedType": <json representation of type>
+  * }
+  */
  implicit lazy val dictListRequestDtoSchema: Schema[DictListRequestDto] = Schema.derived[DictListRequestDto]

94-95: Consider addressing the TODO comment. The TODO comment indicates that dictionary labels currently equal their IDs. Since we're making changes to the dictionary system, this might be a good opportunity to introduce proper labeling.

Would you like me to help design a more robust labeling system or create a GitHub issue to track this enhancement?
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodDefinition.scala (1)

Line range hint 99-107: Consider using a safer error handling approach. While the type checking logic is correct, throwing an `AssertionError` in production code might be risky. Consider handling this case through the existing validation mechanism.

Here's a suggested approach:

- if (!typesFromStaticMethodInfo.exists(calculated.canBeConvertedTo)) {
-   val expectedTypesString = typesFromStaticMethodInfo.map(_.display).mkString("(", ", ", ")")
-   val argumentsString = arguments.map(_.display).mkString("(", ", ", ")")
-   throw new AssertionError(
-     s"Generic function $name returned type ${calculated.display} that does not match any of declared types $expectedTypesString when called with arguments $argumentsString"
-   )
- }
+ if (!typesFromStaticMethodInfo.exists(calculated.canBeConvertedTo)) {
+   val expectedTypesString = typesFromStaticMethodInfo.map(_.display).mkString("(", ", ", ")")
+   val argumentsString = arguments.map(_.display).mkString("(", ", ", ")")
+   return GenericFunctionTypingError.ReturnedTypeError(
+     s"Generic function $name returned type ${calculated.display} that does not match any of declared types $expectedTypesString when called with arguments $argumentsString"
+   ).invalidNel
+ }

This approach:

- Uses the existing error handling mechanism
- Maintains the detailed error message
- Avoids runtime assertions in production code

scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/language/dictWithLabel/DictKeyWithLabelExpressionParser.scala (2)
Line range hint 95-97: Update the error message to reflect the new assignability approach. The error message still refers to "subclass", which is inconsistent with the new assignability-based approach.

- s"DictKeyExpression expected type: ${expectedType.display} is unsupported. It must be a subclass of Long, Boolean or String"
+ s"DictKeyExpression expected type: ${expectedType.display} is unsupported. It must be assignable to Long, Boolean or String"

Line range hint 88-97: Consider enhancing error handling and type conversion robustness. The current implementation could benefit from more robust error handling:

- The type conversions using `toLong` and `toBoolean` could throw exceptions
- The error states are not clearly differentiated between type compatibility and conversion failures

Consider this more robust implementation:

  override def evaluate[T](ctx: Context, globals: Map[String, Any]): T = {
-   if (expectedType.canBeConvertedTo(Typed[Long])) {
-     key.toLong.asInstanceOf[T]
-   } else if (expectedType.canBeConvertedTo(Typed[Boolean])) {
-     key.toBoolean.asInstanceOf[T]
-   } else if (expectedType.canBeConvertedTo(Typed[String])) {
-     key.asInstanceOf[T]
-   } else {
-     throw new IllegalStateException(
-       s"DictKeyExpression expected type: ${expectedType.display} is unsupported. It must be a subclass of Long, Boolean or String"
-     )
+   Try {
+     if (expectedType.canBeConvertedTo(Typed[Long])) {
+       key.toLong.asInstanceOf[T]
+     } else if (expectedType.canBeConvertedTo(Typed[Boolean])) {
+       key.toBoolean.asInstanceOf[T]
+     } else if (expectedType.canBeConvertedTo(Typed[String])) {
+       key.asInstanceOf[T]
+     } else {
+       throw new IllegalArgumentException(
+         s"DictKeyExpression expected type: ${expectedType.display} is unsupported. It must be assignable to Long, Boolean or String"
+       )
+     }
+   }.getOrElse(
+     throw new IllegalArgumentException(
+       s"Cannot convert value '$key' to expected type: ${expectedType.display}"
+     )
+   )
  }

components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypedFromInstanceTest.scala (1)
85-85: Consider renaming the helper method for clarity. The method name `checkNotASubclassOfOtherParamTypingResult` still references the old subclass terminology. Consider renaming it to better reflect its new purpose, such as `checkNotConvertibleToOtherParamTypingResult`.

- def checkNotASubclassOfOtherParamTypingResult(obj: Any, otherParamTypingResult: TypingResult): Unit = {
+ def checkNotConvertibleToOtherParamTypingResult(obj: Any, otherParamTypingResult: TypingResult): Unit = {

engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/DocumentationFunctions.scala (1)
65-66: LGTM! Consider enhancing the error message. The change from subclass checking to convertibility checking aligns well with the PR objectives and provides more accurate type compatibility verification.

Consider making the error message more specific about number type conversion:

- case (l, r) if List(l, r).forall(_.canBeConvertedTo(numberType)) =>
-   OtherError(s"Addition of ${l.display} and ${r.display} is not supported").invalidNel
+ case (l, r) if List(l, r).forall(_.canBeConvertedTo(numberType)) =>
+   OtherError(s"Addition of ${l.display} and ${r.display} is not supported. Both types must be exactly Int or Double").invalidNel

engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/ExampleFunctions.scala (1)
Line range hint 144-156: Consider improving type safety of the zip method. While the change improves type compatibility, the method could benefit from more specific return types than `Any`. The current implementation returns either:

- A single Number for one argument
- A record with left/right for two arguments
- A record with left/mid/right for three arguments

Consider using sealed traits or case classes to make the return types more explicit:

sealed trait ZipResult
case class SingleZip(value: Number) extends ZipResult
case class DoubleZip(left: Number, right: Number) extends ZipResult
case class TripleZip(left: Number, mid: Number, right: Number) extends ZipResult

This would provide better type safety and clearer API documentation.
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/typer/MethodReferenceTyper.scala (1)
Line range hint 133-146: Consider enhancing type safety in the error combination logic. The current error combination logic uses runtime type checking (`isInstanceOf`) and casting (`asInstanceOf`). This could be made more type-safe using sealed traits or pattern matching.

Consider refactoring the error combination logic to use pattern matching:

- case xs if xs.forall(_._2.isInstanceOf[ArgumentTypeError]) =>
-   xs.map(_._2.asInstanceOf[ArgumentTypeError]).toList.reduce(combineArgumentTypeErrors)
+ case xs @ NonEmptyList((_, _: ArgumentTypeError), _) if xs.forall { case (_, err) => err.isInstanceOf[ArgumentTypeError] } =>
+   xs.map { case (_, err: ArgumentTypeError) => err }.toList.reduce(combineArgumentTypeErrors)

engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/utils/ToTableTypeSchemaBasedEncoder.scala (3)
85-86: Address the TODO regarding ValidationMode. The double type check (`canBeConvertedTo(Typed[Number])` and `canBeConvertedTo(targetType.toTypingResult)`) provides good type safety. However, there's a TODO comment indicating this behavior should vary based on ValidationMode.

Would you like me to help implement the ValidationMode-specific behavior? This could involve:

- Adding a ValidationMode parameter
- Implementing strict/loose validation logic
- Adding corresponding test cases

108-111: Consider reducing pattern match duplication. The same condition `valueType.canBeConvertedTo(Typed[Int])` appears in two similar pattern matches. Consider extracting a common helper method to reduce duplication and improve maintainability.

+ private def isValidMultisetValueType(valueType: TypingResult): Boolean =
+   valueType.canBeConvertedTo(Typed[Int])

  case (
        TypedObjectTypingResult(_, TypedClass(`javaMapClass`, keyType :: valueType :: Nil), _),
        multisetType: MultisetType
-     ) if valueType.canBeConvertedTo(Typed[Int]) =>
+     ) if isValidMultisetValueType(valueType) =>
    alignMultisetType(keyType, multisetType)

  case (TypedClass(`javaMapClass`, keyType :: valueType :: Nil), multisetType: MultisetType)
-     if valueType.canBeConvertedTo(Typed[Int]) =>
+     if isValidMultisetValueType(valueType) =>
    alignMultisetType(keyType, multisetType)
Line range hint 29-111: Architecture review: type conversion strategy. The changes successfully implement the transition from subclass determination to assignability determination, improving type conversion accuracy. The implementation:
- Handles numeric types more precisely
- Maintains type safety through double-checking
- Properly handles complex types like multisets
The approach aligns well with the PR objectives of fixing incorrect subtype handling.
Consider documenting the type conversion strategy in a central location, possibly in the package documentation, to help maintainers understand:
- The difference between strict and loose conversion
- The role of ValidationMode
- Common pitfalls and edge cases
utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/numeric.scala (1)
132-134: Consider adding test cases for edge cases. Given that this PR aims to fix specific issues with Integer/Long/Number type conversions, consider adding test cases that verify:
- Long type dictionary handling for Integer inputs
- Number type dictionary handling for Long inputs
- Edge cases around type promotion and conversion
Would you like me to help draft these test cases?
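The precision edge case worth covering is the Long → Double path: `Double` has a 53-bit mantissa, so `Long` values beyond 2^53 do not survive a round-trip. A standalone demonstration on plain JVM primitives (illustrative only, not the Nussknacker API):

```scala
// Small Long values survive a round-trip through Double...
val small = 42L
assert(small.toDouble.toLong == small)

// ...but above 2^53 a Double cannot represent every Long:
// Long.MaxValue - 1 rounds to 2^63 when converted to Double
val big = Long.MaxValue - 1
assert(big.toDouble.toLong != big)
```

Any conversion logic that falls back to Double for Number-typed values should be tested against values in this range.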
designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala (1)
19-19: Consider adding test cases for edge cases in type conversion. To ensure robust type handling, consider adding test cases for:
- Types with generic parameters
- Edge cases in numeric type conversion (e.g., overflow scenarios)
- Complex type hierarchies
Would you like me to provide example test cases for these scenarios?
engine/flink/components/base-unbounded/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/aggregates.scala (2)
356-358: LGTM: Improved Map type validation. The type validation for Maps has been enhanced to use convertibility checking. Consider making the error message more specific about the expected Map type:

- "Fields do not match, aggregateBy: ${inputFields.keys.mkString(", ")}, aggregator: ${scalaFields.keys.mkString(", ")}"
+ s"Fields do not match. Expected a Map[String, _] with fields: ${scalaFields.keys.mkString(", ")}, but got: ${inputFields.keys.mkString(", ")}"

442-442: LGTM: Improved numeric type handling in MathAggregator. The change to use `canBeConvertedTo` with the Number type improves type compatibility across different numeric types, working well with the configurable promotion strategy.

Consider adding a code comment explaining the promotion strategy's role:

  override def computeOutputType(input: typing.TypingResult): Validated[String, typing.TypingResult] = {
+   // Validates numeric type compatibility and promotes types according to the aggregator's strategy,
+   // e.g., Sum promotes to larger types, Min/Max preserves original types when possible
    if (input.canBeConvertedTo(Typed[Number])) {

components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala (3)
145-150: Consider adding edge cases for number type conversions. While the current tests cover basic number type conversions, consider adding tests for:

- Overflow scenarios
- Decimal precision loss
- Special values (Infinity, NaN)

// Add tests for edge cases
Typed[Int].canBeConvertedTo(Typed[Short]) shouldBe false // Potential overflow
Typed[Double].canBeConvertedTo(Typed[Float]) shouldBe false // Precision loss

340-345: Consider adding tests for boundary values. While the current tests cover basic instance value conversions, consider adding tests for boundary values:

// Add boundary value tests
Typed.fromInstance(Int.MaxValue).canBeConvertedTo(Typed.typedClass[Long]) shouldBe true
Typed.fromInstance(Long.MaxValue).canBeConvertedTo(Typed.typedClass[Int]) shouldBe false

452-452: Consider adding failure case assertions in property-based tests. The property-based tests verify type compatibility well, but could benefit from explicit failure cases:

// Add negative test cases
forAll(TypingResultGen.typingResultGen(EnabledTypedFeatures.All)) { input =>
  whenever(!input.isInstanceOf[Unknown.type]) {
    input.canBeConvertedTo(TypedNull) shouldBe false
  }
}

Also applies to: 466-470, 486-491
utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/collection.scala (1)
Line range hint 4-4: Add tests for numeric type conversion scenarios. The TODO comment indicates missing tests. Given the recent changes to type conversion logic, it's particularly important to add tests covering various numeric type conversion scenarios.
Would you like me to help create test cases for the numeric type conversion scenarios? I can generate test cases covering:
- Integer to Long conversion
- Long to Number conversion
- Various numeric type combinations in collections
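One concrete scenario such tests should cover is boxed-number equality, which is where Integer-versus-Long mismatches typically surface in collection lookups. This is plain JVM behaviour, not the Nussknacker API:

```scala
import java.util.ArrayList

val list = new ArrayList[Number]()
list.add(java.lang.Integer.valueOf(42))

// contains uses equals, and Integer(42) is not equal to Long(42L)
assert(list.contains(java.lang.Integer.valueOf(42)))
assert(!list.contains(java.lang.Long.valueOf(42L)))
```

A type system that treats Integer as loosely assignable to Long still has to convert the value itself, or lookups like the one above silently miss.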
docs/MigrationGuide.md (2)

82-82: Fix list indentation for better readability. The unordered list indentation is inconsistent. Lists should be indented with 2 spaces instead of 4 for better readability and compliance with markdown standards.

-    * `CanBeSubclassDeterminer.canBeSubclassOf` changed to `AssignabilityDeterminer.isAssignableLoose`.
+  * `CanBeSubclassDeterminer.canBeSubclassOf` changed to `AssignabilityDeterminer.isAssignableLoose`.

🧰 Tools
🪛 Markdownlint (0.35.0)
82-82: Expected: 2; Actual: 4 — Unordered list indentation (MD007, ul-indent)
Line range hint 1-1000: Consider enhancing the migration instructions. While the guide provides comprehensive information about changes, some sections could benefit from more explicit migration instructions. Consider adding:
- Step-by-step migration guides for major changes
- Code examples showing before/after changes
- Common pitfalls to watch out for during migration
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (38)
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala (1 hunks)
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/CanBeSubclassDeterminer.scala (0 hunks)
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/NumberTypeUtils.scala (1 hunks)
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/TypeConversionHandler.scala (5 hunks)
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala (2 hunks)
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala (1 hunks)
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypedFromInstanceTest.scala (2 hunks)
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultErrorMessagesSpec.scala (4 hunks)
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala (7 hunks)
- designer/client/src/http/HttpService.ts (1 hunks)
- designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala (1 hunks)
- designer/server/src/main/scala/pl/touk/nussknacker/ui/api/description/DictApiEndpoints.scala (2 hunks)
- designer/server/src/main/scala/pl/touk/nussknacker/ui/process/test/ScenarioTestService.scala (2 hunks)
- designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala (2 hunks)
- designer/server/src/test/scala/pl/touk/nussknacker/ui/validation/UIProcessValidatorSpec.scala (1 hunks)
- docs-internal/api/nu-designer-openapi.yaml (1 hunks)
- docs/MigrationGuide.md (2 hunks)
- engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala (1 hunks)
- engine/flink/components/base-unbounded/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/aggregates.scala (4 hunks)
- engine/flink/components/base/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformer.scala (1 hunks)
- engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/sink/TableTypeOutputValidator.scala (1 hunks)
- engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/utils/ToTableTypeSchemaBasedEncoder.scala (3 hunks)
- engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/DocumentationFunctions.scala (2 hunks)
- engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/ExampleFunctions.scala (1 hunks)
- engine/lite/components/base/src/main/scala/pl/touk/nussknacker/engine/lite/components/ForEachTransformer.scala (1 hunks)
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/compile/nodecompilation/FragmentParameterValidator.scala (1 hunks)
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/ClassDefinitionExtractor.scala (2 hunks)
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodDefinition.scala (2 hunks)
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala (2 hunks)
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/language/dictWithLabel/DictKeyWithLabelExpressionParser.scala (1 hunks)
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSuggester.scala (1 hunks)
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionValidator.scala (1 hunks)
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/Typer.scala (11 hunks)
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/typer/MethodReferenceTyper.scala (1 hunks)
- utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/collection.scala (1 hunks)
- utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/numeric.scala (1 hunks)
- utils/json-utils/src/main/scala/pl/touk/nussknacker/engine/json/encode/JsonSchemaOutputValidator.scala (1 hunks)
- utils/schemed-kafka-components-utils/src/main/scala/pl/touk/nussknacker/engine/schemedkafka/encode/AvroSchemaOutputValidator.scala (1 hunks)
💤 Files with no reviewable changes (1)
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/CanBeSubclassDeterminer.scala
✅ Files skipped from review due to trivial changes (1)
- designer/client/src/http/HttpService.ts
🧰 Additional context used
📓 Learnings (1)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSuggester.scala (1)
Learnt from: arkadius
PR: TouK/nussknacker#7174
File: scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala:84-84
Timestamp: 2024-11-19T22:01:38.151Z
Learning: In test classes such as `SpelExpressionSpec.scala` within the `scenario-compiler` module, binary compatibility issues are not a concern because they are used only internally. Therefore, changes to method signatures in these test classes are acceptable.
🪛 Markdownlint (0.35.0)
docs/MigrationGuide.md
82-82: Expected: 2; Actual: 4
Unordered list indentation
(MD007, ul-indent)
🔇 Additional comments (59)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/compile/nodecompilation/FragmentParameterValidator.scala (1)
Line range hint 172-176: Verify the change to `canBeConvertedTo` for accurate type validation
The substitution of `canBeSubclassOf` with `canBeConvertedTo` shifts the type compatibility check from subclassing to conversion capability. This alteration may impact the validation logic for dictionary value types, especially in cases involving implicit conversions or type coercion. Please ensure that this change aligns with the intended validation behavior and does not introduce unintended type acceptances or rejections.
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/ClassDefinitionExtractor.scala (1)
144-144: Good improvement: Use of `canBeConvertedTo` enhances type compatibility checks
The change from `canBeSubclassOf` to `canBeConvertedTo` in the method deduplication logic improves the flexibility of type assignability checks, allowing for more accurate method selection based on type conversions.
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala (2)
30-36: Implementation of `canBeConvertedTo` is correct.
The method correctly utilizes `AssignabilityDeterminer.isAssignableLoose` to check for type conversions with possible loss of precision.
37-41: Implementation of `canBeStrictlyConvertedTo` is correct.
The method appropriately uses `AssignabilityDeterminer.isAssignableStrict` to ensure type conversions without loss of precision.
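The loose/strict split described above can be sketched outside the Nussknacker API. The following is a minimal Java illustration: the names `isAssignableLoose`/`isAssignableStrict` mirror the reviewed methods, but the rank-based numeric logic is an assumption for demonstration only — the real `AssignabilityDeterminer` operates on `TypingResult` values and covers far more cases.

```java
import java.util.List;

// Hypothetical sketch of loose vs strict numeric assignability.
// Not the real AssignabilityDeterminer.
public class NumericAssignability {

    // Simplified widening order: each type converts losslessly to any later one
    // (ignoring long->double edge cases a real implementation must weigh).
    private static final List<Class<?>> WIDENING_ORDER = List.of(
        Byte.class, Short.class, Integer.class, Long.class, Float.class, Double.class);

    // Loose: any number may be converted to any other number,
    // possibly losing precision (e.g. Long -> Integer).
    public static boolean isAssignableLoose(Class<?> from, Class<?> to) {
        return WIDENING_ORDER.contains(from) && WIDENING_ORDER.contains(to);
    }

    // Strict: only widening conversions are allowed (e.g. Integer -> Long),
    // so no precision can be lost.
    public static boolean isAssignableStrict(Class<?> from, Class<?> to) {
        int fromRank = WIDENING_ORDER.indexOf(from);
        int toRank = WIDENING_ORDER.indexOf(to);
        return fromRank >= 0 && toRank >= 0 && fromRank <= toRank;
    }

    public static void main(String[] args) {
        System.out.println(isAssignableLoose(Long.class, Integer.class));  // true: allowed, lossy
        System.out.println(isAssignableStrict(Long.class, Integer.class)); // false: narrowing
        System.out.println(isAssignableStrict(Integer.class, Long.class)); // true: widening
    }
}
```

The key design point is that the loose relation is a superset of the strict one, which matches the review's description of `canBeConvertedTo` versus `canBeStrictlyConvertedTo`.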
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSuggester.scala (2)
414-415: Update to `canBeConvertedTo` enhances type flexibility
Changing from `canBeSubclassOf` to `canBeConvertedTo` improves the type checking logic by considering type conversion possibilities, which enhances accuracy when determining if a type can be treated as a `java.util.Collection`.
Line range hint 416-420: Consistent use of `canBeConvertedTo` for `java.util.Map` types
Applying `canBeConvertedTo` in this case ensures consistent type checking logic for `java.util.Map` types, aligning with the updated assignability determiner logic and improving type compatibility handling.
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/Typer.scala (10)
149-150: Validation of child types in `withChildrenOfType`
The code correctly checks whether all elements of the list can be converted to the expected type `Parts`. This ensures type safety when processing child nodes.
201-201: Proper index key type checking in record indexer
The condition accurately checks if the index key can be converted to a `String`, which is appropriate for indexing records with string keys.
Line range hint 359-368: Accurate handling of unary and binary minus operations
The implementation correctly handles both binary (`left - right`) and unary (`-left`) minus operations. It ensures that operands are numbers and provides appropriate error handling when types do not match.
398-409: Comprehensive handling of the '+' operator
The code effectively handles string concatenation when either operand is a `String` (line 398), numerical addition when both operands are numbers (line 402), and unary plus (line 409). It also includes proper error handling for type mismatches.
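The dispatch order for '+' (string concatenation wins over numeric addition, anything else is a type error) can be illustrated with a small standalone sketch. The `typePlus` helper and its simplified result types are illustrative assumptions, not Nussknacker's actual Typer logic:

```java
import java.util.Optional;

// Illustrative sketch of typing the '+' operator as described in the review:
// if either operand is a String the result is String; if both are numbers the
// result is a (simplified) number type; otherwise typing fails.
public class PlusTyper {

    public static Optional<Class<?>> typePlus(Class<?> left, Class<?> right) {
        if (left == String.class || right == String.class) {
            return Optional.of(String.class); // string concatenation
        }
        if (Number.class.isAssignableFrom(left) && Number.class.isAssignableFrom(right)) {
            // Real typers promote to the wider operand type;
            // we simplify to the common supertype Number.
            return Optional.of(Number.class);
        }
        return Optional.empty(); // type mismatch -> validation error
    }
}
```

Note that the string case is checked first, so `"a" + 1` types as `String` even though one operand is numeric — the same precedence the review attributes to the Typer.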
451-451: Ensuring selection predicate returns Boolean
In the `Selection` case, the code correctly ensures that the result of the selection predicate can be converted to a `Boolean`, which is essential for filtering elements.
462-462: Validating ternary operator condition type
The code appropriately filters the condition in the ternary operator to ensure it can be converted to a `Boolean`, preventing erroneous behavior.
523-526: Determining selection typing result based on iterable type
The function `resolveSelectionTypingResult` correctly identifies whether the parent type is an array or a collection and adjusts the selection typing result accordingly. It also handles the case where the parent type is a `Map`.
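A selection has to preserve the container kind of its parent. As a rough illustration of the branching involved (this `describeSelectionResult` helper is hypothetical, not the `resolveSelectionTypingResult` from Typer.scala), one can distinguish arrays, maps, and collections like this:

```java
import java.util.Collection;
import java.util.Map;

// Hypothetical illustration: decide what selecting over a parent type yields.
public class SelectionKind {

    public static String describeSelectionResult(Class<?> parent) {
        if (parent.isArray()) {
            // Selection over an array yields an array of the same component type.
            return "array of " + parent.getComponentType().getSimpleName();
        }
        if (Map.class.isAssignableFrom(parent)) {
            // Selection over a map yields a map of the matching entries.
            return "map";
        }
        if (Collection.class.isAssignableFrom(parent)) {
            // Selection over a collection yields a list of matching elements.
            return "list";
        }
        return "unsupported";
    }
}
```

Checking `Map` before `Collection` matters in a sketch like this only for ordering clarity (the two interfaces are disjoint on the JVM), but keeping arrays as a separate branch mirrors the array/collection distinction the review highlights.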
578-578: Validating numeric operands in arithmetic operations
The function `checkTwoOperandsArithmeticOperation` ensures that both operands can be converted to `Number` before performing arithmetic operations, thereby enforcing type safety.
597-597: Ensuring operand is numeric in unary operations
In `checkSingleOperandArithmeticOperation`, the code verifies that the operand can be converted to `Number` for unary operations, such as increment or decrement.
693-696: Extracting iterative type from parent type
The `extractIterativeType` method correctly checks if the parent type can be converted to a collection or a map and extracts the appropriate element type for further processing.
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/NumberTypeUtils.scala (2)
18-18: LGTM: Integer conversion fallback looks correct.
The change from `canBeSubclassOf` to `canBeConvertedTo` for Integer type handling aligns with the PR's objective to improve type conversion accuracy. This change appropriately handles union types that can be converted to Integer.
20-20: Verify the Number type conversion behavior.
The change to use `canBeConvertedTo` for Number type checking is logically sound: `Double` can represent every Integer exactly, although `Long` values above 2^53 lose precision when widened to `Double`. We should verify that this change doesn't introduce any precision loss issues in the codebase.
✅ Verification successful
Based on the search results, I can make a determination about the number type conversion behavior. Let me generate the final response:
The Number type conversion is safe in this context
The codebase shows that:
- The `zeroForType` method is primarily used in aggregation contexts where precision is not critical for the zero value
- The `canBeConvertedTo` method is explicitly documented to allow precision loss, with a separate `canBeStrictlyConvertedTo` method available when precision must be preserved
- The conversion to Double is only used for the zero value (0.0), which can precisely represent integers and is a reasonable default
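The precision concern is concrete on the JVM: `double` has a 53-bit significand, so distinct `long` values above 2^53 can collapse to the same `double`, while every `int` round-trips exactly. A quick standalone check (not Nussknacker code):

```java
public class DoublePrecision {
    public static void main(String[] args) {
        long exact = 1L << 53; // 9007199254740992: the last power-of-two step
                               // where consecutive longs are still distinguishable
        // Above 2^53, consecutive longs can map to the same double value.
        System.out.println((double) exact == (double) (exact + 1)); // true: precision lost
        // Every int value is exactly representable as a double.
        System.out.println((double) Integer.MAX_VALUE == 2147483647.0); // true
    }
}
```

This is why a loose conversion from `Long` to `Double` is acceptable for a zero default but would need the strict variant wherever exact values matter.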
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Search for places where this conversion might affect precision-sensitive calculations
# Look for test cases involving numeric precision
rg -A 5 "precision|accuracy|exact" --type scala
# Look for BigDecimal usage which might be affected
ast-grep --pattern 'BigDecimal.$$$'
# Look for potential numeric conversion test cases
rg -A 5 "NumberTypeUtils.zeroForType" --type scala
Length of output: 49678
engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/sink/TableTypeOutputValidator.scala (2)
17-17: LGTM: Logic change aligns with new type system approach
The change from `canBeSubclassOf` to `canBeConvertedTo` correctly implements the new assignability determination approach, making the validation more accurate for type conversions.
Line range hint 16-29: Verify error message consistency across type system changes
Since we've moved from subclass checking to conversion checking, let's verify that the error messages remain consistent and helpful across different scenarios.
✅ Verification successful
Let me generate additional scripts to gather more information about the test cases and error handling:
Let me try one more search to find test cases and error handling patterns:
Based on the gathered information, I can now provide the final response:
Error message consistency is maintained after type system changes
The error handling in `TableTypeOutputValidator` maintains consistency with the rest of the codebase by:
- Using the standard `OutputValidatorTypeError` structure shared across different validators (JSON, Avro, Table)
- Properly displaying type information through `expectedTypingResult.display`
- Aligning types before validation using `ToTableTypeSchemaBasedEncoder.alignTypingResult`
The error messages remain helpful as they:
- Include both actual and expected types
- Maintain a consistent error structure across different type validators
- Handle type conversion attempts before reporting errors
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for other validation error messages to ensure consistency
rg -A 2 "OutputValidatorTypeError"
# Look for test cases covering error scenarios
ast-grep --pattern 'test("$_") { $$ OutputValidatorTypeError $$ }'
Length of output: 5546
Script:
#!/bin/bash
# Look for test files related to TableTypeOutputValidator
fd "TableTypeOutputValidator.*Test.*\.scala$"
# Search for test cases and assertions involving type validation
rg -B 2 -A 4 "TableTypeOutputValidator" --type scala
# Look for error message assertions in tests
rg -A 3 "shouldBe|should be|assertEquals.*Error" --type scala | grep -B 2 -A 2 "expected|actual|type|error"
Length of output: 3762
Script:
#!/bin/bash
# Search for test files in the table component directory
fd "Test.*\.scala$" engine/flink/components/table/
# Look for test cases involving type validation and error messages
rg -B 3 -A 5 "validate.*Type|expectedTypingResult|actualType" engine/flink/components/table/
# Search for error handling in the encoder
rg -A 3 "ToTableTypeSchemaBasedEncoder" --type scala
Length of output: 19466
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionValidator.scala (2)
26-26: LGTM! Change aligns with the new assignability-based type system.
The replacement of `canBeSubclassOf` with `canBeConvertedTo` correctly implements the new type validation approach, which should fix the issues with incorrect subtype handling mentioned in the PR objectives.
26-26: Verify consistent type validation behavior across the codebase.
Since this is part of a broader change from subclass to assignability checking, we should verify that all related type checks are using the new approach consistently.
✅ Verification successful
Based on the shell script results, I can now generate the final response:
Type validation behavior is consistent across the codebase
The codebase shows a clear transition from `canBeSubclassOf` to `canBeConvertedTo`. The remaining instances of `canBeSubclassOf` are either:
- In documentation/migration guides explaining the change
- In TODO comments referencing the transition
- In test files verifying the old behavior
- In specific validators (AvroSchemaOutputValidator, JsonSchemaOutputValidator) that have explicit TODO comments about moving away from the old approach
The new `canBeConvertedTo` method is consistently used across the codebase for type validation, particularly in:
- SpEL expression validation and typing
- Parameter validation
- Type conversion checks
- Collection and map type handling
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for any remaining instances of old type checking methods
# and verify consistent usage of new methods across the codebase
# Check for any remaining instances of old method
echo "Checking for remaining canBeSubclassOf usage:"
rg "canBeSubclassOf"
# Verify consistent usage of new method
echo -e "\nVerifying usage of new canBeConvertedTo method:"
rg "canBeConvertedTo"
# Look for similar type checking patterns that might need updating
echo -e "\nChecking for other potential type checking patterns:"
ast-grep --pattern 'if $expr.can$$$Of($target)'
Length of output: 26664
engine/lite/components/base/src/main/scala/pl/touk/nussknacker/engine/lite/components/ForEachTransformer.scala (1)
45-47: LGTM! The type checking enhancement aligns with PR objectives.
The change from `canBeSubclassOf` to `canBeConvertedTo` should improve type compatibility checks, particularly for cases involving numeric types like Long and Integer mentioned in the PR objectives.
Let's verify the type conversion behavior:
✅ Verification successful
Based on the extensive search results from the previous queries, particularly the implementation details showing `canBeConvertedTo` usage across the codebase and its test cases in `TypingResultSpec`, I can now make a definitive assessment.
The change from `canBeSubclassOf` to `canBeConvertedTo` is correctly implemented and well-tested.
The evidence shows that:
- The `canBeConvertedTo` method is extensively tested for Collection types in `TypingResultSpec`
- The implementation properly handles type conversion for collections, as seen in multiple test cases verifying collection type compatibility
- The change aligns with the type system's behavior across the codebase, particularly for handling numeric types and collections
The modification in `ForEachTransformer.scala` is consistent with the broader type system implementation and testing patterns seen throughout the codebase.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify the implementation of canBeConvertedTo and its usage
# Expected: Find related type conversion implementations and tests
# Check for type conversion implementations
ast-grep --pattern 'def canBeConvertedTo'
# Look for related test cases
rg -l "canBeConvertedTo.*Collection" --type scala -g "*Test*.scala"
Length of output: 113
Script:
#!/bin/bash
# Let's try a broader search for type conversion related code and tests
# Search for any occurrences of canBeConvertedTo
rg "canBeConvertedTo" --type scala -C 2
# Search for type conversion related test files
rg -l "TypedObjectTypingResult.*Collection" --type scala -g "*Test*.scala"
# Look for collection type handling tests
rg -l "Collection.*Type" --type scala -g "*Test*.scala"
# Search for the ForEachTransformer test file
fd "ForEachTransformer.*Test.*scala"
Length of output: 69625
Script:
#!/bin/bash
# Let's check for tests specifically related to ForEachTransformer and its type handling
# Search for ForEachTransformer tests
rg -A 10 "ForEachTransformer.*Test" --type scala
# Look for test cases involving Collection type checking
rg -A 5 "canBeConvertedTo.*Collection" --type scala -g "*Test*.scala"
# Check implementation of type conversion for collections
ast-grep --pattern 'def canBeConvertedTo($type) = {
$$$
}'
Length of output: 185
engine/flink/components/base/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformer.scala (1)
48-50: Verify integration with the new AssignabilityDeterminer
Let's ensure this component properly integrates with the new type system changes.
✅ Verification successful
Let me gather more information about the specific changes in the ForEachTransformer implementations to verify the integration.
Based on the code inspection, I can now provide a final response:
Integration with AssignabilityDeterminer is properly implemented
The changes are correctly integrated in both Flink and Lite implementations:
- Both engines use `canBeConvertedTo` consistently for Collection type checking
- Test coverage confirms proper type handling and validation
- The implementation maintains consistent behavior across both engines
- Type parameters are properly extracted and validated in both implementations
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for other ForEach-related components that might need similar updates
rg -t scala "ForEach.*Transformer"
# Look for any remaining usage of old SubclassDeterminer patterns
rg -t scala "canBeSubclassOf|SubclassDeterminer"
Length of output: 4058
Script:
#!/bin/bash
# Check the implementation details of ForEachTransformer in both lite and flink engines
cat engine/flink/components/base/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformer.scala
cat engine/lite/components/base/src/main/scala/pl/touk/nussknacker/engine/lite/components/ForEachTransformer.scala
# Check the test coverage
cat engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformerSpec.scala
Length of output: 10659
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala (2)
1-10: LGTM! Imports are well-organized and appropriate.
The imports cover all necessary testing frameworks and utilities needed for type validation tests.
1-67: Consider adding edge case tests.
The test suite would benefit from additional edge cases:
- Generic type parameters
- Nested collection conversions
- Null handling
- Complex type hierarchies
Let's check if these cases are covered elsewhere in the codebase:
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultErrorMessagesSpec.scala (1)
8-8: LGTM! Import changes align with the new assignability approach.
The import of `isAssignableLoose` is appropriately placed and reflects the architectural shift from subclass determination to assignability determination.
designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala (2)
59-59: LGTM! Verify JSON schema compatibility.
The direct JSON decoding approach is cleaner. The error handling for malformed results is properly maintained.
Let's verify the JSON schema compatibility:
✅ Verification successful
Based on the search results, I can see that:
- The `expectedType` is properly handled by the `TypingResultDecoder` in `DictApiHttpService.scala`
- The JSON schema for `DictListRequestDto` is correctly defined with `expectedType: Json`
- The decoder properly validates and converts the JSON type information
- The type checking is done using `canBeStrictlyConvertedTo`, which is the intended behavior for dictionary type validation
- The error handling is in place for malformed typing results
LGTM! The JSON type decoding and validation is properly implemented.
The code correctly handles the JSON decoding of expected types and performs proper type validation against dictionary value types. The error handling is comprehensive, covering cases like malformed typing results and type mismatches.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for other JSON schema definitions or usages to ensure consistency
rg -t scala "expectedType.*Json" -B 2 -A 2
Length of output: 595
Script:
#!/bin/bash
# Let's check for the usage of TypingResultDecoder and related JSON decoding patterns
rg -t scala "TypingResultDecoder" -B 2 -A 2
# Also check for any other usages of expectedType in the codebase
rg -t scala "expectedType" -B 2 -A 2
Length of output: 72364
65-67: LGTM! Verify test coverage for strict conversion cases.
The change from subclass checking to strict conversion aligns with the PR objectives and should prevent the incorrect subtype issues (e.g., Long for Integer). The strict conversion check provides better type safety.
Let's verify the test coverage:
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala (2)
41-43: LGTM: Type conversion check is properly implemented
The change from subclass checking to assignability checking is correctly implemented for non-varargs parameters, maintaining the existing validation structure while improving type compatibility checks.
49-52: Verify impact on return type inference
The change to use `canBeConvertedTo` for return type validation is consistent with the new assignability approach. However, we should verify that this doesn't affect type inference in the calling code.
✅ Verification successful
Based on the extensive search results, I can now generate a final response. The test cases and usage patterns show that `canBeConvertedTo` is the standard method used throughout the codebase for type compatibility checks, including return type validation.
Change to `canBeConvertedTo` is consistent with type system
The change from `canBeSubclassOf` to `canBeConvertedTo` for return type validation is correct and consistent with the codebase's type system. The test cases in `MethodTypeInfoSubclassCheckerSpec.scala` verify this behavior, particularly for numeric type conversions (e.g., Long to Number) and inheritance relationships. The method is used consistently across the codebase for type compatibility checks in various contexts including:
- Return type validation
- Type inference
- Numeric type promotions
- Collection type compatibility
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for usages of this checker where return types are involved
ast-grep --pattern 'MethodTypeInfoSubclassChecker.check($_, $_)' | rg -B 5 -A 5 'Result|return|Type'
# Look for any type inference related tests
fd -e scala -e java . | xargs rg -l 'TypeInference.*MethodTypeInfo'
Length of output: 2359
Script:
#!/bin/bash
# Let's examine the test cases more closely to understand the return type validation
ast-grep --pattern 'MethodTypeInfo($$$, $_, Typed[$_])' scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassCheckerSpec.scala
# Also check for any validation failures in tests
rg "NotSubclassResult.*Typed" scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassCheckerSpec.scala
# Look for any other return type validations in the codebase
rg "canBeConvertedTo.*Typed" -A 2 -B 2
Length of output: 43702
designer/server/src/main/scala/pl/touk/nussknacker/ui/api/description/DictApiEndpoints.scala (2)
19-20: LGTM: Import changes are appropriate
The import changes correctly support the new JSON handling approach and schema definitions.
90-90: Verify integration with AssignabilityDeterminer
The change from `TypingResultInJson` to `Json` aligns with the PR objectives. However, we should verify that the new `AssignabilityDeterminer` correctly handles this raw JSON format.
✅ Verification successful
Let me gather more specific information about the usage of DictListRequestDto and its integration.
Let me search for the usage of DictListRequestDto.expectedType and its integration with AssignabilityDeterminer.
Let me try another search pattern to find the relevant code.
Based on the search results, I can now make a determination about the change from TypingResultInJson to Json for DictListRequestDto.expectedType.
Change is safe and properly integrated
The code shows that:
- The DictListRequestDto.expectedType is used in DictApiHttpService where it's properly decoded using Json decoder and then used for type compatibility checks
- The dictionaries are filtered based on type compatibility using the `canBeStrictlyConvertedTo` method
- The change aligns with the typing system's handling of dictionary types, as seen in the TypedDict and related type handling code
The change from TypingResultInJson to Json is a simplification that still maintains type safety through proper decoding and validation. The AssignabilityDeterminer integration is handled through the type system's built-in compatibility checks.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for places where DictListRequestDto.expectedType is used to ensure proper JSON handling
rg -l "DictListRequestDto.*expectedType" | xargs -I {} rg -A 5 "expectedType" {}
# Look for AssignabilityDeterminer usage with Json types
ast-grep --pattern 'AssignabilityDeterminer.$$$Json$$$'
Length of output: 6433
Script:
#!/bin/bash
# Search for DictListRequestDto usage and its handling
rg -A 10 "DictListRequestDto"
# Search for places where Json type is handled in relation to dictionaries
rg -A 5 "Json.*dict" -i
# Look for type validation or conversion logic
ast-grep --pattern 'def $_(expectedType: Json)$$$'
Length of output: 138661
Script:
#!/bin/bash
# Search for AssignabilityDeterminer usage with expectedType
ast-grep --pattern 'AssignabilityDeterminer.$$$expectedType$$$'
# Search for DictListRequestDto usage in relation to AssignabilityDeterminer
rg -A 5 "AssignabilityDeterminer.*DictListRequestDto"
# Look for places where expectedType is used in type validation
rg -A 5 "expectedType.*isValid"
Length of output: 5743
Script:
#!/bin/bash
# Search for DictListRequestDto usage and handling
rg -A 10 "DictListRequestDto" -g "*.scala"
# Search for dictionary-related endpoints that handle types
rg -A 10 "dictionaryList.*endpoint" -g "*.scala"
# Look for type validation in dictionary-related code
rg -A 10 "dict.*Type" -g "*.scala"
Length of output: 62136
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodDefinition.scala (2)
46-46: LGTM! Consistent varargs type checking.
The change maintains proper type checking for varargs parameters while adopting the new assignability determination approach.
38-41: LGTM! Verify array-to-list conversion compatibility.
The change from subclass checking to assignability checking is well-implemented. The code maintains special handling for array-to-list conversions while adopting the new type system approach.
Let's verify the array-to-list conversion compatibility:
✅ Verification successful
Array-to-List conversion compatibility verified successfully
The verification confirms that array-to-list conversion is properly maintained:
- The `DefaultSpelConversionsProvider` explicitly registers `ArrayToListConverter` in the conversion service
- Test cases in `SpelExpressionSpec` validate array-to-list conversion functionality:
  - the `#arrayOfUnknown.toList()` test verifies the conversion works correctly
  - the `#arrayOfUnknown.canBeList` test confirms type compatibility
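The kind of conversion being verified here can be approximated outside SpEL. The sketch below (the class name `ArrayToList` and its reflective loop are illustrative assumptions, not the actual converter registered by `DefaultSpelConversionsProvider`) shows the basic array-to-`java.util.List` conversion such a converter performs, including primitive arrays:

```java
import java.lang.reflect.Array;
import java.util.ArrayList;
import java.util.List;

// Illustrative array-to-List conversion, similar in spirit to the converter
// the review verifies; the real implementation may differ.
public class ArrayToList {

    public static List<Object> convert(Object array) {
        if (array == null || !array.getClass().isArray()) {
            throw new IllegalArgumentException("not an array");
        }
        int length = Array.getLength(array);
        List<Object> result = new ArrayList<>(length);
        for (int i = 0; i < length; i++) {
            result.add(Array.get(array, i)); // boxes primitive elements automatically
        }
        return result;
    }
}
```

Using `java.lang.reflect.Array` rather than a cast is what makes the conversion work uniformly for `int[]`, `String[]`, and any other array type — which is why an assignability check (rather than a subclass check) is the right way to accept arrays where a `List` is expected.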
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for array-to-list conversion test cases
rg -l "ArrayToListConverter|array.*List" --type scala
Length of output: 2243
Script:
#!/bin/bash
# Let's check the test cases in SpelExpressionSpec and DefaultSpelConversionsProvider for array-to-list conversion
rg "array.*List|ArrayToList" -A 5 -B 5 scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/internal/DefaultSpelConversionsProvider.scala
Length of output: 5008
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/language/dictWithLabel/DictKeyWithLabelExpressionParser.scala (2)
88-92: LGTM: Type conversion changes align with PR objectives
The changes from `canBeSubclassOf` to `canBeConvertedTo` align well with the PR's goal of improving type handling accuracy. This should help resolve the issues with incorrect subtype returns mentioned in the PR objectives.
88-92: Verify consistent usage of type conversion approach
Let's ensure the new `canBeConvertedTo` approach is consistently used across the codebase.
✅ Verification successful
Based on the search results, I can generate the final response as we have comprehensive information about both canBeSubclassOf
and canBeConvertedTo
usage across the codebase.
Type conversion approach is consistently implemented
The changes in `DictKeyWithLabelExpressionParser.scala` from `canBeSubclassOf` to `canBeConvertedTo` align with the codebase's type system implementation. The search results show:
canBeConvertedTo
is widely used across the codebase for type compatibility checks, especially in type conversion scenarios- The remaining instances of
canBeSubclassOf
are either in tests or have TODO comments indicating planned migration - The implementation in
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala
serves as the central definition for type conversion checks
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for any remaining instances of canBeSubclassOf to ensure complete migration
rg "canBeSubclassOf" --type scala
# Search for new canBeConvertedTo usage to verify consistent implementation
rg "canBeConvertedTo" --type scala
Length of output: 25624
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypedFromInstanceTest.scala (2)
63-64: LGTM! Test correctly validates empty list type conversion
The change from subclass checking to conversion checking aligns well with the PR's objective of improving type handling. The test appropriately covers both Scala and Java collections.
71-76: Verify test coverage for edge cases in type conversion
The helper method correctly implements the new assignability checking approach. However, given the significant change in type system behavior, consider adding test cases for edge cases.
engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/DocumentationFunctions.scala (1)
113-114: LGTM! Verify other string parameter handling.
The change to use `canBeConvertedTo` for string type checking is appropriate and maintains the requirement for literal string values.
Let's verify that this is the only place where string parameter handling needs to be updated:
engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/ExampleFunctions.scala (1)
150-150: The type check change aligns with PR objectives but needs verification
The change from `canBeSubclassOf` to `canBeConvertedTo` aligns with the PR's goal of improving type handling, particularly for cases like Long/Integer compatibility. However, we should verify the behavior with edge cases.
Let's verify the impact on type handling:
Consider adding explicit test cases for:
- Mixed numeric types (Long, Integer, Double)
- Edge cases with large numbers that might lose precision
- Invalid type combinations
✅ Verification successful
Let me gather more context about the type conversion changes and test coverage.
Based on the extensive search results, I can now make a definitive assessment of the type checking change. The codebase shows consistent usage of `canBeConvertedTo` for numeric type compatibility checks, particularly with `Typed[Number]`, and there are comprehensive test cases verifying this behavior.
The type check change is safe and aligns with the codebase's type system design
The change from `canBeSubclassOf` to `canBeConvertedTo` is correct because:
- `canBeConvertedTo` is the standard method used throughout the codebase for type compatibility checks
- The codebase has extensive test coverage for numeric type conversions using this method
- The implementation specifically handles Number type conversions with proper precision considerations
- The change aligns with other similar numeric validation patterns in the codebase, like in `utils/default-helpers/.../numeric.scala`
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for test cases that verify numeric type conversions
rg -A 5 "zip.*Number" --type scala
# Look for potential issues with similar type conversions
ast-grep --pattern 'canBeConvertedTo(Typed[Number])'
Length of output: 1001
Script:
#!/bin/bash
# Search for test files related to ExampleFunctions
fd "ExampleFunctions.*Test.*\.scala$"
# Look for existing type conversion tests and usage patterns
rg -B 2 -A 4 "canBeConvertedTo" --type scala
# Find related test cases for numeric operations
rg -B 2 -A 4 "Typed\[Number\]" --type scala
Length of output: 100508
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/typer/MethodReferenceTyper.scala (1)
65-65
: Verify the impact of switching from subclass to assignability check.
The change from canBeSubclassOf
to canBeConvertedTo
aligns with the PR's objective of moving from subclass determination to assignability determination. This affects how we determine if a type is a Class, which is crucial for static method resolution.
Let's verify the usage and impact:
Consider documenting the behavioral differences between canBeSubclassOf and canBeConvertedTo in the codebase, particularly:
- How this affects static method resolution
- Any edge cases where the behavior might differ
- Performance implications, if any
✅ Verification successful
Change from canBeSubclassOf to canBeConvertedTo is correct and consistent
The verification shows that the change is part of a broader architectural shift, and the implementation is correct for the following reasons:
- The isClass check is used specifically in validateMethodsNonEmpty to handle static method resolution, where we need to determine if a type can be treated as a Class.
- The context shows that when isClass is true and no methods are found (case Nil if isClass), we return Left(None) to skip further validations, which is the desired behavior for static method resolution.
- The change aligns with the documented migration in MigrationGuide.md where TypingResult.canBeSubclassOf was changed to TypingResult.canBeConvertedTo.
- Test cases in TypingResultSpec demonstrate that canBeConvertedTo properly handles type conversion scenarios, including Class type validation.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for other occurrences of canBeSubclassOf that might need similar updates
rg "canBeSubclassOf"
# Search for test cases involving static method resolution
rg -A 5 "isStatic.*true"
Length of output: 3258
Script:
#!/bin/bash
# Let's check for static method resolution tests and usages
rg -A 10 "validateMethodsNonEmpty|isClass" scenario-compiler/
# Let's also check for tests involving Class type validation
ast-grep --pattern 'class $_ {
$$$
test("$_") {
$$$
Typed[Class[_]]
$$$
}
$$$
}'
# Check for other usages of canBeConvertedTo with Class
rg -A 3 "canBeConvertedTo.*Class"
Length of output: 10377
engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/utils/ToTableTypeSchemaBasedEncoder.scala (1)
29-29: Verify numeric conversion edge cases
The change from subclass checking to convertibility checking aligns with the PR objectives. However, we should verify the behavior with edge cases to ensure no precision loss or overflow issues.
✅ Verification successful
Let me analyze the results and generate a final response.
Numeric conversion is properly handled with comprehensive type safety
The codebase shows a robust implementation of numeric conversions:
- All numeric conversions are handled through Spring's NumberUtils across the codebase
- The implementation covers all standard numeric types (Byte, Short, Integer, Long, Float, Double, BigInteger, BigDecimal)
- Error handling is in place with Try/Either wrappers to catch conversion failures
- The MathUtils class provides type promotion and safe conversion between different numeric types
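The overflow-checked narrowing that Spring's NumberUtils.convertNumberToTargetClass performs (it throws IllegalArgumentException when a value does not fit the target type) can be approximated without the Spring dependency. The object and method below are an illustrative sketch, not the actual MathUtils code:

```scala
// Sketch of an overflow-checked Long -> Int narrowing in the spirit of
// Spring's NumberUtils.convertNumberToTargetClass (not the real implementation).
object SafeNarrowing {

  def toIntExact(n: Number): Either[String, Int] = {
    val asLong = n.longValue()
    if (asLong < Int.MinValue || asLong > Int.MaxValue)
      Left(s"Value $asLong out of Integer range") // conversion failure instead of silent truncation
    else
      Right(asLong.toInt)
  }

}
```

For example, 42L narrows cleanly while 3000000000L is rejected, mirroring the Try/Either error handling mentioned above.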
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for test cases covering numeric conversions
rg -A 5 "NumberUtils.convertNumberToTargetClass" --type scala
Length of output: 16361
utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/numeric.scala (1)
132-132: LGTM! Verify type conversion scenarios.
The change from canBeSubclassOf to canBeConvertedTo aligns with the PR objectives and should fix the incorrect subtype issues. This is a good improvement that makes type determination more flexible and accurate.
Let's verify the type conversion scenarios mentioned in the PR objectives:
✅ Verification successful
Let's verify the specific implementation and test coverage for the numeric type conversion functionality.
Change from canBeSubclassOf to canBeConvertedTo is correctly implemented and well-tested
The verification shows that:
- The ToNumberTypingFunction implementation correctly handles both String and Number input types
- Comprehensive test coverage exists in NumericUtilsSpec.scala that verifies:
  - String to integer conversion (42 -> Integer)
  - String to long conversion (21355252552 -> Long)
  - String to double conversion (53.32 -> Double)
  - Large number conversions
- The type conversion behavior is consistent with the numeric type promotion strategy
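The "parse to the narrowest numeric type" behavior the tests above describe can be sketched as follows. This is illustrative only — ToNumberSketch is a hypothetical stand-in, not the real ToNumberTypingFunction:

```scala
// Sketch: try Integer first, then Long, then fall back to Double,
// so each string lands on the narrowest type that can represent it.
object ToNumberSketch {

  def toNumber(s: String): Number = {
    val trimmed = s.trim
    try java.lang.Integer.valueOf(trimmed)
    catch {
      case _: NumberFormatException =>
        try java.lang.Long.valueOf(trimmed)
        catch {
          case _: NumberFormatException => java.lang.Double.valueOf(trimmed)
        }
    }
  }

}
```

With this ordering "42" becomes an Integer, "21355252552" (too large for Int) becomes a Long, and "53.32" falls through to Double — the same three cases listed above.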
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for test cases covering Integer/Long/Number type conversions
rg -l "(?i)(integer.*long|long.*integer|number.*long|long.*number)" --type scala
Length of output: 4492
Script:
#!/bin/bash
# Check the implementation and tests for ToNumberTypingFunction
rg -A 10 "class ToNumberTypingFunction" --type scala
# Check test coverage for numeric type conversions
rg -A 10 "(?i)test.*number.*conversion|test.*to.*number" --type scala
Length of output: 11086
designer/server/src/main/scala/pl/touk/nussknacker/ui/process/test/ScenarioTestService.scala (2)
10-10: LGTM: Import changes align with the new type system approach
The replacement of CanBeSubclassDeterminer with AssignabilityDeterminer is consistent with the PR's objective to improve type handling.
137-137: LGTM: Type check enhancement improves flexibility
The change from canBeSubclassOf to canBeConvertedTo aligns with the new type system approach, allowing for more flexible type conversions while maintaining type safety.
Let's verify other similar changes in the codebase to ensure consistency:
✅ Verification successful
Type conversion change is part of a broader refactoring
The codebase search reveals this is part of a systematic change where canBeSubclassOf is being replaced with canBeConvertedTo across the codebase. The remaining instances of canBeSubclassOf are either in test files or have TODO comments indicating they should be migrated. The new canBeConvertedTo method is consistently used for type compatibility checks, with comprehensive test coverage in TypingResultSpec validating various type conversion scenarios.
Key findings:
- The change aligns with the type system's evolution, as evidenced by TODO comments in AssignabilityDeterminer.scala
- The implementation in typing.scala shows canBeConvertedTo uses AssignabilityDeterminer.isAssignableLoose
- Test coverage confirms the new behavior handles various type scenarios correctly
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for any remaining instances of canBeSubclassOf to ensure complete migration
rg "canBeSubclassOf" --type scala
# Search for new usage patterns of canBeConvertedTo to verify consistent usage
rg "canBeConvertedTo" --type scala -A 2
Length of output: 52200
designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala (2)
79-92: LGTM! Test case properly validates Long type dictionary retrieval.
The test correctly verifies that only the 'long_dict' dictionary is returned for Long type, which aligns with the new stricter type handling approach.
20-36: LGTM! Test case validates the new type assignability behavior.
The test case correctly verifies that no dictionaries are returned for Integer type, aligning with the PR's goal to fix incorrect subtype returns.
Let's verify the test coverage for other numeric types:
✅ Verification successful
Test coverage for numeric types is comprehensive and correct
The test file shows complete coverage for numeric type handling with test cases for:
- Integer (returns empty list)
- Long (returns long_dict)
- BigInteger (returns long_dict)
The test cases properly verify the new assignability behavior where:
- Integer type returns no dictionaries (preventing incorrect subtype returns)
- Long and BigInteger types return the appropriate long_dict
- Each test case validates both the HTTP status and the expected response body
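The filtering rule these tests encode — a Long dictionary is offered for Long and BigInteger, but not for Integer — can be sketched like this. DictFiltering, Dict, and widensTo are hypothetical names for illustration, not the actual DictApiHttpService code:

```scala
// Sketch: a dictionary is offered only when its value type reaches the
// expected type by widening, never by lossy narrowing (so no Long dict for Integer).
object DictFiltering {

  case class Dict(id: String, valueType: Class[_])

  // Widening-only numeric assignability (a tiny subset of the real rules)
  private val widensTo: Map[Class[_], Set[Class[_]]] = Map(
    classOf[java.lang.Long] -> Set(classOf[java.lang.Long], classOf[java.math.BigInteger])
  )

  def dictsFor(expected: Class[_], dicts: List[Dict]): List[Dict] =
    dicts.filter(d => widensTo.getOrElse(d.valueType, Set(d.valueType)).contains(expected))

}
```

Asking for Integer yields an empty list, while Long and BigInteger both return long_dict — matching the three expectations listed above.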
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for other numeric type test cases to ensure comprehensive coverage
rg -A 2 "expectedType.*(?:Integer|Long|Double|Float|BigDecimal|BigInteger)" designer/server/src/test/scala/
Length of output: 109
Script:
#!/bin/bash
# Let's try a different approach to find numeric type test cases
rg -l "expectedType.*type.*TypedClass" designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala
# If found, let's see the content
rg -B 2 -A 10 '"expectedType".*"type".*"TypedClass"' designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala
Length of output: 269
Script:
#!/bin/bash
# Let's try a simpler pattern to find the test file content
cat designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala
Length of output: 7151
engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala (1)
77-80: LGTM! The changes align with the new assignability-based type system.
The implementation correctly handles both direct type conversion and TypedObjectTypingResult cases, maintaining the test suite's ability to verify type compatibility while adopting the new assignability-based approach.
Let's verify that all test cases still pass with the new type checking logic:
✅ Verification successful
Let me verify if the changes in AggregatesSpec.scala are consistent with the ongoing migration from canBeSubclassOf to canBeConvertedTo. I'll check the specific file and its dependencies.
Let me check the implementation of canBeConvertedTo to ensure it properly handles the type conversion cases we're testing.
Let me check how this test is used in the context of aggregations to ensure the type checking changes are appropriate.
Based on the evidence gathered, I can now provide the final response:
The changes to shouldBeInstanceOf are correctly implemented and aligned with the type system migration
The implementation in AggregatesSpec.scala properly validates type compatibility for aggregator test cases by:
- Using canBeConvertedTo for direct type conversion checks
- Handling special cases for TypedObjectTypingResult with runtime object type verification
- Maintaining the test's ability to verify both stored and output types of aggregations
The changes are consistent with the broader codebase migration from canBeSubclassOf to canBeConvertedTo, while the test coverage remains intact for verifying aggregator type compatibility.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Search for any remaining uses of canBeSubclassOf in test files
# to ensure complete migration to canBeConvertedTo
rg "canBeSubclassOf" --type scala
Length of output: 2660
Script:
#!/bin/bash
# Check the full context of AggregatesSpec.scala to understand the test coverage
rg "canBeConvertedTo" engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala -A 2 -B 2
# Check if there are any remaining canBeSubclassOf in the same test file
rg "canBeSubclassOf" engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala
Length of output: 680
Script:
#!/bin/bash
# Find the implementation of canBeConvertedTo
ast-grep --pattern 'def canBeConvertedTo'
# Check for test cases using shouldBeInstanceOf
rg "shouldBeInstanceOf" engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala -B 2 -A 5
Length of output: 940
Script:
#!/bin/bash
# Find test cases that use this type checking in aggregations
rg -B 10 -A 2 "shouldBeInstanceOf" engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala
# Look for the Aggregator trait/class definition to understand the type requirements
ast-grep --pattern 'trait Aggregator'
Length of output: 1084
utils/schemed-kafka-components-utils/src/main/scala/pl/touk/nussknacker/engine/schemedkafka/encode/AvroSchemaOutputValidator.scala (1)
155-159: LGTM: Type validation enhancement aligns with PR objectives
The change from canBeSubclassOf to canBeConvertedTo for map key validation improves type handling flexibility while maintaining type safety. This aligns with the PR's goal of enhancing type assignability determination.
Let's verify the impact of this change on map key type validation:
✅ Verification successful
Let's gather more specific information about the type validation changes and their impact:
Based on the gathered information, I can now provide the final response:
Verified: Map key type validation change is correct and well-supported
The change from canBeSubclassOf to canBeConvertedTo is verified to be correct because:
- The implementation aligns with Avro's specification that map keys must be strings (as noted in the code comment)
- The canBeConvertedTo method is properly defined in the typing system with loose assignability checks
- The validation maintains type safety while allowing for more flexible type conversion scenarios
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Search for map-related test cases and type conversion scenarios
# to ensure the change doesn't break existing functionality
# Look for test cases involving map key type validation
rg -l "validateMapSchema|canBeConvertedTo.*String" --type scala
# Search for any potential map key type conversion usage
ast-grep --pattern 'Map[$_][$_]' --lang scala
Length of output: 626
Script:
#!/bin/bash
# Check the test cases and implementation details
# Look for test cases specifically related to map validation in AvroSchemaOutputValidator
rg -A 10 "validateMapSchema" utils/schemed-kafka-components-utils/src/main/scala/pl/touk/nussknacker/engine/schemedkafka/encode/AvroSchemaOutputValidator.scala
# Check for any existing tests that verify map key type conversion
rg -A 10 "Map.*String" utils/schemed-kafka-components-utils/src/test
# Look for the definition and usage of canBeConvertedTo method
rg -A 5 "def canBeConvertedTo" --type scala
Length of output: 9788
utils/json-utils/src/main/scala/pl/touk/nussknacker/engine/json/encode/JsonSchemaOutputValidator.scala (1)
373-373: LGTM! Consider updating documentation and tests.
The change from canBeSubclassOf to canBeConvertedTo aligns with the PR objectives and should fix the issues with incorrect subtype handling (e.g., Long vs Integer). However, a few follow-up items to consider:
- The TODO comment above the method still refers to the old .canBeSubclassOf behavior and should be updated to reflect the new approach.
- Consider documenting the behavior change, especially how it handles edge cases differently from the previous implementation.
Let's verify test coverage for the edge cases mentioned in the PR objectives:
engine/flink/components/base-unbounded/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/aggregates.scala (2)
242-242: LGTM: Enhanced numeric type handling in AverageAggregator
The change to use canBeConvertedTo with Number type improves type compatibility, working correctly with the ForLargeFloatingNumbersOperation promotion strategy for averaging calculations.
193-193: LGTM: Type check improvement in CountWhenAggregator
The change from subclass checking to convertibility checking is appropriate here, as it will handle both primitive boolean and Boolean object types correctly.
Let's verify the Boolean type handling:
✅ Verification successful
Change from canBeSubclassOf to canBeConvertedTo for Boolean type checking is correct
The verification shows that:
- The change aligns with the existing type checking patterns in the codebase, particularly in SPEL typer and expression parsers
- canBeConvertedTo is the preferred method for type validation across the codebase, with only one legacy usage of canBeSubclassOf for Boolean
- The implementation in typing.scala confirms that canBeConvertedTo is the canonical method for type compatibility checks
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for Boolean type usage patterns to ensure compatibility
rg -l "Boolean|boolean" | xargs ast-grep --pattern 'class $_ {
$$$
$x: Boolean
$$$
}'
Length of output: 93
Script:
#!/bin/bash
# Let's try a different approach to analyze Boolean type handling
# First, let's look for the context around the changed line
rg -B 5 -A 5 "canBeConvertedTo.*Boolean" engine/flink/components/base-unbounded/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/aggregates.scala
# Also search for other Boolean type validations in the codebase
rg "canBeConvertedTo.*Boolean|canBeSubclassOf.*Boolean" --type scala
Length of output: 1998
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala (5)
35-61: LGTM! Basic type conversion tests are comprehensive.
The test cases cover essential scenarios for record type conversions, including:
- Field subset compatibility
- Field superset incompatibility
- Type compatibility within fields
- Nested record compatibility
- Generic map compatibility
306-314: LGTM! Tagged value conversion tests are well-structured.
The tests thoroughly verify tag compatibility and type safety for tagged values.
318-321: LGTM! Null type conversion tests cover critical cases.
The tests properly verify null compatibility with both primitive and reference types.
349-353: LGTM! Date/Time conversion tests are comprehensive.
The tests properly verify string-to-date conversions with both valid and invalid formats.
133-141: LGTM! Array type conversion tests are thorough.
The tests properly verify array type variance and compatibility rules.
utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/collection.scala (1)
359-361: LGTM! Type conversion logic updated as per requirements.
The change from subclass checking to assignability checking aligns with the PR objectives and should correctly handle cases where numeric types need conversion rather than strict subclass relationships.
Let's verify the numeric type handling:
✅ Verification successful
Based on the shell script results, I can now generate the final response. The output shows consistent usage of canBeConvertedTo for numeric type handling across the codebase, particularly in:
- The type conversion logic in TypeConversionHandler.scala
- Numeric operations in Typer.scala
- Collection type handling in collection.scala
- Test cases in TypingResultSpec.scala validating numeric type conversions
Type conversion change is consistent with codebase patterns
The change from subclass checking to canBeConvertedTo for numeric types aligns with the established patterns in the codebase, where numeric type conversions are consistently handled using canBeConvertedTo checks against Number or specific numeric types. The test suite demonstrates this pattern works correctly for various numeric type scenarios.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for other instances of numeric type conversion in the codebase
# to ensure consistent approach across the codebase
# Search for numeric type conversion patterns
ast-grep --pattern 'canBeConvertedTo($_, numberType)'
# Search for potential numeric type handling
rg -l 'Number.*Type' | xargs rg -A 3 'canBeConvertedTo|isAssignable'
Length of output: 48054
...onents-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala
...er/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/ClassDefinitionExtractor.scala
...ts-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala
designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala
designer/server/src/test/scala/pl/touk/nussknacker/ui/validation/UIProcessValidatorSpec.scala
Actionable comments posted: 10
🧹 Outside diff range and nitpick comments (32)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/TypeConversionHandler.scala (1)
25-26: Improve grammar in comments for clarity
The comments in lines 25-26 contain grammatical errors that may impact readability. Consider rephrasing them for better clarity:
- * Given to this we cannot be sure if conversion is safe or not based on type (without scale knowledge).
+ * Given this, we cannot be sure if conversion is safe or not based on type (without scale knowledge).
- * So we have two options: force user to convert to some type without floating point (e.g. BigInteger) or be loose in this point.
+ * So we have two options: force the user to convert to some type without a floating point (e.g., BigInteger), or be lenient at this point.
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala (5)
9-18: Fix grammatical errors in the class documentation
There are minor grammatical errors in the documentation comments that should be corrected for clarity:
- Line 10: "This class determine" should be "This class determines".
- Line 10: "if its the same class" should be "if it's the same class".
39-39: Correct the typo in method name 'isNullAsignableTo'
The method name isNullAsignableTo appears to have a typo. It should be isNullAssignableTo to maintain consistent spelling. Apply this diff to correct the method name:
- case (TypedNull, other) => isNullAsignableTo(other)
+ case (TypedNull, other) => isNullAssignableTo(other)
...
- private def isNullAsignableTo(to: TypingResult): ValidatedNel[String, Unit] = to match {
+ private def isNullAssignableTo(to: TypingResult): ValidatedNel[String, Unit] = to match {
Also applies to: 52-52
166-166: Correct the typo in error message: 'are not equals'
The error message contains a grammatical error. It should be 'are not equal' instead of 'are not equals'. Apply this diff to correct it:
- s"Key types of Maps ${givenKeyParam.display} and ${superclassKeyParam.display} are not equals"
+ s"Key types of Maps ${givenKeyParam.display} and ${superclassKeyParam.display} are not equal"
185-185: Replace unnecessary 'f' interpolator with 's' interpolator
The 'f' string interpolator is used without any format specifiers, which may lead to unintended behavior. It's better to use the 's' interpolator in these cases.
Lines affected:
- Line 185: f"${givenClass.display} and ${to.display} are not the same"
- Line 233: f"${from.runtimeObjType.display} cannot be strictly converted to ${to.display}"
Apply this diff to correct it:
- f"${givenClass.display} and ${to.display} are not the same"
+ s"${givenClass.display} and ${to.display} are not the same"
...
- f"${from.runtimeObjType.display} cannot be strictly converted to ${to.display}"
+ s"${from.runtimeObjType.display} cannot be strictly converted to ${to.display}"
Also applies to: 233-233
198-198: Fix typo in comment: 'we wil protect'
In the comment, 'we wil protect' should be corrected to 'we will protect'. Apply this diff:
- // Would be more safety to do givenTypes.forAll(... superclassCandidates.exists ...) - we wil protect against
+ // It would be safer to do givenTypes.forAll(... superclassCandidates.exists ...) - we will protect against
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/NumberTypeUtils.scala (1)
18-21: Consider documenting the fallback strategy.
The fallback logic using canBeConvertedTo handles important edge cases, but its purpose might not be immediately clear to future maintainers. Consider adding comments to explain the strategy:
  else if (typ == Typed[java.math.BigDecimal]) java.math.BigDecimal.ZERO
- // in case of some unions
+ // Handle union types and cases where Integer conversion is safe
  else if (typ.canBeConvertedTo(Typed[java.lang.Integer])) java.lang.Integer.valueOf(0)
- // double is quite safe - it can be converted to any Number
+ // Use Double as a safe fallback since it can represent any numeric value with acceptable precision
  else if (typ.canBeConvertedTo(Typed[Number])) java.lang.Double.valueOf(0)
  else throw new IllegalArgumentException(s"Not expected type: ${typ.display}, should be Number")
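The zero-value fallback chain shown above can be read in isolation through this sketch. NumberZeroes and zeroFor are hypothetical names; the real code dispatches on Nussknacker's TypingResult rather than Class tokens:

```scala
// Illustrative sketch of a zero-for-numeric-type helper with a Double fallback.
object NumberZeroes {

  def zeroFor(clazz: Class[_]): Number = clazz match {
    case c if c == classOf[java.lang.Integer]    => java.lang.Integer.valueOf(0)
    case c if c == classOf[java.lang.Long]       => java.lang.Long.valueOf(0L)
    case c if c == classOf[java.math.BigInteger] => java.math.BigInteger.ZERO
    case c if c == classOf[java.math.BigDecimal] => java.math.BigDecimal.ZERO
    // Double is a safe catch-all for remaining Number types, mirroring the
    // "loose" canBeConvertedTo(Typed[Number]) branch in the snippet above
    case c if classOf[Number].isAssignableFrom(c) => java.lang.Double.valueOf(0.0)
    case other =>
      throw new IllegalArgumentException(s"Not expected type: ${other.getName}, should be Number")
  }

}
```

The design choice here is the same one the comment suggestion documents: handle the exactly-known types first, then fall back to Double because every remaining numeric type can be represented by it with acceptable precision.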
Line range hint
20-29
: Consider enhancing error messages for type conversion failures.The current error message uses
expectedTypingResult.display
, but as noted in the comment, there might be better alternatives. Consider usingLogicalType.toString
or providing more context about why the conversion failed, especially for cases involving numeric type conversions (Long/Integer/Number).OutputValidatorTypeError( None, actualType, new OutputValidatorExpected { - // Maybe we should use LogicalType.toString instead? - override def expected: String = expectedTypingResult.display + override def expected: String = { + val basicExpected = expectedTypingResult.display + val typeDetails = expectedType.toString + s"$basicExpected (LogicalType: $typeDetails)" + } } )engine/flink/components/base/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformer.scala (1)
47-52: Consider adding detailed documentation about type conversion behavior.
While the implementation is correct, it would be helpful to add documentation explaining:
- The expected behavior when dealing with different collection types
- Why Unknown is returned in the default case
- Examples of valid and invalid type conversions
This would improve maintainability and help other developers understand the type handling logic.
Here's a suggested documentation addition:
  private def returnType(elements: LazyParameter[util.Collection[AnyRef]]): typing.TypingResult =
+   /**
+    * Determines the element type of a Collection parameter.
+    * For valid collection types (e.g., List[Long], Set[String]), returns the element type.
+    * Returns Unknown for non-collection types or collections without type parameters.
+    *
+    * @param elements The collection parameter to analyze
+    * @return The type of elements in the collection, or Unknown if not determinable
+    */
    elements.returnType match {
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala (2)
13-35
: Consider adding more comprehensive widening test cases
While the current test cases cover basic widening scenarios well, consider adding:
- Primitive to boxed type conversions (e.g., Int to java.lang.Integer)
- Class inheritance scenarios (e.g., SubClass to ParentClass)
- Tests verifying specific error messages in
(Typed[Int], Typed[java.lang.Integer], Valid(()), Valid(())), (Typed[String], Typed[java.lang.Object], Valid(()), Valid(())), // Assuming you have test classes like: // class Parent; class Child extends Parent (Typed[Child], Typed[Parent], Valid(()), Valid(()))
45-65
: Enhance narrowing conversion test coverage
Consider adding:
- Edge cases for numeric conversions (e.g., values at type boundaries)
- More complex narrowing scenarios (e.g., generic type parameter narrowing)
- Specific assertions about error message content
Example additions:
// Add to narrowingConversionCases
(Typed[List[Any]], Typed[List[String]], Invalid(NonEmptyList.of("Cannot strictly narrow generic type parameter from Any to String")), Valid(())),
(Typed[Double], Typed[Int], Invalid(NonEmptyList.of("Cannot strictly convert Double to Int due to decimal truncation")), Valid(()))
Also, consider strengthening the result verification:
result match {
  case Invalid(errors) if expectedStrict.isInvalid => errors.head should include("Cannot strictly convert")
  case Valid(_) if expectedStrict.isValid => succeed
  case _ => fail(s"Unexpected result: $result for types $sourceType -> $targetType")
}
Line range hint 19-41: LGTM! Consider adding documentation for type conversion rules
The test case effectively validates type compatibility for nested structures. Consider adding a comment block explaining the rules for loose assignability, especially for nested structures, to help other developers understand the expected behavior.
+ /**
+  * Tests loose assignability rules for nested record structures:
+  * - Field type compatibility is checked recursively
+  * - Missing fields in the target type are allowed
+  * - Extra fields in the source type are allowed
+  */
  test("determine if can be subclass for simple typed objects") {
54-57: LGTM! Consider adding more generic type scenarios
The test effectively validates Set type compatibility. Consider adding more test cases for other common generic types (List, Map) to ensure comprehensive coverage of generic type parameter compatibility.
70-73: LGTM! Consider adding edge cases
The test effectively validates value type compatibility. Consider adding edge cases like comparing different numeric types (Int vs Long) or testing with null values.
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala (1)
Line range hint 1-90: Consider renaming the checker class for consistency
The class name MethodTypeInfoSubclassChecker still reflects the old subclass-based approach. Since we're moving to an assignability-based model, consider renaming it to better reflect its new purpose. Consider renaming to MethodTypeInfoAssignabilityChecker or MethodTypeInfoCompatibilityChecker to better align with its current responsibility. This would make the codebase more maintainable and self-documenting.
designer/server/src/main/scala/pl/touk/nussknacker/ui/api/description/DictApiEndpoints.scala (1)
19-20: Consider using specific imports instead of wildcards.
While wildcard imports provide flexibility, they can make it harder to track which specific components are being used. Consider explicitly importing only the required symbols from sttp.tapir.json.circe._.
-import sttp.tapir.json.circe._
+import sttp.tapir.json.circe.{jsonBody, jsonBodyWithRawSchema}
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodDefinition.scala (1)
Line range hint 99-107: Consider enhancing error handling for type mismatches.
While the type checking logic is correct, throwing an AssertionError might not be the best approach for handling type mismatches in a production environment. Consider:
- Creating a custom exception type that better represents this specific error case
- Using the existing GenericFunctionTypingError hierarchy instead of throwing an exception
- if (!typesFromStaticMethodInfo.exists(calculated.canBeConvertedTo)) {
-   val expectedTypesString = typesFromStaticMethodInfo.map(_.display).mkString("(", ", ", ")")
-   val argumentsString = arguments.map(_.display).mkString("(", ", ", ")")
-   throw new AssertionError(
-     s"Generic function $name returned type ${calculated.display} that does not match any of declared types $expectedTypesString when called with arguments $argumentsString"
-   )
- }
+ if (!typesFromStaticMethodInfo.exists(calculated.canBeConvertedTo)) {
+   val expectedTypesString = typesFromStaticMethodInfo.map(_.display).mkString("(", ", ", ")")
+   val argumentsString = arguments.map(_.display).mkString("(", ", ", ")")
+   GenericFunctionTypingError.ReturnTypeError(
+     expected = expectedTypesString,
+     actual = calculated.display,
+     context = s"Generic function $name called with arguments $argumentsString"
+   ).invalidNel
+ } else {
+   calculated.validNel
+ }
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/language/dictWithLabel/DictKeyWithLabelExpressionParser.scala (2)
Line range hint 95-97: Update error message to reflect new type checking approach
The error message still refers to "subclass" which is inconsistent with the new assignability-based approach.
- s"DictKeyExpression expected type: ${expectedType.display} is unsupported. It must be a subclass of Long, Boolean or String"
+ s"DictKeyExpression expected type: ${expectedType.display} is unsupported. It must be convertible to Long, Boolean or String"
Line range hint 13-26
: Inconsistency between typingResult and evaluate methods. The typingResult method in DictKeyWithLabelExpressionTypingInfo still uses canBeSubclassOf while the evaluate method has been updated to use canBeConvertedTo. This inconsistency could lead to type mismatches.
  override def typingResult: TypingResult = expectedType match {
-   case clazz: TypedClass if clazz.canBeSubclassOf(Typed[Long]) && Try(key.toLong).toOption.isDefined =>
+   case clazz: TypedClass if clazz.canBeConvertedTo(Typed[Long]) && Try(key.toLong).toOption.isDefined =>
      TypedObjectWithValue(clazz.runtimeObjType, key.toLong)
-   case clazz: TypedClass if clazz.canBeSubclassOf(Typed[Boolean]) && Try(key.toBoolean).toOption.isDefined =>
+   case clazz: TypedClass if clazz.canBeConvertedTo(Typed[Boolean]) && Try(key.toBoolean).toOption.isDefined =>
      TypedObjectWithValue(clazz.runtimeObjType, key.toBoolean)
-   case clazz: TypedClass if clazz.canBeSubclassOf(Typed[String]) =>
+   case clazz: TypedClass if clazz.canBeConvertedTo(Typed[String]) =>
      TypedObjectWithValue(clazz.runtimeObjType, key)
    case _ => expectedType
  }
engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/DocumentationFunctions.scala (1)
65-65
: LGTM! Consider enhancing the error message. The change from canBeSubclassOf to canBeConvertedTo aligns with the PR objectives and should fix the issues with numeric type handling (e.g., Long/Integer conversions). Consider making the error message more specific about type conversion:
- OtherError(s"Addition of ${l.display} and ${r.display} is not supported").invalidNel
+ OtherError(s"Addition of ${l.display} and ${r.display} is not supported. Both types must be exactly Int or Double").invalidNel
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/typer/MethodReferenceTyper.scala (1)
Line range hint 63-71
: Add documentation for the assignability-based type checking. Consider adding documentation to explain:
- The rationale for using canBeConvertedTo instead of subclass checking
- How the assignability check handles edge cases differently from the previous implementation
- Examples of valid/invalid cases that work differently with the new approach
+ // We use assignability-based checking instead of subclass checking to handle cases where
+ // types are convertible but not necessarily in a subclass relationship (e.g., numeric types).
+ // This is particularly important for static method resolution where we need to determine if
+ // a type can be treated as a Class regardless of inheritance hierarchy.
  def isClass = clazzDefinitions.map(k => k.clazzName).exists(_.canBeConvertedTo(Typed[Class[_]]))
engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/utils/ToTableTypeSchemaBasedEncoder.scala (1)
85-86
: LGTM! Consider improving readability. The double check for both Number compatibility and target type conversion is a good safety measure. Consider reformatting for better readability:
- case (typ: SingleTypingResult, _)
-     if typ
-       .canBeConvertedTo(Typed[Number]) && typ.canBeConvertedTo(targetType.toTypingResult) =>
+ case (typ: SingleTypingResult, _) if typ.canBeConvertedTo(Typed[Number]) &&
+     typ.canBeConvertedTo(targetType.toTypingResult) =>
designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala (1)
19-19
: Consider adding test cases for loose assignabilityThe PR introduces both "strict" and "loose" conversion logic, but the test suite only appears to verify strict assignability. Consider adding test cases that explicitly verify the loose assignability behavior to ensure complete coverage of the new functionality.
engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala (1)
77-78
: LGTM! Consider adding a clarifying comment. The change from canBeSubclassOf to canBeConvertedTo aligns well with the PR's objective of improving type handling. The logic correctly handles both direct type conversion and special cases for TypedObjectTypingResult. Consider adding a brief comment explaining the two cases being checked:
  val typeFromInstance = Typed.fromInstance(obj)
+ // Check both direct type conversion and runtime object type for TypedObjectTypingResult
  val canBeSubclassCase = typeFromInstance.canBeConvertedTo(typ)
  val typedObjectCase = typ.isInstanceOf[TypedObjectTypingResult] && typeFromInstance.canBeConvertedTo(
    typ.asInstanceOf[TypedObjectTypingResult].runtimeObjType
  )
utils/json-utils/src/main/scala/pl/touk/nussknacker/engine/json/encode/JsonSchemaOutputValidator.scala (2)
Line range hint 368-371
: Consider addressing the TODO comment. The TODO comment suggests that the verification logic needs review, particularly regarding Unknown.canBeSubclassOf(Any). With the transition to canBeConvertedTo, this comment should be updated or removed if the concern is no longer relevant. Would you like me to help draft an updated comment that reflects the new assignability-based approach?
379-397
: Consider enhancing numeric type handling documentationThe comment about BigDecimal as a fallback type is important for understanding the type hierarchy. Consider expanding it to explicitly mention the new assignability-based approach and how it affects numeric type conversions.
/** * we are mapping json schema types to java strict class hierarchy and due to dual definition of `number` type (as integer | double) in json schema we treat BigDecimal class * as a fallback type, which can accommodate all other numeric type. + * + * With the assignability-based approach: + * - Integer can be converted to Long + * - Any numeric type can be converted to BigDecimal + * - Strict vs loose conversion rules apply based on validation mode */engine/flink/components/base-unbounded/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/aggregates.scala (1)
356-358
: Improve indentation while maintaining correct type checkingThe change to use
canBeConvertedTo
is correct and consistent with the new assignability approach. However, the indentation could be improved for better readability.Consider adjusting the indentation:
- if inputFields.keySet == scalaFields.keySet && klass.canBeConvertedTo( - Typed[java.util.Map[String, _]] - ) => + if inputFields.keySet == scalaFields.keySet && + klass.canBeConvertedTo(Typed[java.util.Map[String, _]]) =>components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala (2)
79-82
: Consider adding more union type conversion edge cases.While the current tests cover basic union type conversions, consider adding tests for:
- Union types with different type parameters
- Nested union types
- Union types with null
145-150
: Consider adding decimal precision tests for number conversions.While the current tests cover basic number type conversions, consider adding tests for:
- Decimal precision loss scenarios
- BigDecimal to Double conversion edge cases
- Currency/monetary value conversions
designer/server/src/test/scala/pl/touk/nussknacker/ui/validation/UIProcessValidatorSpec.scala (1)
1953-1953
: LGTM! The parameter type change aligns with PR objectives. The change from FragmentClazzRef[Short] to FragmentClazzRef[Integer] helps test the new assignability determiner's behavior with different numeric types. Consider adding more test cases to verify:
- Other numeric type combinations (e.g., Long to Integer, Float to Double)
- Negative cases where type conversion should fail
docs/MigrationGuide.md (1)
81-82
: Fix markdown list indentation. The list item is indented with 4 spaces but should be indented with 2 spaces to match the style guide.
-    * `DictListRequestDto` `expectedType`: TypingResultInJson -> Json
+  * `DictListRequestDto` `expectedType`: TypingResultInJson -> Json
🧰 Tools
🪛 Markdownlint (0.35.0)
82-82: Expected: 2; Actual: 4
Unordered list indentation(MD007, ul-indent)
docs-internal/api/nu-designer-openapi.yaml (1)
Line range hint
1-5084
: Consider enhancing API documentationWhile the documentation is generally good, consider these improvements:
- Add more detailed descriptions for complex endpoints, especially those handling type validation
- Include more examples covering error cases and edge scenarios
- Add markdown descriptions explaining the type conversion logic introduced by this PR
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (38)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala (1 hunks)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/CanBeSubclassDeterminer.scala (0 hunks)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/NumberTypeUtils.scala (1 hunks)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/TypeConversionHandler.scala (5 hunks)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala (2 hunks)
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala (1 hunks)
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypedFromInstanceTest.scala (2 hunks)
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultErrorMessagesSpec.scala (4 hunks)
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala (7 hunks)
designer/client/src/http/HttpService.ts (1 hunks)
designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala (1 hunks)
designer/server/src/main/scala/pl/touk/nussknacker/ui/api/description/DictApiEndpoints.scala (2 hunks)
designer/server/src/main/scala/pl/touk/nussknacker/ui/process/test/ScenarioTestService.scala (2 hunks)
designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala (2 hunks)
designer/server/src/test/scala/pl/touk/nussknacker/ui/validation/UIProcessValidatorSpec.scala (1 hunks)
docs-internal/api/nu-designer-openapi.yaml (1 hunks)
docs/MigrationGuide.md (2 hunks)
engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala (1 hunks)
engine/flink/components/base-unbounded/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/aggregates.scala (4 hunks)
engine/flink/components/base/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformer.scala (1 hunks)
engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/sink/TableTypeOutputValidator.scala (1 hunks)
engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/utils/ToTableTypeSchemaBasedEncoder.scala (3 hunks)
engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/DocumentationFunctions.scala (2 hunks)
engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/ExampleFunctions.scala (1 hunks)
engine/lite/components/base/src/main/scala/pl/touk/nussknacker/engine/lite/components/ForEachTransformer.scala (1 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/compile/nodecompilation/FragmentParameterValidator.scala (1 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/ClassDefinitionExtractor.scala (2 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodDefinition.scala (2 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala (2 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/language/dictWithLabel/DictKeyWithLabelExpressionParser.scala (1 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSuggester.scala (1 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionValidator.scala (1 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/Typer.scala (11 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/typer/MethodReferenceTyper.scala (1 hunks)
utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/collection.scala (1 hunks)
utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/numeric.scala (1 hunks)
utils/json-utils/src/main/scala/pl/touk/nussknacker/engine/json/encode/JsonSchemaOutputValidator.scala (1 hunks)
utils/schemed-kafka-components-utils/src/main/scala/pl/touk/nussknacker/engine/schemedkafka/encode/AvroSchemaOutputValidator.scala (1 hunks)
💤 Files with no reviewable changes (1)
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/CanBeSubclassDeterminer.scala
🧰 Additional context used
🪛 Markdownlint (0.35.0)
docs/MigrationGuide.md
82-82: Expected: 2; Actual: 4
Unordered list indentation
(MD007, ul-indent)
🔇 Additional comments (74)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/TypeConversionHandler.scala (1)
105-115
: Ensure consistent ordering of AllNumbers and document the reliance
In handleStrictNumberConversions, the logic relies on the ordering of AllNumbers to determine type assignability. If the order of AllNumbers changes, it could lead to incorrect behavior. Please verify that AllNumbers is ordered from the most specific to the most general numeric types and consider adding documentation to explain this dependency.
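The kind of ordering dependency described here can be made concrete with a small self-contained model (the name `AllNumbers` mirrors the source, but the element order and the predicate below are illustrative assumptions, not the actual `handleStrictNumberConversions` implementation):

```scala
object StrictNumberConversionModel {

  // Ordered from the most specific (narrowest) to the most general
  // (widest) numeric type — the property the review asks to verify.
  val AllNumbers: List[Class[_]] = List(
    classOf[java.lang.Byte],
    classOf[java.lang.Short],
    classOf[java.lang.Integer],
    classOf[java.lang.Long],
    classOf[java.lang.Float],
    classOf[java.lang.Double],
    classOf[java.math.BigDecimal]
  )

  // A strict (lossless) conversion is allowed only when the source type
  // appears no later than the target type in AllNumbers — so reordering
  // the list silently changes which conversions are accepted.
  def canConvertStrictly(from: Class[_], to: Class[_]): Boolean = {
    val fromIdx = AllNumbers.indexOf(from)
    val toIdx   = AllNumbers.indexOf(to)
    fromIdx >= 0 && toIdx >= 0 && fromIdx <= toIdx
  }
}
```

With this index-based scheme, Integer-to-Long is accepted while Long-to-Integer is rejected, which is exactly why the list's order must be documented and kept stable.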
Run the following script to verify the ordering of AllNumbers
:
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/ClassDefinitionExtractor.scala (1)
144-144
: Update to canBeConvertedTo enhances type flexibility
The change from canBeSubclassOf to canBeConvertedTo in the deduplication logic allows for a more flexible type assignability check, which aligns with the new AssignabilityDeterminer. This should improve type compatibility during method deduplication.
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala (3)
30-36
: Approved: Addition of canBeConvertedTo method.
The method and its documentation are clear and align with the intended functionality.
37-41
: Approved: Addition of canBeStrictlyConvertedTo method.
The method provides a clear way to check conversions without loss of precision.
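The value of a separate strict check is easiest to see at the value level: a loose Long-to-Int conversion can silently change the value, while the widening direction cannot. A minimal plain-Scala illustration, independent of the Nussknacker API:

```scala
object PrecisionLossDemo {

  // A loose Long -> Int conversion compiles and runs, but silently
  // changes the value once it exceeds the Int range.
  def looseLongToInt(value: Long): Int = value.toInt

  // A strict (widening) Int -> Long conversion is always lossless.
  def strictIntToLong(value: Int): Long = value.toLong

  // One past Int.MaxValue: representable as Long, not as Int.
  val tooBigForInt: Long = Int.MaxValue.toLong + 1
}
```

Here `looseLongToInt(tooBigForInt)` wraps around to a negative number, which is precisely the loss of precision that `canBeStrictlyConvertedTo` is meant to rule out.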
472-474
: Verify impact of updating CastTypedValue unapply method.
Changing the unapply method to use canBeConvertedTo may affect pattern matching behavior elsewhere in the codebase. Ensure this change aligns with the intended functionality and does not introduce unintended side effects.
Run the following script to find usages of CastTypedValue and verify that pattern matching behaves as expected:
✅ Verification successful
Let me search for more context about the usage of CastTypedValue
in pattern matching scenarios.
Based on the search results, I can now generate the final response:
No impact from updating CastTypedValue unapply method
The change to use canBeConvertedTo in the CastTypedValue unapply method is safe. The only usage of CastTypedValue in pattern matching is in Typer.scala, for type checking operations, where it's used to verify type compatibility, which aligns perfectly with the canBeConvertedTo check. The method is already used extensively throughout the codebase for type compatibility verification in similar contexts.
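For background on why an `unapply` change matters for pattern matching, here is a simplified stand-in for an extractor like `CastTypedValue`. It uses a runtime `isInstance` check in place of the `TypingResult`-based `canBeConvertedTo` logic, and the `CastTo`/`StringCast` names are hypothetical:

```scala
// Simplified stand-in for an unapply-based type extractor: the pattern
// matches only when the scrutinee can be treated as the expected type,
// so changing the acceptance predicate changes which branches fire.
class CastTo[T](expected: Class[T]) {
  def unapply(value: Any): Option[T] =
    if (expected.isInstance(value)) Some(expected.cast(value)) else None
}

val StringCast = new CastTo(classOf[String])

def describe(value: Any): String = value match {
  case StringCast(s) => s"string of length ${s.length}"
  case _             => "not a string"
}
```

This is why the review checks all match sites: loosening the predicate inside `unapply` (subclass check to convertibility check) can make previously non-matching branches start matching.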
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Find all Scala files using `CastTypedValue` in pattern matching.
# Search for pattern matching cases using `CastTypedValue`
rg -A 2 -B 2 'case.*CastTypedValue' --type=scala
Length of output: 583
Script:
#!/bin/bash
# Search for pattern matching using CastTypedValue in a broader context
rg -A 5 -B 5 'CastTypedValue' --type=scala
# Also search for any references to canBeConvertedTo to understand the impact
rg -A 3 -B 3 'canBeConvertedTo' --type=scala
Length of output: 88792
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/Typer.scala (13)
149-150
: Approved: Type validation logic is correct
The code correctly checks that all elements can be converted to the expected type and handles errors appropriately.
201-201
: Approved: Dynamic property access handling is appropriate
The code properly handles dynamic property access based on the configuration.
359-362
: Approved: Numeric subtraction with type promotion
The subtraction operation correctly promotes number types and performs the subtraction.
368-371
: Approved: Unary negation of numeric types
The unary minus operation is correctly implemented for numeric types.
398-402
: Approved: Handles string concatenation when one operand is a string
The addition operation correctly handles concatenation when one operand is a string.
402-405
: Approved: Numeric addition with type promotion
The addition operation correctly promotes number types and performs the addition.
409-412
: Approved: Unary plus operation returns the original number
The unary plus operation correctly returns the original numeric value.
451-453
: Approved: Selection condition requires a boolean expression
The selection operation correctly validates that the condition is a boolean.
462-465
: Approved: Ternary operator condition validation
The ternary operator correctly ensures that the condition is a boolean.
523-525
: Approved: Handles selection on collections or arrays
The code correctly processes selection operations on collections and arrays.
526-528
: Approved: Handles selection on map types
The code correctly processes selection operations on map types by returning an empty record.
693-695
: Approved: Extracts element type from collections or arrays
The code correctly extracts the element type from collections and arrays.
Line range hint 696-705
: Approved: Extracts key and value types from map types
The code correctly extracts key and value types from map types and constructs a typed record.
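The "type promotion" mentioned in the arithmetic cases above follows the usual widest-operand-wins rule. A dependency-free sketch of that step (illustrative only — the real promotion logic lives in the typer, not in this snippet):

```scala
object NumberPromotionSketch {

  // Fixed widening order used to pick the result type of an arithmetic
  // expression such as `intExpr + doubleExpr`.
  private val wideningOrder: List[String] = List("Integer", "Long", "Float", "Double")

  // Widest-operand-wins: the promoted type is whichever operand type
  // sits later in the widening order.
  def promote(left: String, right: String): String =
    if (wideningOrder.indexOf(left) >= wideningOrder.indexOf(right)) left else right
}
```

Under this rule `Integer + Double` types as `Double` and `Long + Integer` types as `Long`, matching the "promotes number types" behavior the approvals describe.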
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/NumberTypeUtils.scala (1)
18-18
: LGTM! The changes align with the PR objectives.
The switch from canBeSubclassOf to canBeConvertedTo improves type handling accuracy, particularly for cases where strict subclass relationships don't capture the desired type conversion behavior (e.g., Long for Integer).
Also applies to: 20-20
engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/sink/TableTypeOutputValidator.scala (1)
17-17
: Verify backward compatibility of the new type conversion logic.
The change from canBeSubclassOf to canBeConvertedTo aligns with the PR objectives to fix type handling issues. However, we should verify that this change doesn't break existing valid type conversions.
✅ Verification successful
Let me gather more information about the type conversion tests to make a more informed decision.
Let me search for tests specifically related to TableTypeOutputValidator to better understand the backward compatibility implications.
Based on the analysis of the code and search results, I can now generate the final response regarding the change from canBeSubclassOf
to canBeConvertedTo
.
The change from canBeSubclassOf
to canBeConvertedTo
maintains backward compatibility and improves type handling.
The change is safe because:
canBeConvertedTo
is already used throughout the codebase for type compatibility checks, including in the table module's type handling- The change is part of the ToTableTypeSchemaBasedEncoder which already uses
canBeConvertedTo
for type validation - The implementation in TableTypeOutputValidator is consistent with other type validation patterns in the codebase
The change actually improves type handling by:
- Supporting more flexible type conversions while maintaining type safety
- Aligning with the existing type system's behavior in other parts of the codebase
- Following the PR's objective to fix type handling issues
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Search for test cases that verify type conversion behavior
# This will help ensure that existing valid type conversions still work
# Look for relevant test cases
rg -A 10 "TableTypeOutputValidator.*validate" --type scala
# Look for existing type conversion tests
ast-grep --pattern 'class $_(Test|Spec) {
$$$
test("$_") {
$$$
canBeConvertedTo
$$$
}
$$$
}'
Length of output: 3511
Script:
#!/bin/bash
# Search for test files related to TableTypeOutputValidator
fd ".*Test.*scala$" --exec rg -l "TableTypeOutputValidator"
# Search for type conversion test cases
rg -B 5 -A 10 "canBeConvertedTo|canBeSubclassOf" --type scala
# Look for test files in the table sink directory
fd ".*Test.*scala$" "engine/flink/components/table"
Length of output: 131248
Script:
#!/bin/bash
# Search for TableTypeOutputValidator tests
fd ".*Test.*scala$" --exec rg -l "TableTypeOutputValidator"
# Look for test cases involving type conversion in the table module
rg -B 5 -A 10 "canBeConvertedTo|canBeSubclassOf" "engine/flink/components/table"
# Search for any test files in the table sink directory that might contain relevant tests
fd ".*Test.*scala$" "engine/flink/components/table/src/test"
Length of output: 12202
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionValidator.scala (2)
26-27
: LGTM! Change aligns with the new type handling approach.
The switch from canBeSubclassOf
to canBeConvertedTo
correctly implements the new assignability-based type checking strategy, which should resolve the issues with incorrect subtype handling mentioned in the PR objectives.
26-27
: Verify impact on existing type conversions.
Since this changes the fundamental type compatibility check, we should verify there are no breaking changes in existing valid type conversions.
✅ Verification successful
Based on the code analysis, I can now provide a final response:
Type conversion validation is working as expected and is well-tested.
The code shows that type conversion validation is handled properly through:
-
The
canBeConvertedTo
check is part of a well-structured validation flow that includes:- Special case handling for SpelExpressionRepr
- Special case handling for String to TemplateEvaluationResult
- General type compatibility check using
canBeConvertedTo
- Proper error reporting via
ExpressionTypeError
when conversion fails
-
The error handling is well-defined with clear error messages that show both expected and found types.
-
The functionality is covered by tests as seen in SpelExpressionSpec.scala which verifies type conversion behavior and error cases.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Search for test cases that verify type conversion behavior
# to ensure existing valid scenarios continue to work
# Look for test files related to SpelExpressionValidator
fd -e scala -e java test.*SpelExpressionValidator
# Search for type conversion test cases
rg -A 5 "validate.*expectedType" --type scala
# Look for existing usages that might be affected
ast-grep --pattern 'validate($$$, $$$, $$$)'
Length of output: 100788
Script:
#!/bin/bash
# Let's search for test cases specifically related to SpelExpressionValidator and type conversions
fd -e scala SpelExpressionValidator.*Test
rg -A 5 "canBeConvertedTo" --type scala
Length of output: 77095
Script:
#!/bin/bash
# Let's look for test cases specifically related to SpelExpressionValidator type conversions
fd -e scala SpelExpressionValidator.*Test
rg -B 5 -A 5 "canBeConvertedTo.*expectedType" --type scala
Length of output: 1594
Script:
#!/bin/bash
# Let's look for test cases that verify type conversion behavior
rg -B 5 -A 5 "ExpressionTypeError" --type scala
rg -B 5 -A 5 "SpelExpressionValidator.*validate.*expectedType" --type scala
Length of output: 8950
engine/lite/components/base/src/main/scala/pl/touk/nussknacker/engine/lite/components/ForEachTransformer.scala (2)
45-47
: LGTM! Change aligns with the PR objectives.
The transition from canBeSubclassOf
to canBeConvertedTo
aligns with the PR's goal of improving type handling, particularly for cases involving numeric types like Long and Integer.
45-47
: Verify type conversion behavior for collections.
Please ensure that the new type conversion logic handles all expected collection types correctly, especially for edge cases with numeric types (Long, Integer, Number).
✅ Verification successful
Let me gather more information about the type conversion implementation and tests.
Let me gather more specific information about the ForEachTransformer implementation and its type handling.
Based on the gathered information, I can now provide the final response:
Type conversion for collections is properly implemented and verified
The implementation correctly handles collection type conversions, including numeric types:
- The
canBeConvertedTo
check forjava.util.Collection[_]
is consistently implemented across the codebase - Type parameters are properly preserved during conversion
- Numeric type handling is verified in tests, with proper handling of Number supertype relationships for collection elements
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Search for test cases and usages to verify type conversion behavior
# Look for test cases covering collection type conversions
rg -t scala "test.*ForEach.*Collection" -A 10
# Look for specific numeric type handling in tests
ast-grep --pattern 'test("$_") { $$ Long $$ Integer $$ }'
# Find other instances of canBeConvertedTo with collections to ensure consistency
rg "canBeConvertedTo.*Collection"
Length of output: 665
Script:
#!/bin/bash
# Look for ForEachTransformer tests
rg -t scala "class.*ForEachTransformer.*Test" -A 20
# Find type conversion tests and implementations
rg -t scala "canBeConvertedTo.*Collection" -B 5 -A 5
# Look for collection type handling in ForEachTransformer
ast-grep --pattern 'class ForEachTransformer {
$$$
def $_($$) {
$$$
canBeConvertedTo
$$$
}
}'
# Search for collection type conversion tests
rg -t scala "test.*collection.*type" -A 10
Length of output: 7984
Script:
#!/bin/bash
# Look for ForEachTransformer implementation details
rg -t scala "class ForEachTransformer" -B 5 -A 20
# Find tests related to type conversion in ForEachTransformer
rg -t scala "ForEachTransformer.*Test.*type" -A 10
# Look for collection type parameter handling
ast-grep --pattern 'def validate($$$) {
$$$
runtimeObjType.params
$$$
}'
# Search for numeric type handling in collection context
rg -t scala "Collection.*Number|Number.*Collection" -B 2 -A 2
Length of output: 10935
engine/flink/components/base/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformer.scala (1)
48-50
: LGTM! Type checking change aligns with the new assignability approach.
The change from (likely) canBeSubclassOf
to canBeConvertedTo
aligns well with the PR's objective of moving from subclass determination to assignability determination. This should help resolve the type handling issues mentioned in the PR description, particularly for cases involving numeric types.
Let's verify the impact of this change on type handling:
✅ Verification successful
Based on the shell script results, I can now generate the final response:
Type conversion change is consistent with codebase-wide refactoring
The change from canBeSubclassOf
to canBeConvertedTo
for Collection type checking is part of a documented migration (see MigrationGuide.md). The search results show:
- Similar
canBeConvertedTo
Collection checks are consistently used in other parts likeSpelExpressionSuggester.scala
andTyper.scala
- Remaining
canBeSubclassOf
usages are either in different contexts (e.g., Avro schema validation) or are documented for migration - The change aligns with PR [NU-1766] TypingResult: generic params canBeSubclassOf fix: collections covariance check #6570 which specifically addresses
TypingResult.canBeSubclassOf
tocanBeConvertedTo
migration for better generic parameter handling
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for similar type conversion patterns across the codebase
# to ensure consistent implementation of the new approach
# Look for other instances of canBeConvertedTo to verify consistency
rg "canBeConvertedTo.*Collection" -A 2
# Look for any remaining instances of canBeSubclassOf that might need updating
rg "canBeSubclassOf" -A 2
Length of output: 8466
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala (2)
1-11
: LGTM! Well-structured test setup
Good choice of testing frameworks and proper organization of imports. The use of property-based testing with Tables is particularly appropriate for testing type conversion cases.
1-67
: Verify test coverage metrics
The test structure and cases are well-designed, but let's verify the coverage metrics to ensure all paths in the AssignabilityDeterminer are tested.
✅ Verification successful
Test coverage is comprehensive and well-structured
After analyzing the implementation and test coverage, I can confirm that the test coverage is thorough and well-designed:
- The AssignabilityDeterminerSpec covers:
  - Widening conversions (e.g., Int to Double)
  - Narrowing conversions (e.g., Long to Int)
  - Both strict and loose conversion modes
- The TypingResultErrorMessagesSpec complements with additional cases:
  - Complex typed objects and nested structures
  - Map and collection type conversions
  - Tagged values
  - Null handling
  - Object value comparisons
The test suite effectively covers all the main functionality and edge cases of the AssignabilityDeterminer, including its public methods isAssignableStrict and isAssignableLoose, along with proper error messages.
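The widening/narrowing coverage described above can be sketched in a dependency-free, table-driven form (the rows and the stand-in predicates below are illustrative; the real spec delegates to `AssignabilityDeterminer.isAssignableLoose` and `isAssignableStrict`):

```scala
object AssignabilityTableSketch {

  // Each row: (from, to, expectedLoose, expectedStrict). Widening passes
  // both modes; narrowing passes only the loose mode.
  val cases: Seq[(String, String, Boolean, Boolean)] = Seq(
    ("Int", "Double", true, true),  // widening
    ("Long", "Int", true, false),   // narrowing
    ("String", "Int", false, false) // unrelated types
  )

  private val numberTypes = Set("Int", "Long", "Double")
  private val widening    = Set(("Int", "Long"), ("Int", "Double"), ("Long", "Double"))

  // Stand-in for isAssignableLoose: any number-to-number conversion.
  def loose(from: String, to: String): Boolean =
    from == to || (numberTypes(from) && numberTypes(to))

  // Stand-in for isAssignableStrict: identity or a widening pair only.
  def strict(from: String, to: String): Boolean =
    from == to || widening((from, to))

  def allRowsPass: Boolean = cases.forall { case (from, to, l, s) =>
    loose(from, to) == l && strict(from, to) == s
  }
}
```

The table-driven shape mirrors the ScalaTest `Tables` style the spec uses: each row pins down one conversion case for both modes, so a regression in either predicate fails a specific, named row.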
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for the AssignabilityDeterminer implementation to compare against test cases
ast-grep --pattern 'class AssignabilityDeterminer {
$$$
}'
# Look for any additional public methods that might need testing
rg -t scala "def.*public" components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala
Length of output: 187
Script:
#!/bin/bash
# Let's first check if the file exists and see its content
fd "AssignabilityDeterminer.scala" --type f
# If found, let's look at its content to understand the implementation
rg -l "AssignabilityDeterminer" --type scala
# Let's also check for the actual implementation
rg -A 5 "object AssignabilityDeterminer|class AssignabilityDeterminer" --type scala
Length of output: 2441
Script:
#!/bin/bash
# Let's examine the implementation of AssignabilityDeterminer to see all its methods
cat components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala
# Let's also check if there are any other test files that might contain additional test cases
rg -l "isAssignableStrict|isAssignableLoose" --type scala
Length of output: 12436
Script:
#!/bin/bash
# Let's check the additional test cases in TypingResultErrorMessagesSpec
cat components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultErrorMessagesSpec.scala
# Let's also check the typing.scala file for any additional test cases or usage
rg -A 5 "isAssignableStrict|isAssignableLoose" components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala
Length of output: 3782
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultErrorMessagesSpec.scala (3)
Line range hint 43-52
: LGTM! Clear validation of complex nested types
The test effectively validates type compatibility for nested map and list structures with clear error messages.
59-68
: LGTM! Comprehensive tagged value validation
The tests effectively cover both tag mismatch scenarios and attempts to convert non-tagged to tagged values with clear error messages.
75-80
: LGTM! Clear null type handling
The tests effectively validate null type conversion rules with appropriate error messages for both directions of conversion attempts.
designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala (2)
59-59
: LGTM: Parameter change aligns with DTO updates
The change to pass dictListRequestDto.expectedType
directly to decodeJson
aligns with the updated DTO structure where expectedType
is now of type Json
.
65-67
: Verify strict conversion behavior for edge cases
The change from canBeSubclassOf
to canBeStrictlyConvertedTo
should fix the issues with incorrect subtypes (e.g., Long for Integer). However, we should verify the behavior for edge cases.
Let's check for test coverage of these edge cases:
Consider documenting the type conversion rules in the codebase to help maintainers understand when strict vs loose conversion should be used.
✅ Verification successful
Let me gather more information about the type conversion implementation and tests.
Let me check one more thing about the strict number conversion implementation.
Type conversion behavior for edge cases is properly implemented
The implementation of canBeStrictlyConvertedTo
is well-designed and handles edge cases appropriately:
-
The strict number conversion logic in
handleStrictNumberConversions
ensures that conversions are only allowed when there's no risk of precision loss:
- It uses an ordered list of number types to ensure conversions only happen from narrower to wider types
- Special handling for BigInteger/BigDecimal cases
- Explicit checks for floating-point and decimal numbers
-
The test coverage in
DictApiHttpServiceSpec
includes verification of subclassing behavior for Integer types.
The change from canBeSubclassOf
to canBeStrictlyConvertedTo
is safe and correctly implemented.
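The ordered-widening rule described above can be sketched outside Nussknacker. Everything in this snippet (the class name, the ordering list, the strictlyConvertible method) is illustrative rather than the project's actual API, and the real handler additionally special-cases floating-point and decimal numbers:

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.List;

public class StrictWidening {

    // Illustrative integral chain ordered narrowest-to-widest; every step
    // to the right is lossless. Floating-point types are deliberately left
    // out here because they need the extra checks noted above.
    private static final List<Class<?>> ORDER = List.of(
            Byte.class, Short.class, Integer.class, Long.class,
            BigInteger.class, BigDecimal.class);

    // Strict conversion: allowed only when moving toward a wider type,
    // so Integer -> Long passes but Long -> Integer does not.
    static boolean strictlyConvertible(Class<?> from, Class<?> to) {
        int i = ORDER.indexOf(from);
        int j = ORDER.indexOf(to);
        return i >= 0 && j >= 0 && i <= j;
    }

    public static void main(String[] args) {
        System.out.println(strictlyConvertible(Integer.class, Long.class)); // true
        System.out.println(strictlyConvertible(Long.class, Integer.class)); // false
    }
}
```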
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Look for tests covering type conversion edge cases
# Expected: Tests for Integer/Long and Number/Long conversions
rg -A 10 "canBeStrictlyConvertedTo" --type scala
rg -A 10 "DictApiHttpServiceSpec" --type scala
Length of output: 7889
Script:
#!/bin/bash
# Look for type conversion tests and implementation details
rg -A 10 "handleNumberConversion" --type scala
rg -A 10 "test.*Number.*conversion" --type scala -i
rg -A 10 "test.*Integer.*Long" --type scala -i
Length of output: 5628
Script:
#!/bin/bash
# Look for strict number conversion implementation
rg -A 15 "handleStrictNumberConversions" --type scala
Length of output: 4510
designer/server/src/main/scala/pl/touk/nussknacker/ui/api/description/DictApiEndpoints.scala (4)
87-98
: Well-structured error handling implementation.
The addition of MalformedTypingResult
error handling with consistent error message formatting is a good practice. The error types are well-organized and provide clear feedback to API consumers.
98-100
: Good use of derived schemas.
The switch to derived schemas for DictListRequestDto
, DictEntry
, and DictDto
is a clean approach that ensures type safety and reduces boilerplate. The consistent use of Schema.derived
across all DTOs improves maintainability.
Line range hint 1-134
: Verify alignment with PR objectives.
The changes in this file support the transition from subclass determination to assignability determination. However, we should verify that all consumers of this API are updated to handle the new type system correctly.
✅ Verification successful
Based on the search results, I can now generate the final response:
The changes in DictApiEndpoints are correctly aligned with the PR objectives
The migration from TypingResultInJson
to Json
in DictListRequestDto
is properly documented in the migration guide and all references have been updated. The remaining TypingResultInJson
references found are in NodesApiEndpoints.scala
, which is a separate component and uses the type for different purposes.
- The change is documented in
docs/MigrationGuide.md
- The OpenAPI specification has been updated to reflect the new type
- No direct consumers of the old
TypingResultInJson
type were found in relation to dictionary functionality
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for all consumers of DictListRequestDto
rg -A 5 "DictListRequestDto"
# Search for any remaining references to the old type system
rg -A 5 "TypingResultInJson"
Length of output: 19524
90-90
: Verify type safety with the new Json type.
The change from TypingResultInJson
to Json
makes the API more flexible but might reduce type safety. Ensure that proper validation is implemented in the service layer to maintain type safety.
✅ Verification successful
Let me search for more specific information about how the Json type is handled in the DictApiHttpService.
Let me search for one more specific detail about how the Json type is validated in the DictApiHttpService.
Based on the search results, I can now provide a final response about the type safety concern:
Type safety is properly maintained despite the change to Json type
The change from TypingResultInJson
to Json
in DictListRequestDto
is safe because:
- The
DictApiHttpService
implements proper validation by:
- Decoding the Json input using a decoder that validates the type structure
- Handling malformed results with
MalformedTypingResult
error responses
- Validating type compatibility using
canBeStrictlyConvertedTo
before returning dictionary entries
The type safety is enforced through validation at runtime with proper error handling, making this change both more flexible and safe.
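The decode-validate-filter flow can be sketched generically. All names below are invented for illustration; the real service decodes a Nussknacker TypingResult from circe Json and answers with a MalformedTypingResult error on failure:

```java
import java.util.List;
import java.util.Map;
import java.util.Optional;

public class DictLookupSketch {

    // Stand-in for JSON decoding of the expected type; an unknown input
    // plays the role of a malformed typing result.
    static Optional<Class<?>> decodeExpectedType(String raw) {
        Map<String, Class<?>> known = Map.of("Integer", Integer.class, "Long", Long.class);
        return Optional.ofNullable(known.get(raw));
    }

    // Only dictionaries whose value type passes the strict check are returned.
    static List<String> dictsFor(String rawType, Map<String, Class<?>> dicts) {
        Class<?> expected = decodeExpectedType(rawType)
                .orElseThrow(() -> new IllegalArgumentException("malformed typing result: " + rawType));
        return dicts.entrySet().stream()
                .filter(e -> strictlyConvertible(e.getValue(), expected))
                .map(Map.Entry::getKey)
                .sorted()
                .toList();
    }

    // Simplified strict rule: identical type, or Integer widened to Long.
    static boolean strictlyConvertible(Class<?> from, Class<?> to) {
        return from.equals(to) || (from == Integer.class && to == Long.class);
    }

    public static void main(String[] args) {
        Map<String, Class<?>> dicts = Map.of("intDict", Integer.class, "longDict", Long.class);
        System.out.println(dictsFor("Integer", dicts)); // [intDict] - longDict is excluded
    }
}
```

The point of the sketch is the ordering of concerns: decode first, fail loudly on malformed input, and only then apply the strict convertibility filter, which is why a Long-valued dictionary no longer leaks into an Integer-typed request.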
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for validation of expectedType in the service layer
ast-grep --pattern 'def validateExpectedType($json: Json): $_ = {
$$$
}'
# Look for error handling related to malformed Json
rg -A 5 "MalformedTypingResult"
Length of output: 15705
Script:
#!/bin/bash
# Search for DictListRequestDto usage and validation
rg -A 10 "DictListRequestDto"
# Search for expectedType handling in DictApiHttpService
ast-grep --pattern 'expectedType.$$$'
# Look for type validation in DictApiHttpService
rg -A 10 "typingResultDecoder" designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala
Length of output: 14216
Script:
#!/bin/bash
# Look for type validation in DictApiHttpService
rg -A 15 "DictApiHttpService" designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala
# Search for validation of expectedType in DictApiHttpService
rg -A 10 "expectedType" designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala
# Look for typingResult related code
ast-grep --pattern 'typingResult.$$$'
Length of output: 12341
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodDefinition.scala (2)
46-46
: LGTM! Consistent varargs type checking.
The change correctly implements assignability checking for varargs parameters, maintaining consistency with the new type system approach.
38-41
: LGTM! Verify array-to-list conversion handling.
The implementation correctly uses the new canBeConvertedTo
method while maintaining special handling for array-to-list conversions. This aligns well with the PR's objective of improving type assignability checks.
Let's verify the array-to-list conversion handling:
✅ Verification successful
Array-to-List conversion implementation verified and working correctly
The implementation is properly supported by the codebase:
DefaultSpelConversionsProvider
registers ArrayToListConverter
to handle array-to-list conversions
- Test cases in
SpelExpressionSpec
verify array-to-list conversion scenarios with #arrayOfUnknown.toList()
and #arrayOfUnknown.canBeList
- The type conversion logic in
MethodDefinition.scala
correctly handles this case by checking both direct conversion and List conversion possibilities
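A minimal model of that fallback (the class and method names here are made up; the real check operates on TypingResult instances rather than runtime classes):

```java
import java.util.List;

public class ArrayToListFallback {

    // An argument fits a parameter if it converts directly, or if the
    // parameter expects a List and the argument is an array - mirroring
    // the array-to-list special case in MethodDefinition.
    static boolean argumentFits(Class<?> argument, Class<?> parameter) {
        if (parameter.isAssignableFrom(argument)) {
            return true;
        }
        return List.class.isAssignableFrom(parameter) && argument.isArray();
    }

    public static void main(String[] args) {
        System.out.println(argumentFits(String[].class, List.class)); // true: array accepted as List
        System.out.println(argumentFits(String.class, List.class));   // false
    }
}
```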
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for tests or usages that verify array-to-list conversion scenarios
rg -l "ArrayToListConverter|array.*List" --type scala
Length of output: 2243
Script:
#!/bin/bash
# Let's check the DefaultSpelConversionsProvider for array-to-list conversion implementation
rg "array.*List|ArrayToList" -A 5 -B 5 scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/internal/DefaultSpelConversionsProvider.scala
# Also check the test cases in SpelExpressionSpec
rg "array.*List|ArrayToList" -A 5 -B 5 scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala
Length of output: 1845
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/language/dictWithLabel/DictKeyWithLabelExpressionParser.scala (2)
88-92
: LGTM! The type checking changes align with the PR objectives.
The change from canBeSubclassOf
to canBeConvertedTo
correctly implements the new assignability-based type checking approach.
Line range hint 88-97
: Verify type conversion behavior across the codebase
Let's verify that all callers of this code are prepared for the new type conversion behavior.
✅ Verification successful
Based on the search results, I can now generate the final response:
Type conversion behavior is properly tested and safely used across the codebase
The verification shows:
- All type conversions (String, Long, Boolean) are covered by dedicated test cases in
DictKeyWithLabelExpressionParserTest
- The parser is used safely in the codebase, with proper error handling:
ProcessDictSubstitutor
wraps parsing errors in IllegalStateException
FragmentParameterValidator
and ExpressionCompiler
properly map parsing errors to compilation errors
- All usages properly handle the parsing results through
Either
type
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for direct usages of DictKeyWithLabelExpressionParser
rg -l "DictKeyWithLabelExpressionParser" | while read -r file; do
echo "=== $file ==="
rg -A 5 "DictKeyWithLabelExpressionParser" "$file"
done
# Search for test cases that verify type conversion behavior
rg -l "DictKeyWithLabelExpressionParser.*Test" | while read -r file; do
echo "=== $file ==="
rg -A 10 "test.*Long|test.*Boolean|test.*String" "$file"
done
Length of output: 6188
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypedFromInstanceTest.scala (2)
63-64
: LGTM: Type checking changes align with new assignability approach
The changes correctly implement the new assignability checking approach for both Scala and Java empty lists, maintaining compatibility with the new AssignabilityDeterminer
design.
Line range hint 71-85
: Verify test coverage for edge cases in type conversion
The test changes appropriately implement the new assignability checking approach. However, given that the PR objectives mention specific issues with Long/Integer/Number type handling, consider adding explicit test cases for these scenarios.
Let's check if there are other test files covering these specific cases:
Consider adding test cases for:
- Long to Integer conversion attempts
- Integer to Long conversion attempts
- Both types to Number conversion scenarios
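Those cases could be sketched as runnable assertions against a stand-in for the loose rule (looselyConvertible is hypothetical, named only for illustration; in the project the check would go through canBeConvertedTo):

```java
public class NumberConversionEdgeCases {

    // Loose rule modeled after canBeConvertedTo's behavior for numbers:
    // any standard number type may be converted to any other, including
    // the narrowing Long -> Integer case.
    static boolean looselyConvertible(Class<?> from, Class<?> to) {
        return Number.class.isAssignableFrom(from) && Number.class.isAssignableFrom(to);
    }

    public static void main(String[] args) {
        System.out.println(looselyConvertible(Long.class, Integer.class));   // narrowing allowed loosely
        System.out.println(looselyConvertible(Integer.class, Long.class));   // widening allowed
        System.out.println(looselyConvertible(Integer.class, Number.class)); // up to the common supertype
        System.out.println(looselyConvertible(Long.class, Number.class));
    }
}
```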
engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/DocumentationFunctions.scala (1)
113-113
: LGTM! Verify test coverage for string type conversion.
The change from canBeSubclassOf
to canBeConvertedTo
for string type validation is appropriate and aligns with the PR objectives.
Let's verify test coverage for this scenario:
engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/ExampleFunctions.scala (1)
150-150
: LGTM! Verify integration with existing type conversions.
The change from subclass checking to assignability checking aligns well with the PR objectives and should correctly handle the cases where Long types need to be treated as Numbers. The implementation maintains the method's contract while providing more flexible type conversion.
Let's verify the integration with other type conversion cases:
✅ Verification successful
Based on the search results, I can see that canBeConvertedTo(Typed[Number])
is consistently used across the codebase for type checking numeric values, particularly in:
- Aggregate operations (engine/flink/components/base-unbounded/.../aggregates.scala)
- Mathematical operations (scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/Typer.scala)
- Type conversion handling (components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/TypeConversionHandler.scala)
The change in ExampleFunctions.scala follows the same pattern used throughout the codebase, confirming this is the correct approach for type checking numeric values.
LGTM! The type checking change is consistent with codebase patterns.
The modification from subclass checking to canBeConvertedTo(Typed[Number])
aligns with the established type checking patterns across the codebase, particularly in mathematical operations and aggregate functions.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for consistent type conversion patterns across the codebase
# Look for other instances where Number type conversion is handled
# Search for Number type handling patterns
rg -t scala "Typed\[Number\]|classOf\[Number\]" -A 3
# Look for similar conversion patterns
ast-grep --pattern 'canBeConvertedTo($$$)'
Length of output: 31398
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/typer/MethodReferenceTyper.scala (1)
65-65
: LGTM! Verify static method resolution behavior.
The change from canBeSubclassOf
to canBeConvertedTo
aligns with the PR's objective of moving to assignability-based type checking. This should provide more accurate type compatibility checks while maintaining the ability to identify valid class types for static method invocation.
Let's verify the impact on static method resolution:
✅ Verification successful
Based on the comprehensive code analysis, I can now generate the final response:
The change to canBeConvertedTo
is correct and maintains static method resolution behavior.
The change from canBeSubclassOf
to canBeConvertedTo
in the isClass
method is safe because:
- The method is used specifically to determine if a type can be treated as a
Class[_]
for static method invocation - The implementation is part of a validation chain that ensures methods exist before attempting to use them
- The code shows proper handling of static vs instance methods through
reference.isStatic
check - Test coverage in
MethodReferenceTyperSpec
verifies method resolution behavior for both static and instance methods
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for test cases involving static method invocation to ensure the behavior is preserved
rg -l "isStatic.*true" --type scala
Length of output: 37
Script:
#!/bin/bash
# Let's try different approaches to find static method related tests and usages
# Search for static method related tests
rg "static.*method" --type scala -i
# Search for Method.isStatic specifically
rg "Method.*isStatic" --type scala
# Look for static method invocation patterns in test files
rg "staticMethod|StaticMethod" --type scala
# Search for class definitions and static method patterns in the typer
ast-grep --pattern 'class MethodReferenceTyper {
$$$
isClass $$$
$$$
}'
Length of output: 49795
Script:
#!/bin/bash
# Let's examine the specific implementation and usage of isClass in MethodReferenceTyper
# Look for the complete implementation of MethodReferenceTyper
ast-grep --pattern 'class MethodReferenceTyper {
$$$
validateMethodsNonEmpty($$$) {
$$$
}
$$$
}'
# Search for tests that verify static method resolution behavior
rg "validateMethodsNonEmpty.*isClass" --type scala -A 5
Length of output: 168
Script:
#!/bin/bash
# Let's try a different approach to understand the context and usage
# Look for the complete file content to understand the context
rg -l "MethodReferenceTyper" --type scala | xargs cat
# Search for tests related to this class
rg "class.*MethodReferenceTyperSpec" --type scala -A 10
# Look for usages of isClass
rg "isClass.*canBeConvertedTo" --type scala -A 3
Length of output: 55941
engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/utils/ToTableTypeSchemaBasedEncoder.scala (2)
108-111
: LGTM! Verify multiset cardinality handling.
The conversion checks for Int
compatibility are appropriate for multiset cardinality values. The changes are consistent with the PR's goal of improving type conversion safety.
Let's verify the multiset handling implementation:
✅ Verification successful
Let me gather more specific information about the multiset cardinality handling.
Based on the gathered information, I can now provide a final response:
The multiset cardinality type checks are correctly implemented
The code correctly ensures type safety for multiset cardinality values by:
- Verifying value types can be converted to
Int
before processing
- Properly handling both Scala and Java Map implementations
- Using
NumberUtils.convertNumberToTargetClass
for safe number conversion to Integer
- Maintaining consistent type alignment through
alignMultisetType
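A standalone sketch of that normalization step, with Math.toIntExact standing in for Spring's NumberUtils.convertNumberToTargetClass; the map shape and method name are illustrative, not Flink's actual representation:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MultisetAlignment {

    // Table multisets are maps from element to an Integer count; incoming
    // counts may arrive as any Number, so each one is converted to Integer,
    // failing loudly on overflow instead of silently truncating.
    static Map<String, Integer> alignCardinalities(Map<String, ? extends Number> raw) {
        Map<String, Integer> aligned = new LinkedHashMap<>();
        raw.forEach((element, count) ->
                aligned.put(element, Math.toIntExact(count.longValue())));
        return aligned;
    }

    public static void main(String[] args) {
        // Mixed Long and Integer counts are all normalized to Integer.
        System.out.println(alignCardinalities(Map.of("a", 2L, "b", 3)));
    }
}
```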
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for multiset-related test cases and usages
rg -A 5 "TypedMultiset" --type scala
# Search for specific multiset cardinality handling
ast-grep --pattern 'if $_.canBeConvertedTo(Typed[Int])'
Length of output: 6987
Script:
#!/bin/bash
# Search for test cases and implementations related to multiset cardinality handling
rg -A 10 "MultisetType" --type scala
# Look for any related type conversion tests
rg -A 5 "canBeConvertedTo.*Int" --type scala
Length of output: 21861
29-29
: LGTM! Verify number type conversion scenarios.
The change from canBeSubclassOf
to canBeConvertedTo
aligns with the PR objectives and should fix the incorrect subtype handling issues mentioned in the PR description.
Let's verify the number type conversion scenarios:
✅ Verification successful
Number type conversion change is correctly implemented
The change from canBeSubclassOf
to canBeConvertedTo
is verified to be correct based on the following evidence:
- The implementation in
typing.scala
shows that canBeConvertedTo
uses AssignabilityDeterminer.isAssignableLoose
which is the proper way to handle number type conversions.
- Test cases in
TypingResultSpec
confirm that:
- Integer can be converted to Long:
Typed[Int].canBeConvertedTo(Typed[Long]) shouldBe true
- Long can be converted to Integer:
Typed[Long].canBeConvertedTo(Typed[Int]) shouldBe true
- Proper handling of number type hierarchies and conversions is verified
The change aligns with the type system's design and is properly tested for number type conversions.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for test cases covering number type conversions
rg -l "Number.*Long|Long.*Number|Integer.*Long|Long.*Integer" --type scala
Length of output: 3220
Script:
#!/bin/bash
# Let's check the test cases in the same component to verify the number type conversion scenarios
rg "canBeConvertedTo.*Number|Number.*canBeConvertedTo" engine/flink/components/table/src/test --type scala -C 5
# Also check the implementation of canBeConvertedTo for number types
ast-grep --pattern 'def canBeConvertedTo($target) = {
$$$
}'
Length of output: 178
Script:
#!/bin/bash
# Let's check the implementation of canBeConvertedTo in the typing system
rg "canBeConvertedTo" components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/ -A 5
# And check for related test cases
rg "canBeConvertedTo.*Long|canBeConvertedTo.*Integer" components-api/src/test --type scala -C 5
Length of output: 11851
utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/numeric.scala (1)
132-132
: LGTM! The change aligns with the PR objectives.
The switch from canBeSubclassOf
to canBeConvertedTo
improves type handling accuracy, particularly for cases where incorrect subtypes were being returned (e.g., Long for Integer).
Let's verify the type conversion behavior:
✅ Verification successful
Let me gather more information about the type conversion patterns to ensure consistency.
Based on the analysis of the shell script results, I can now generate the final response:
The change from canBeSubclassOf
to canBeConvertedTo
is correctly implemented and consistent with the codebase.
The verification shows that:
- The change aligns with the broader type system implementation where
canBeConvertedTo
is the standard method for type compatibility checks
- The usage in
ToNumberTypingFunction
follows the same pattern as other numeric type conversions in the codebase
- Test cases in
TypingResultSpec
confirm that canBeConvertedTo
correctly handles Number type hierarchies and conversions
- The remaining instances of
canBeSubclassOf
are either in test files or are marked with TODOs for future replacement
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for any remaining instances of canBeSubclassOf in type-related code
# and verify the consistency of the type conversion approach.
# Check for any remaining instances of canBeSubclassOf
rg "canBeSubclassOf" --type scala
# Look for related type conversion patterns
ast-grep --pattern 'canBeConvertedTo(Typed[Number])'
Length of output: 2715
Script:
#!/bin/bash
# Search for all usages of canBeConvertedTo to verify consistent type conversion approach
rg "canBeConvertedTo" --type scala -A 2
# Look for other instances of Number type conversions
rg "Typed\[Number\]" --type scala -A 2
Length of output: 68955
designer/server/src/main/scala/pl/touk/nussknacker/ui/process/test/ScenarioTestService.scala (2)
10-10
: LGTM: Import change aligns with the new type determination approach.
The replacement of CanBeSubclassDeterminer
with AssignabilityDeterminer
is consistent with the PR's objective to improve type handling.
137-137
: Verify the impact of relaxed type conversion on editor assignments.
The change from canBeSubclassOf
to canBeConvertedTo
aligns with the new assignability approach. However, this more flexible type conversion might result in more parameters qualifying for the StringParameterEditor
.
Let's verify the impact of this change:
✅ Verification successful
Based on the code analysis performed, I can now generate the final response:
Change from canBeSubclassOf
to canBeConvertedTo
is safe and aligned with type system improvements
The change from canBeSubclassOf
to canBeConvertedTo
in assignUserFriendlyEditor
is part of a broader improvement in the type system. The new canBeConvertedTo
check is more flexible but still maintains type safety:
- It preserves the original intention of allowing string-compatible types to use
StringParameterEditor
- The change is consistent with similar updates in other parts of the codebase (e.g., dict key handling)
- The method is specifically used for UI parameter editing, where string conversion is a common and safe operation
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for test cases or specifications that define the expected editor assignments
rg -A 5 "StringParameterEditor.*canBe(Subclass|Convert)"
# Look for related test files
fd -e scala -e java ".*Test.*" | rg -l "StringParameterEditor"
Length of output: 121
Script:
#!/bin/bash
# Let's try a broader search to understand the usage of StringParameterEditor
rg -A 5 "StringParameterEditor"
# Search for tests related to parameter editors and type conversion
rg -A 10 "assignUserFriendlyEditor.*typ"
# Look for type conversion tests
rg -A 5 "canBeConvertedTo.*String"
Length of output: 83295
Script:
#!/bin/bash
# Let's look for tests related to assignUserFriendlyEditor to understand its behavior
rg -A 10 "assignUserFriendlyEditor.*test"
# Search for any tests that verify the editor assignment behavior
rg -A 10 "ScenarioTestService.*test"
# Look for any changes or references to canBeConvertedTo vs canBeSubclassOf in tests
rg -B 2 -A 2 "canBeSubclassOf.*String"
Length of output: 1060
designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala (1)
79-92
: LGTM!
The test case properly validates the Long type dictionary response, which aligns with the PR's objective of fixing type handling.
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/compile/nodecompilation/FragmentParameterValidator.scala (1)
Line range hint 172-183
: LGTM! The change aligns with the PR objectives.
The switch from canBeSubclassOf
to canBeConvertedTo
correctly implements the new assignability determination logic, which should fix the issues with incorrect subtype handling (e.g., Long for Integer).
Let's verify the impact on existing dictionary type conversions:
✅ Verification successful
Based on the search results, I can see that:
- The test cases in
NodeDataValidatorSpec.scala
verify dictionary type validation, specifically checking for type mismatches (String vs Boolean). - The error handling and display logic in
PrettyValidationErrors.scala
properly formats type mismatch messages. - The
DictIsOfInvalidType
error case class is well-defined and used consistently.
Let's verify one more aspect - the actual type conversion implementations:
Based on the search results, I can now provide a final response:
The change from canBeSubclassOf
to canBeConvertedTo
is correctly implemented and well-tested.
The codebase shows:
- Extensive test coverage in
TypingResultSpec.scala
verifying type conversion behavior, including numeric types (e.g., Long to Integer) - Proper error handling in
DictIsOfInvalidType
for type mismatches - Consistent usage of
canBeConvertedTo
across the codebase for type validation - The change is documented in
MigrationGuide.md
as an intended API change
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Search for test cases and usages that might be affected by the type conversion change
# Look for test cases involving dictionary type validation
rg -A 5 "DictIsOfInvalidType|validateValueInputWithDictEditor"
# Look for dictionary type conversion scenarios in tests
ast-grep --pattern 'test($$$) {
$$$
dictValueType$$$
$$$
}'
Length of output: 7418
Script:
#!/bin/bash
# Search for canBeConvertedTo implementation and related type conversion logic
rg -A 5 "canBeConvertedTo"
# Look for type conversion test cases
rg -A 5 "canBeConvertedTo.*test"
Length of output: 77439
engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala (1)
77-78
: Verify test coverage for the new type conversion logic.
While the existing test cases cover various aggregator scenarios, consider adding specific test cases to verify the new type conversion behavior, especially for edge cases where:
- Types are assignable but not direct subclasses
- Complex nested types with
TypedObjectTypingResult
✅ Verification successful
Existing test coverage is comprehensive for type conversion scenarios
The test coverage for canBeConvertedTo
is already extensive and well-structured, covering:
- Basic type hierarchies (Int -> Long, String -> Object)
- Complex container types (List, Map, Array)
- Type unions and Unknown types
- Null handling
- Tagged types
- Value-based conversions
- Edge cases with generic type parameters
The existing test suite in TypingResultSpec.scala
thoroughly verifies the type conversion behavior, including the specific scenarios mentioned in the review comment:
- Types that are assignable but not direct subclasses (e.g., Int -> Number)
- Complex nested types with TypedObjectTypingResult (Map and nested field scenarios)
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check test coverage for type conversion scenarios
# Look for existing test cases covering type conversion
rg -l "canBeConvertedTo" --type scala --glob "*Spec.scala"
# Check for other test files that might need similar updates
rg -l "canBeSubclassOf" --type scala --glob "*Spec.scala"
Length of output: 427
Script:
#!/bin/bash
# Let's examine the existing test coverage in these files
# Check test cases in AggregatesSpec.scala
rg "canBeConvertedTo" -B 2 -A 2 engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala
# Check test cases in TypingResultSpec.scala
rg "canBeConvertedTo" -B 2 -A 2 components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala
# Look for existing type conversion test cases
ast-grep --pattern 'test($$$) { $$ canBeConvertedTo $$ }'
Length of output: 8494
utils/schemed-kafka-components-utils/src/main/scala/pl/touk/nussknacker/engine/schemedkafka/encode/AvroSchemaOutputValidator.scala (1)
155-159
: LGTM! The change aligns with the PR objectives.
The switch from canBeSubclassOf
to canBeConvertedTo
for map key validation improves type handling flexibility while maintaining Avro's requirement that map keys must be strings. This change should help resolve the issues with incorrect subtype handling mentioned in the PR objectives.
Let's verify the test coverage for this change:
utils/json-utils/src/main/scala/pl/touk/nussknacker/engine/json/encode/JsonSchemaOutputValidator.scala (2)
373-373
: LGTM: Type checking change aligns with PR objectives
The change from canBeSubclassOf
to canBeConvertedTo
aligns with the PR's goal of improving type handling, particularly for cases involving numeric types like Long and Integer.
Line range hint 368-376
: Verify the impact on Unknown type handling
The method comment mentions concerns about Unknown.canBeSubclassOf(Any)
. Let's verify if similar edge cases exist with the new canBeConvertedTo
approach.
✅ Verification successful
Let me gather more information about Unknown type handling by searching for its implementation.
Let me search for the actual implementation of canBeConvertedTo to verify its behavior with Unknown type.
Let me search for the actual implementation of canBeConvertedTo and its tests to understand how it handles Unknown type.
Let me search for the AssignabilityDeterminer implementation since it handles the actual conversion logic.
Based on the code analysis, I can now provide a final response about the Unknown type handling:
The new canBeConvertedTo
approach handles Unknown type safely and consistently
The AssignabilityDeterminer implementation shows that Unknown type is handled explicitly and safely in the following ways:
- case (_, Unknown) => ().validNel: any type can be converted to Unknown
- case (Unknown, _) => ().validNel: Unknown can be converted to any type
This matches the original concern about Unknown.canBeSubclassOf(Any)
and maintains the same behavior, but in a more structured way through the AssignabilityDeterminer. The implementation ensures that Unknown type remains flexible while being handled explicitly.
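The two short-circuit cases can be mirrored in a tiny model; the enum and the assignable method are invented for illustration, and the real determiner recurses into structured types instead of requiring equality:

```java
public class UnknownAssignability {

    enum Ty { INT, LONG, STRING, UNKNOWN }

    // Mirrors the two cases quoted above: anything is assignable to UNKNOWN,
    // and UNKNOWN is assignable to anything; otherwise this toy model simply
    // requires the types to match.
    static boolean assignable(Ty from, Ty to) {
        if (to == Ty.UNKNOWN || from == Ty.UNKNOWN) {
            return true;
        }
        return from == to;
    }

    public static void main(String[] args) {
        System.out.println(assignable(Ty.INT, Ty.UNKNOWN));    // true
        System.out.println(assignable(Ty.UNKNOWN, Ty.STRING)); // true
        System.out.println(assignable(Ty.INT, Ty.STRING));     // false
    }
}
```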
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for Unknown type handling across the codebase
rg -A 5 "Unknown.*canBeConvertedTo|canBeConvertedTo.*Unknown"
# Search for test cases involving Unknown type
rg -A 10 "test.*Unknown.*Type"
Length of output: 131163
Script:
#!/bin/bash
# Search for Unknown type definition and implementation
ast-grep --pattern 'class Unknown {
$$$
}'
# Search for Unknown type tests
rg -A 5 "Unknown.*canBeSubclassOf|canBeSubclassOf.*Unknown"
Length of output: 911
Script:
#!/bin/bash
# Search for canBeConvertedTo implementation
ast-grep --pattern 'def canBeConvertedTo($_, $_) {
$$$
}'
# Search for Unknown type definition and its methods
rg -A 10 "case object Unknown"
Length of output: 3965
Script:
#!/bin/bash
# Search for canBeConvertedTo implementation in typing.scala
rg -A 15 "def canBeConvertedTo" components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala
# Search for Unknown type handling tests
rg -A 10 "test.*Unknown.*canBeConvertedTo" components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/
Length of output: 735
Script:
#!/bin/bash
# Search for AssignabilityDeterminer implementation
rg -A 20 "object AssignabilityDeterminer" components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/
# Search for Unknown type handling in AssignabilityDeterminer
rg -A 10 "case.*Unknown.*=>" components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/
Length of output: 18109
engine/flink/components/base-unbounded/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/aggregates.scala (2)
Line range hint 193-197
: LGTM: Type checking improvement in CountWhenAggregator
The change from subclass checking to conversion checking allows for more flexible type handling while maintaining type safety. The error message clearly indicates the expected type.
Line range hint 242-246
: LGTM: Type checking improvement in AverageAggregator
The change to use canBeConvertedTo
is consistent with the new assignability approach. The promotion strategy for large floating numbers is correctly applied after type validation.
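The validate-then-promote flow these aggregator comments describe can be sketched roughly as below. The type names, the `Number` check, and the promotion rule are illustrative assumptions; the real code operates on `TypingResult` values and a pluggable promotion strategy:

```scala
// Rough sketch: first check that the input type is convertible to Number,
// then apply a promotion step for large floating-point types.
def promoteLargeFloating(t: String): String =
  if (t == "Float") "Double" else t

def computeOutputType(inputType: String): Either[String, String] = {
  val numberTypes = Set("Byte", "Short", "Int", "Long", "Float", "Double")
  if (numberTypes.contains(inputType)) Right(promoteLargeFloating(inputType))
  else Left(s"Invalid aggregate type: $inputType, should be: Number")
}
```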
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSuggester.scala (2)
414-416
: LGTM! Type checking enhancement aligns with PR objectives.
The change from canBeSubclassOf
to canBeConvertedTo
improves type compatibility checks for Collections and Maps, addressing the issue with incorrect subtype handling (e.g., Long type dictionaries for Integer). This more flexible approach allows for proper type assignability while maintaining type safety.
Line range hint 414-420
: Verify type inference in collection operations.
Since this method is used for type inference in collection operations (e.g., selection and projection), let's verify that the change doesn't affect existing scenarios.
✅ Verification successful
Let me gather more specific information about the type inference and collection operations.
Let me check one more aspect focusing on the SpelExpressionSuggester tests and type handling.
Let me try one final verification focusing specifically on the SpelExpressionSuggester tests.
Based on the collected evidence, I can now generate the final response:
Collection type inference changes appear safe to proceed
The changes to collection type inference maintain the existing behavior:
- The implementation correctly handles both Collection and Map type conversions
- Real-world usage in test files (TableFileSinkTest.scala and RequestResponseTestMainSpec.scala) shows collection operations working with both simple and complex projections
- The changes only affect type inference for collections while preserving the established pattern matching structure
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Search for collection operation usages to verify type inference remains correct
# Look for selection (?[...]) and projection (![...]) operations in tests
echo "Searching for collection operation test cases..."
rg -l "\.[\?!]\[" --type scala --glob "*Test*.scala"
echo "Searching for type conversion test cases..."
ast-grep --pattern 'class $_ extends $_ {
$$$
test("$_") {
$$$
canBeConvertedTo
$$$
}
$$$
}'
Length of output: 625
Script:
#!/bin/bash
# Check the test files that use collection operations
echo "=== TableFileSinkTest.scala ==="
rg -A 5 -B 5 "\.[\?!]\[" engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/table/sink/TableFileSinkTest.scala
echo -e "\n=== RequestResponseTestMainSpec.scala ==="
rg -A 5 -B 5 "\.[\?!]\[" engine/lite/request-response/runtime/src/test/scala/pl/touk/nussknacker/engine/requestresponse/test/RequestResponseTestMainSpec.scala
# Look for SpelExpressionSuggester tests
echo -e "\n=== SpelExpressionSuggester Tests ==="
fd "SpelExpressionSuggester.*Test.*scala" --type f --exec rg -l "typeInference|canBeConvertedTo"
# Check existing type inference tests
echo -e "\n=== Type Inference Tests Content ==="
fd "SpelExpressionSuggester.*Test.*scala" --type f --exec rg -A 10 "typeInference|canBeConvertedTo" {}
Length of output: 1898
Script:
#!/bin/bash
# Search for SpelExpressionSuggester test files
echo "=== Finding SpelExpressionSuggester test files ==="
fd "SpelExpressionSuggester.*Test.*scala"
# Look for collection type handling tests
echo -e "\n=== Collection Type Tests ==="
rg -A 10 "java.util.(Collection|Map)" --type scala --glob "*Test*.scala" scenario-compiler/
# Check for existing type inference implementation
echo -e "\n=== Type Inference Implementation ==="
rg -A 10 "def typeInference" scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSuggester.scala
Length of output: 3211
Script:
#!/bin/bash
# Search for SpelExpressionSuggester test file in scenario-compiler directory
echo "=== Searching for SpelExpressionSuggester test ==="
fd "SpelExpressionSuggester.*Test.*scala" scenario-compiler/
# If found, examine its contents for collection type handling
echo -e "\n=== Checking test contents ==="
fd "SpelExpressionSuggester.*Test.*scala" scenario-compiler/ --exec cat {}
# Look for collection type handling in the main implementation
echo -e "\n=== Main Implementation Collection Handling ==="
rg "canBeConvertedTo.*Collection|canBeConvertedTo.*Map" scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSuggester.scala
Length of output: 803
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala (8)
35-61
: LGTM! Basic type conversion tests are comprehensive.
The test cases cover essential scenarios for record type conversions, including:
- Subset compatibility
- Field type compatibility
- Nested record compatibility
- Map type compatibility
86-93
: LGTM! Unknown type conversion tests are thorough.
The test cases properly verify that Unknown type is compatible with any other type in both directions.
99-141
: LGTM! Class hierarchy tests cover generic type parameters well.
The tests thoroughly verify type conversion behavior for collections (List, Map) with generic type parameters, including:
- Covariant type parameter handling
- Contravariant type parameter handling
- Array type handling
306-314
: LGTM! Tagged value conversion tests are accurate.
The tests properly verify:
- Same tag compatibility
- Different tag incompatibility
- Tagged to untagged conversion
- Untagged to tagged conversion
318-321
: LGTM! Null type conversion tests are complete.
The tests cover all essential null type scenarios:
- Null to non-null type conversion
- Null to literal type conversion
- Null to null conversion
- Non-null to null conversion
340-345
: LGTM! Instance value conversion tests are thorough.
The tests properly verify conversion behavior for literal/instance values:
- Numeric type widening
- Type compatibility
- Value equality
- Cross-type conversion restrictions
349-353
: LGTM! Date/Time conversion tests cover parsing scenarios.
The tests verify:
- Valid date string to LocalDate conversion
- Valid datetime string to LocalDateTime conversion
- Invalid format handling
- Type mismatch handling
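The parsing-based checks above reduce to: a String literal is convertible to `LocalDate` only if it actually parses. A minimal standalone illustration (not the real `TypeConversionHandler`):

```scala
import java.time.LocalDate
import scala.util.Try

// A string literal converts to LocalDate only when LocalDate.parse accepts it.
def stringConvertsToLocalDate(value: String): Boolean =
  Try(LocalDate.parse(value)).isSuccess
```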
452-452
: LGTM! Property-based tests ensure type system consistency.
The tests use ScalaCheck to verify:
- Reflexive property of type conversion
- Type hierarchy consistency
- Union type behavior
These property-based tests are particularly valuable as they help catch edge cases that might be missed in hand-written tests.
Also applies to: 466-470, 486-491
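A hand-rolled stand-in for the ScalaCheck reflexivity property mentioned above; `canConvert` here is an illustrative toy model, not the real `TypingResult` conversion logic:

```scala
// Reflexivity: every sample type should be convertible to itself.
val sampleTypes = List("Int", "Long", "Double", "String", "Unknown")

def canConvert(from: String, to: String): Boolean =
  from == to || from == "Unknown" || to == "Unknown"

val reflexivityHolds = sampleTypes.forall(t => canConvert(t, t))
```

ScalaCheck generalizes this by generating arbitrary types instead of a fixed sample list.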
utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/collection.scala (1)
359-361
: LGTM! Type conversion enhancement for number handling
The change from canBeSubclassOf
to canBeConvertedTo
aligns with the PR objectives and improves type handling for numeric collections. This modification allows for more flexible type compatibility checks during collection merging.
Let's verify the type conversion behavior:
✅ Verification successful
Based on the context gathered from the shell scripts, I can now provide the final response:
Type conversion change for numeric collections is safe and well-integrated
The change to use canBeConvertedTo(numberType)
is correctly implemented within the type merging logic pattern matching. The context shows it's properly placed in the fallback cases after more specific type matches (like TypedObjectTypingResult), ensuring type safety while allowing for more flexible numeric type conversions.
The verification shows:
- The change is part of a well-structured type resolution hierarchy in
CollectionMergeTyping
- It's used as a fallback after more specific type matches
- The implementation maintains type safety by wrapping the result in
Typed.genericTypeClass
- The modification preserves the existing type resolution logic flow
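The fallback ordering described above can be sketched in isolation: prefer the exact element type when both sides agree, otherwise fall back to a common numeric supertype, and finally to Unknown. Names are stand-ins, not the real `CollectionMergeTyping` logic:

```scala
// Illustrative element-type resolution when merging two collections.
def mergedElementType(left: String, right: String): String = {
  val numeric = Set("Byte", "Short", "Int", "Long", "Float", "Double")
  if (left == right) left
  else if (numeric.contains(left) && numeric.contains(right)) "Number"
  else "Unknown"
}
```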
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for any existing usages of collection merging with number types
# to ensure the new type conversion logic doesn't break existing functionality
# Search for collection merge operations with number types
ast-grep --pattern 'concat($$$)'
# Search for potential test cases involving number type merging
rg -l "test.*concat.*number" -g "*.scala"
Length of output: 2506
Script:
#!/bin/bash
# Let's try a different approach to find relevant test cases and usages
# Search for test files related to collection typing
rg -l "CollectionMergeTyping" -g "*.scala"
# Look for test cases involving type conversion
rg -l "canBeConvertedTo.*number" -g "*.scala"
# Search for specific pattern in collection.scala to understand the context better
rg -A 5 "canBeConvertedTo\(numberType\)" utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/collection.scala
Length of output: 886
Script:
#!/bin/bash
# Let's examine the test coverage and implementation details
# Look for test cases in the same directory
rg -A 10 "class.*Test.*Collection" utils/default-helpers/src/test
# Check the full context of the type conversion logic
rg -B 10 -A 10 "canBeConvertedTo\(numberType\)" utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/collection.scala
# Look for any related type conversion tests
rg "test.*canBeConvertedTo" -g "*.scala"
Length of output: 1286
designer/client/src/http/HttpService.ts (1)
862-862
: LGTM! Verify backend compatibility.
The simplified structure of expectedType
aligns with the PR's objective of transitioning from SubclassDeterminer to AssignabilityDeterminer. The change removes unnecessary nesting, making the code cleaner.
Let's verify if the backend endpoint has been updated to handle this new structure:
✅ Verification successful
Let me gather more information about the type handling changes in the backend.
Based on the search results, I can see that:
- The backend API endpoint in
DictApiHttpService
expects and handles the new structure of `expectedType`
directly:
decoder.decodeJson(dictListRequestDto.expectedType) match {
case Right(expectedType) =>
// ... handles expectedType directly
definition.valueType(id).canBeStrictlyConvertedTo(expectedType)
- The test cases in
DictApiHttpServiceSpec
confirm the backend is already using the new flattened structure:
"expectedType" : {
"type" : "TypedClass",
"refClazzName" : "java.lang.Integer",
...
}
- The change aligns with the transition from
SubclassDeterminer
toAssignabilityDeterminer
as seen in the codebase.
Backend compatibility verified - change is safe
The backend API is already designed to handle the flattened expectedType
structure. The change in the frontend is consistent with the existing backend implementation.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check if the backend endpoint has been updated to handle the new expectedType structure
# Look for related backend changes in ProcessDefinitionData endpoints
# Search for relevant endpoint handlers
rg -t scala "processDefinitionData.*dicts"
# Search for type handling changes
ast-grep --pattern 'def $_(expectedType: $_, $$$)'
Length of output: 1794
Script:
#!/bin/bash
# Search for expectedType handling in Scala files
rg -t scala "expectedType" -B 2 -A 2
# Search for type validation or determination related files
rg -t scala "(SubclassDeterminer|AssignabilityDeterminer)" -B 2 -A 2
# Look for API request handling for dicts
rg -t scala "DictApiHttpService" -B 2 -A 2
Length of output: 73644
docs/MigrationGuide.md (2)
60-64
: LGTM! Clear documentation of typing API changes
The changes to the typing API are well documented, clearly explaining the migration from CanBeSubclassDeterminer.canBeSubclassOf
to AssignabilityDeterminer.isAssignableLoose
and TypingResult.canBeSubclassOf
to TypingResult.canBeConvertedTo
.
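A hypothetical compatibility shim shows what the rename means for call sites. The trait below is a stand-in for illustration (the real `TypingResult` lives in components-api, and the version string is invented):

```scala
// Old canBeSubclassOf call sites can delegate to the new canBeConvertedTo
// during migration.
trait TypingResultLike {
  def canBeConvertedTo(other: TypingResultLike): Boolean

  @deprecated("use canBeConvertedTo", "1.18")
  final def canBeSubclassOf(other: TypingResultLike): Boolean =
    canBeConvertedTo(other)
}

final case class NamedType(name: String) extends TypingResultLike {
  def canBeConvertedTo(other: TypingResultLike): Boolean = this == other
}
```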
Line range hint 1-82
: LGTM! Well-structured migration guide
The migration guide is well-organized and provides clear instructions for upgrading between versions. Each section properly documents breaking changes, API changes, configuration changes and other important modifications.
docs-internal/api/nu-designer-openapi.yaml (2)
Line range hint 1-5084
: Security and error handling implementation looks good!
The API specification demonstrates solid security and error handling practices:
- Proper authentication using HTTP Basic Auth
- Comprehensive error responses with clear messages and examples
- Appropriate use of HTTP status codes
- Consistent error handling patterns across endpoints
Line range hint 1-5084
: API design follows RESTful best practices!
The API specification demonstrates excellent RESTful design:
- Consistent resource naming and URL patterns
- Appropriate use of HTTP methods
- Well-structured request/response schemas
- Clear separation of concerns between endpoints
...onents-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala
...er/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/ClassDefinitionExtractor.scala
...ts-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala
...-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultErrorMessagesSpec.scala
...c/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala
...c/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala
...c/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala
designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala
designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (4)
scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala (4)
Line range hint
1118-1139
: Refactor Repeated Test Cases to Avoid Duplication. In the test `shouldn't allow invalid spel type conversions`, there is repetitive code for similar scenarios. Refactoring the test to iterate over a collection of inputs can enhance maintainability and readability. Apply this diff to refactor the test:
+ val invalidInputs = Seq(
+   ("'qwerty'", classOf[LocalDate], "LocalDate"),
+   ("''", classOf[LocalDate], "LocalDate"),
+   ("'qwerty'", classOf[Currency], "Currency"),
+   ("'qwerty'", classOf[UUID], "UUID"),
+   ("'qwerty'", classOf[Locale], "Locale"),
+   ("'qwerty'", classOf[Charset], "Charset")
+ )
+
+ invalidInputs.foreach { case (input, clazz, expectedType) =>
+   inside(parse(clazz, input)) {
+     case Invalid(NonEmptyList(error: ExpressionParseError, Nil)) =>
+       error.message shouldBe s"Bad expression type, expected: $expectedType, found: String(${input.stripPrefix("'").stripSuffix("'")})"
+   }
+ }
- inside(parse[LocalDate]("'qwerty'")) { case Invalid(NonEmptyList(error: ExpressionParseError, Nil)) =>
-   error.message shouldBe s"Bad expression type, expected: LocalDate, found: String(qwerty)"
- }
-
- inside(parse[LocalDate]("''")) { case Invalid(NonEmptyList(error: ExpressionParseError, Nil)) =>
-   error.message shouldBe s"Bad expression type, expected: LocalDate, found: String()"
- }
-
- inside(parse[Currency]("'qwerty'")) { case Invalid(NonEmptyList(error: ExpressionParseError, Nil)) =>
-   error.message shouldBe s"Bad expression type, expected: Currency, found: String(qwerty)"
- }
-
- inside(parse[UUID]("'qwerty'")) { case Invalid(NonEmptyList(error: ExpressionParseError, Nil)) =>
-   error.message shouldBe s"Bad expression type, expected: UUID, found: String(qwerty)"
- }
-
- inside(parse[Locale]("'qwerty'")) { case Invalid(NonEmptyList(error: ExpressionParseError, Nil)) =>
-   error.message shouldBe s"Bad expression type, expected: Locale, found: String(qwerty)"
- }
-
- inside(parse[Charset]("'qwerty'")) { case Invalid(NonEmptyList(error: ExpressionParseError, Nil)) =>
-   error.message shouldBe s"Bad expression type, expected: Charset, found: String(qwerty)"
- }
Line range hint
1444-1447
: Inline Single-Use Variable in Test. In the test `adding invalid type to number`, the variable `floatAddExpr` is declared but used only once. Inlining the expression can simplify the code. Apply this diff:
- val floatAddExpr = "12.1 + #obj"
- parse[Float](floatAddExpr, ctx) shouldBe Symbol("invalid")
+ parse[Float]("12.1 + #obj", ctx) shouldBe Symbol("invalid")
Line range hint
1577-1595
: Enhance Number Generators for Comprehensive Testing. In the test `should calculate values of operators`, consider expanding the number generators to include edge cases like `Int.MaxValue`, `Int.MinValue`, `0`, and negative numbers. This will improve test coverage and help catch potential overflow or underflow issues.
Line range hint
1811-1813
: Remove Redundant Evaluation Call. In the test `should allow use no param method property accessor on unknown`, the second call to `parsed.evaluateSync[Any](customCtx)` is redundant since it evaluates the same expression with the same context. Removing it will streamline the test. Apply this diff:
  val parsed = parse[Any]("#unknownInt.value.toLongOrNull", customCtx).validValue
  parsed.evaluateSync[Any](customCtx) shouldBe 11
- parsed.evaluateSync[Any](customCtx) shouldBe 11
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (1)
scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala
(1 hunks)
🧰 Additional context used
📓 Learnings (1)
scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala (1)
Learnt from: arkadius
PR: TouK/nussknacker#7174
File: scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala:84-84
Timestamp: 2024-11-19T22:01:38.151Z
Learning: In test classes such as `SpelExpressionSpec.scala` within the `scenario-compiler` module, binary compatibility issues are not a concern because they are used only internally. Therefore, changes to method signatures in these test classes are acceptable.
created: #7262 |
Very nice. Previous naming was causing a lot of confusion and now it shouldn't be a problem anymore and the strict check is going to be very useful.
db7265d
to
6925441
Compare
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (7)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala (5)
9-18
: Enhance documentation readability. The documentation is informative but could benefit from better formatting. Consider:
- Using bullet points consistently
- Adding examples for both loose and strict conversions
- Adding `@see` tags to reference related classes

 /**
- * This class determine whether we can assign one type to another type - that is if its the same class, a subclass or can be converted to another type. We provide two modes of conversion -
- * 1. Loose conversion is based on the fact that TypingResults are
- * sets of possible supertypes with some additional restrictions (like TypedObjectTypingResult). It is basically how SpEL
- * can convert things. Like CommonSupertypeFinder it's in the spirit of "Be type safe as much as possible, but also provide some helpful
- * conversion for types not in the same jvm class hierarchy like boxed Integer to boxed Long and so on".
- * 2. Strict conversion checks whether we can convert to a wider type. Eg only widening numerical types
- * are allowed ( Int -> Long). For other types it should work the same as a loose conversion.
+ * This class determines whether we can assign one type to another type - that is if it's the same class,
+ * a subclass, or can be converted to another type. We provide two modes of conversion:
+ *
+ * 1. Loose conversion:
+ *    - Based on TypingResults as sets of possible supertypes with additional restrictions
+ *    - Similar to SpEL conversion behavior
+ *    - Allows helpful conversions between types not in the same JVM hierarchy
+ *    - Example: boxed Integer to boxed Long
+ *
+ * 2. Strict conversion:
+ *    - Only allows widening type conversions
+ *    - Example: Int -> Long (allowed), Long -> Int (not allowed)
+ *    - For non-numeric types, behaves same as loose conversion
+ *
+ * @see CommonSupertypeFinder
+ * @see TypeConversionHandler
 */
35-50
: Consider refactoring pattern matching for better maintainability. The pattern matching logic could be more maintainable by:
- Extracting common validation logic into named functions
- Using pattern guards for complex conditions
- Adding exhaustiveness checking for all possible type combinations

 private def isAssignable(from: TypingResult, to: TypingResult, conversionChecker: ConversionChecker) = {
+  def handleUnknownTypes(f: TypingResult, t: TypingResult): Option[ValidatedNel[String, Unit]] = (f, t) match {
+    case (_, Unknown) | (Unknown, _) => Some(().validNel)
+    case _                           => None
+  }
+
   (from, to) match {
-    case (_, Unknown) => ().validNel
-    case (Unknown, _) => ().validNel
+    case (f, t) if handleUnknownTypes(f, t).isDefined => handleUnknownTypes(f, t).get
     case (TypedNull, other) => isNullAsignableTo(other)
     // ... rest of the cases
   }
 }
142-178
: Consider performance optimizations for type parameter matching. The type parameter matching logic performs multiple validations that could be cached or optimized:
- Type parameter validation results could be cached
- Redundant validations for invariant types could be reduced
Consider these optimizations:
- Add a cache for validated type parameter pairs
- Early exit for known invariant type parameters
- Batch validation for multiple type parameters
Example implementation:

private object TypeParameterValidator {
  private val validationCache =
    new java.util.concurrent.ConcurrentHashMap[(TypingResult, TypingResult), Boolean]()

  def validateTypeParameters(from: TypingResult, to: TypingResult): Boolean = {
    validationCache.computeIfAbsent(
      (from, to),
      _ => performValidation(from, to)
    )
  }
}
198-201
: Document and consider configurable union type safety. The current implementation makes a trade-off between type safety and usability. Consider:
- Making the safety level configurable
- Adding runtime validation options
- Improving documentation of the trade-offs

+ /**
+  * Configuration for union type safety level.
+  * - STRICT: Requires all union types to be assignable (safer but more restrictive)
+  * - FLEXIBLE: Allows partial assignability (current behavior)
+  * - RUNTIME_VALIDATED: Adds runtime validation for partial assignability
+  */
+ object UnionTypeSafetyLevel extends Enumeration {
+   val STRICT, FLEXIBLE, RUNTIME_VALIDATED = Value
+ }
216-224
: Consider moving conversion checks to typing phase. The TODO comment suggests moving conversion checks to the typing phase. This would:
- Improve performance by avoiding repeated checks
- Catch conversion issues earlier in the development cycle
- Provide better error messages
Consider creating a new `TypeCheckingPhase` that handles conversions during typing:

trait TypeCheckingPhase {
  def checkConversions(typingResult: TypingResult): ValidatedNel[String, TypingResult]
}

Would you like help designing and implementing the typing phase conversion checks?
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala (1)
30-41
: LGTM! Consider enhancing documentation with code examples. The implementation looks good with clear separation of concerns and appropriate use of `final` methods. The documentation clearly explains the difference between loose and strict conversion. Consider enhancing the documentation with:
- Code examples demonstrating loose vs strict conversion
- `@see` reference to AssignabilityDeterminer

 /**
  * Checks if there exists a conversion to a given typingResult, with possible loss of precision, e.g. long to int.
  * If you need to retain conversion precision, use canBeStrictlyConvertedTo
+ *
+ * @example {{{
+ * val longType = Typed[Long]
+ * val intType = Typed[Int]
+ * assert(longType.canBeConvertedTo(intType)) // true, but may lose precision
+ * }}}
+ * @see AssignabilityDeterminer.isAssignableLoose
  */
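The loose vs strict distinction discussed here can be shown with a standalone sketch. The widening order and function names below are illustrative assumptions, not the real `Typed`/`TypingResult` API:

```scala
// Strict conversion allows only widening numeric steps (Int -> Long);
// loose conversion also allows lossy ones (Long -> Int).
val wideningOrder = Vector("Byte", "Short", "Int", "Long", "Float", "Double")

def canBeStrictlyConverted(from: String, to: String): Boolean = {
  val i = wideningOrder.indexOf(from)
  val j = wideningOrder.indexOf(to)
  i >= 0 && j >= 0 && i <= j
}

def canBeLooselyConverted(from: String, to: String): Boolean =
  wideningOrder.contains(from) && wideningOrder.contains(to)
```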
398-409
: Consider adding type conversion validation for string concatenation. While the changes correctly implement assignability checking for both string concatenation and numeric addition, there's a potential edge case to consider.
Consider adding validation for string conversion to ensure all objects can be safely converted to strings. Here's a suggested approach:
- if left.canBeConvertedTo(Typed[String]) || right.canBeConvertedTo(Typed[String]) =>
+ if (left.canBeConvertedTo(Typed[String]) || right.canBeConvertedTo(Typed[String])) &&
+     left.canBeSafelyConverted && right.canBeSafelyConverted =>
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (39)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala
(1 hunks)components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/CanBeSubclassDeterminer.scala
(0 hunks)components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/NumberTypeUtils.scala
(1 hunks)components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/TypeConversionHandler.scala
(5 hunks)components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala
(2 hunks)components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala
(1 hunks)components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypedFromInstanceTest.scala
(2 hunks)components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultErrorMessagesSpec.scala
(4 hunks)components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala
(7 hunks)designer/client/src/http/HttpService.ts
(1 hunks)designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala
(1 hunks)designer/server/src/main/scala/pl/touk/nussknacker/ui/api/description/DictApiEndpoints.scala
(2 hunks)designer/server/src/main/scala/pl/touk/nussknacker/ui/process/test/ScenarioTestService.scala
(2 hunks)designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala
(2 hunks)designer/server/src/test/scala/pl/touk/nussknacker/ui/validation/UIProcessValidatorSpec.scala
(1 hunks)docs-internal/api/nu-designer-openapi.yaml
(1 hunks)docs/MigrationGuide.md
(2 hunks)engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala
(1 hunks)engine/flink/components/base-unbounded/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/aggregates.scala
(4 hunks)engine/flink/components/base/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformer.scala
(1 hunks)engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/sink/TableTypeOutputValidator.scala
(1 hunks)engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/utils/ToTableTypeSchemaBasedEncoder.scala
(3 hunks)engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/DocumentationFunctions.scala
(2 hunks)engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/ExampleFunctions.scala
(1 hunks)engine/lite/components/base/src/main/scala/pl/touk/nussknacker/engine/lite/components/ForEachTransformer.scala
(1 hunks)scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/compile/nodecompilation/FragmentParameterValidator.scala
(1 hunks)scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/ClassDefinitionExtractor.scala
(2 hunks)scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodDefinition.scala
(2 hunks)scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala
(2 hunks)scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/language/dictWithLabel/DictKeyWithLabelExpressionParser.scala
(2 hunks)scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSuggester.scala
(1 hunks)scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionValidator.scala
(1 hunks)scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/Typer.scala
(11 hunks)scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/typer/MethodReferenceTyper.scala
(1 hunks)scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala
(1 hunks)utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/collection.scala
(1 hunks)utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/numeric.scala
(1 hunks)utils/json-utils/src/main/scala/pl/touk/nussknacker/engine/json/encode/JsonSchemaOutputValidator.scala
(1 hunks)utils/schemed-kafka-components-utils/src/main/scala/pl/touk/nussknacker/engine/schemedkafka/encode/AvroSchemaOutputValidator.scala
(1 hunks)
💤 Files with no reviewable changes (1)
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/CanBeSubclassDeterminer.scala
🚧 Files skipped from review as they are similar to previous changes (30)
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/NumberTypeUtils.scala
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/TypeConversionHandler.scala
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypedFromInstanceTest.scala
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultErrorMessagesSpec.scala
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala
- designer/client/src/http/HttpService.ts
- designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala
- designer/server/src/main/scala/pl/touk/nussknacker/ui/api/description/DictApiEndpoints.scala
- designer/server/src/main/scala/pl/touk/nussknacker/ui/process/test/ScenarioTestService.scala
- designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala
- designer/server/src/test/scala/pl/touk/nussknacker/ui/validation/UIProcessValidatorSpec.scala
- engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala
- engine/flink/components/base-unbounded/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/aggregates.scala
- engine/flink/components/base/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformer.scala
- engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/sink/TableTypeOutputValidator.scala
- engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/utils/ToTableTypeSchemaBasedEncoder.scala
- engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/DocumentationFunctions.scala
- engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/ExampleFunctions.scala
- engine/lite/components/base/src/main/scala/pl/touk/nussknacker/engine/lite/components/ForEachTransformer.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/compile/nodecompilation/FragmentParameterValidator.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodDefinition.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/language/dictWithLabel/DictKeyWithLabelExpressionParser.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSuggester.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionValidator.scala
- utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/collection.scala
- utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/numeric.scala
- utils/json-utils/src/main/scala/pl/touk/nussknacker/engine/json/encode/JsonSchemaOutputValidator.scala
- utils/schemed-kafka-components-utils/src/main/scala/pl/touk/nussknacker/engine/schemedkafka/encode/AvroSchemaOutputValidator.scala
🧰 Additional context used
📓 Learnings (1)
scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala (1)
Learnt from: arkadius
PR: TouK/nussknacker#7174
File: scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala:84-84
Timestamp: 2024-11-19T22:01:38.151Z
Learning: In test classes such as `SpelExpressionSpec.scala` within the `scenario-compiler` module, binary compatibility issues are not a concern because they are used only internally. Therefore, changes to method signatures in these test classes are acceptable.
🪛 Markdownlint (0.35.0)
docs/MigrationGuide.md
82-82: Expected: 2; Actual: 4
Unordered list indentation
(MD007, ul-indent)
🔇 Additional comments (19)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/typer/MethodReferenceTyper.scala (1)
65-65: LGTM! Verify error handling paths.
The change from `canBeSubclassOf` to `canBeConvertedTo` aligns with the PR's objective of improving type compatibility checks. This should better handle cases where types like Long and Integer need to be compatible.
Let's verify the error handling paths are working as expected:
✅ Verification successful
Based on the evidence gathered from the codebase, I can now generate the final response:
LGTM! Change is well-aligned with type system improvements
The change from `canBeSubclassOf` to `canBeConvertedTo` is part of a broader refactoring from SubclassDeterminer to AssignabilityDeterminer, which is well-tested as shown by:
- Comprehensive test coverage in AssignabilityDeterminerSpec
- The change maintains the same semantic meaning while aligning with the new type system's flexibility
- The method is used in a validation context where type compatibility is checked, making `canBeConvertedTo` more appropriate as it handles both inheritance and type conversion cases
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for test coverage of error handling paths in MethodReferenceTyper
# Look for test cases covering the isClass check and method validation
rg -A 5 "validateMethodsNonEmpty.*test"
# Look for specific test cases handling Long/Integer compatibility
ast-grep --pattern 'test("$_") { $$ }'
Length of output: 82
Script:
#!/bin/bash
# Let's try a broader search for test files related to MethodReferenceTyper
fd "MethodReferenceTyper.*Test.*scala"
# Search for test cases in Scala files that might cover this functionality
rg -l "MethodReferenceTyper"
# Look for test cases involving Class type handling
rg -A 5 "Typed\[Class"
# Search for tests involving type conversion/compatibility
rg -A 5 "canBeConvertedTo|canBeSubclassOf" --type scala
Length of output: 87673
Script:
#!/bin/bash
# Let's check for test coverage specifically for MethodReferenceTyper
rg -A 10 "class MethodReferenceTyperSpec"
# Look for test cases involving Class type validation
rg -A 10 "validateMethodsNonEmpty.*test" scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/MethodReferenceTyperSpec.scala
# Check for test cases involving canBeConvertedTo with Class types
rg -A 5 "Typed\[Class\[_\]\].*test" scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/MethodReferenceTyperSpec.scala
Length of output: 1736
Script:
#!/bin/bash
# Let's look at the entire test file content to understand test coverage
cat scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/MethodReferenceTyperSpec.scala
# Also check for any other test files that might cover this functionality
rg -l "validateMethodsNonEmpty|isClass.*Class\[_\]" --type scala
# Look for any test cases involving AssignabilityDeterminer since this change is part of that refactor
rg -A 5 "AssignabilityDeterminer.*test" --type scala
Length of output: 8551
Script:
#!/bin/bash
# Let's check for tests related to AssignabilityDeterminer
rg -A 10 "AssignabilityDeterminer.*Spec" --type scala
# Look for tests involving type conversion/compatibility
rg -A 10 "canBeConvertedTo.*test" components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala
# Check for any tests involving Class type compatibility
rg -A 10 "Typed\[Class\[_\]\].*shouldBe" --type scala
Length of output: 1918
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala (3)
78-82: Fix field type mismatch error message
The error message for field type mismatch has the expected and actual types reversed.
- s"Field '$name' is of the wrong type. Expected: ${givenFieldType.display}, actual: ${typ.display}"
+ s"Field '$name' is of the wrong type. Expected: ${typ.display}, actual: ${givenFieldType.display}"
214-214: Fix assignability error message
The error message has the parameters in reverse order.
- condNel(ClassUtils.isAssignable(from, to, true), (), s"$to is not assignable from $from")
+ condNel(ClassUtils.isAssignable(from, to, true), (), s"$from is not assignable to $to")
53-54: Address TODO comment about null handling
The TODO comment indicates a potential issue with null handling in typed maps. This should be addressed to prevent type safety issues.
Let's verify the current null handling behavior:
Would you like help implementing proper null handling for typed maps?
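To make the expected/actual ordering concrete, here is a minimal field-by-field record check sketched in plain Java rather than the project's Scala. `RecordType` and `checkFields` are hypothetical stand-ins for Nussknacker's `TypedObjectTypingResult` and the determiner's record branch; only the message ordering mirrors the fix suggested above.

```java
import java.util.Map;
import java.util.Optional;

// Toy model of a typed-record assignability check: every field required by the
// target record must be present in the given record with an assignable type.
public class RecordAssignability {

    record RecordType(Map<String, Class<?>> fields) {}

    // Returns an error for the first mismatched field, or empty when assignable.
    // The target's field type is reported as "expected", the given value's type
    // as "actual" -- the corrected ordering from the review comment.
    static Optional<String> checkFields(RecordType given, RecordType target) {
        for (Map.Entry<String, Class<?>> e : target.fields().entrySet()) {
            Class<?> expected = e.getValue();
            Class<?> actual = given.fields().get(e.getKey());
            if (actual == null || !expected.isAssignableFrom(actual)) {
                return Optional.of("Field '" + e.getKey() + "' is of the wrong type. "
                    + "Expected: " + expected.getSimpleName()
                    + ", actual: " + (actual == null ? "missing" : actual.getSimpleName()));
            }
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        RecordType target = new RecordType(Map.of("age", Number.class));
        RecordType given = new RecordType(Map.of("age", String.class));
        System.out.println(checkFields(given, target).orElse("assignable"));
        // -> Field 'age' is of the wrong type. Expected: Number, actual: String
    }
}
```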
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/ClassDefinitionExtractor.scala (2)
144-144: LGTM: Type comparison update aligns with PR objectives
The change from `canBeSubclassOf` to `canBeConvertedTo` correctly implements the new assignability determination approach while maintaining the method's purpose of finding the most specific class.
276-276: Fix incorrect type comparison logic
The condition `returnedResultType.canBeConvertedTo(returnedResultType)` compares the type with itself, which will always return `true`. The comparison should be between `returnedResultType` and `reflectionBasedDefinition.result`.
Apply this diff to fix the condition:
-if (returnedResultType.canBeConvertedTo(returnedResultType)) {
+if (returnedResultType.canBeConvertedTo(reflectionBasedDefinition.result)) {
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala (1)
472-474: 🛠️ Refactor suggestion
Consider using strict conversion for pattern matching.
The current implementation uses `canBeConvertedTo`, which allows conversions with potential loss of precision. For pattern matching, this might lead to unexpected matches. Consider using `canBeStrictlyConvertedTo` to ensure type-safe pattern matching.
Let's verify if there are any existing pattern matches that might be affected:
- .filter(_.canBeConvertedTo(Typed.fromDetailedType[T]))
+ .filter(_.canBeStrictlyConvertedTo(Typed.fromDetailedType[T]))
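The strict/loose distinction behind this suggestion can be sketched in isolation. The following is a plain-Java toy model of the two predicates; the real `canBeConvertedTo` / `canBeStrictlyConvertedTo` operate on Nussknacker's `TypingResult` rather than raw `Class` objects, and the widening table here is an illustrative assumption.

```java
import java.util.List;

// Toy model of loose vs. strict numeric conversion checks.
public class ConversionModes {

    // Widening order of the common boxed numeric types.
    static final List<Class<?>> WIDENING =
        List.of(Byte.class, Short.class, Integer.class, Long.class, Float.class, Double.class);

    // Strict: only identity or a widening conversion (no precision loss).
    static boolean strictlyConvertible(Class<?> from, Class<?> to) {
        int fromIdx = WIDENING.indexOf(from);
        int toIdx = WIDENING.indexOf(to);
        return fromIdx >= 0 && toIdx >= 0 && fromIdx <= toIdx;
    }

    // Loose: any numeric type converts to any other, narrowing included.
    static boolean looselyConvertible(Class<?> from, Class<?> to) {
        return WIDENING.contains(from) && WIDENING.contains(to);
    }

    public static void main(String[] args) {
        // A filter using the loose check accepts Long where Integer is
        // expected -- the "unexpected match" the comment warns about.
        System.out.println(looselyConvertible(Long.class, Integer.class));  // true
        // The strict check rejects the narrowing conversion...
        System.out.println(strictlyConvertible(Long.class, Integer.class)); // false
        // ...but still allows widening.
        System.out.println(strictlyConvertible(Integer.class, Long.class)); // true
    }
}
```

The practical effect for the `.filter` above: with the loose predicate, a pattern expecting `Integer` would also match `Long`-typed values; the strict predicate keeps only matches that cannot lose precision.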
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/Typer.scala (8)
149-150: LGTM: Type checking for parts aligns with assignability approach
The change from `canBeSubclassOf` to `canBeConvertedTo` aligns with the PR's objective of moving from subclass checking to assignability checking.
Line range hint 201-207: LGTM: Enhanced indexer type checking with assignability
The indexer type checking now uses `canBeConvertedTo` for string type validation, which is more flexible and consistent with the new assignability approach.
Line range hint 359-368: LGTM: Improved numeric type checking in minus operator
The changes properly handle numeric type checking using assignability rather than subclass relationships. This is particularly important for cases where numeric types need to be converted (e.g., Integer to Long).
451-454: LGTM: Selection type checking uses assignability
The change to use `canBeConvertedTo` for boolean type checking in selections is appropriate and consistent with the overall changes.
523-526: LGTM: Collection type checking uses assignability
The changes correctly implement assignability checking for collection types, which is particularly important for handling various collection implementations.
Line range hint 578-585: LGTM: Arithmetic operation type checking enhanced
The implementation properly handles numeric type checking using assignability, which is crucial for correct numeric operations between different number types.
Line range hint 597-604: LGTM: Single operand arithmetic operation uses assignability
The changes correctly implement assignability checking for single operand arithmetic operations, maintaining consistency with the two-operand operations.
693-696: LGTM: Iterative type extraction uses assignability
The changes properly implement assignability checking for collection and map types, which is essential for correct type inference in iterations.
scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala (1)
2047-2047: LGTM! This test case effectively validates the new assignability determiner.
The test verifies that type compatibility is preserved during map-to-list-to-map conversion using the new `canBeConvertedTo` check. This is a crucial test case that ensures the new AssignabilityDeterminer correctly handles type relationships in complex conversions.
docs/MigrationGuide.md (2)
60-64: LGTM! Clear documentation of typing API changes
The documentation clearly explains the breaking change from `CanBeSubclassDeterminer.canBeSubclassOf` to `AssignabilityDeterminer.isAssignableLoose` and the related changes to `TypingResult` methods.
81-82: LGTM! Clear documentation of DictApiEndpoints changes
The documentation clearly states the change of the `DictListRequestDto` `expectedType` field from `TypingResultInJson` to `Json`.
🧰 Tools
🪛 Markdownlint (0.35.0)
82-82: Expected: 2; Actual: 4
Unordered list indentation
(MD007, ul-indent)
docs-internal/api/nu-designer-openapi.yaml (1)
5083-5083
:
Fix incomplete type definition in DictListRequestDto
The empty expectedType
property definition needs proper typing to work with the new assignability determiner. This is especially important since this PR changes the type handling from subclass determination to assignability determination.
As previously suggested, update the property definition to use proper type references:
- expectedType: {}
+ expectedType:
+ description: Expected type for dictionary validation
+ oneOf:
+ - $ref: '#/components/schemas/TypedClass'
+ - $ref: '#/components/schemas/TypedDict'
+ - $ref: '#/components/schemas/TypedObjectTypingResult'
+ - $ref: '#/components/schemas/TypedObjectWithValue'
+ - $ref: '#/components/schemas/TypedTaggedValue'
…yped/SubclassDeterminerSpec.scala Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
* simplify conversion check * changes to TypeConversionHandler * test fixes --------- Co-authored-by: Marcel Philipiak <[email protected]>
…. TypeConversionHandler logic branchout
…ApiHttpServiceSpec.scala Co-authored-by: Mateusz Słabek <[email protected]>
…yped/AssignabilityDeterminer.scala Co-authored-by: Mateusz Słabek <[email protected]>
6925441 to d4f7735 Compare
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (2)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala (1)
30-41: Consider documenting migration guidelines for the new type system.
The transition from subclass checking to conversion checking is a significant architectural change. While the implementation is solid, consider:
- Documenting migration guidelines for teams using the old `canBeSubclassOf`
- Adding examples of when to use strict vs. loose conversion
- Updating relevant documentation about pattern matching behavior
Also applies to: 472-474
docs/MigrationGuide.md (1)
86-87: Fix indentation for consistency.
The indentation should be 2 spaces to match the rest of the document's formatting style.
- * `DictListRequestDto` `expectedType`: TypingResultInJson -> Json
+ * `DictListRequestDto` `expectedType`: TypingResultInJson -> Json
🧰 Tools
🪛 Markdownlint (0.35.0)
87-87: Expected: 2; Actual: 4
Unordered list indentation(MD007, ul-indent)
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (39)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala (1 hunks)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/CanBeSubclassDeterminer.scala (0 hunks)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/NumberTypeUtils.scala (1 hunks)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/TypeConversionHandler.scala (5 hunks)
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala (2 hunks)
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala (1 hunks)
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypedFromInstanceTest.scala (2 hunks)
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultErrorMessagesSpec.scala (4 hunks)
components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala (7 hunks)
designer/client/src/http/HttpService.ts (1 hunks)
designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala (1 hunks)
designer/server/src/main/scala/pl/touk/nussknacker/ui/api/description/DictApiEndpoints.scala (2 hunks)
designer/server/src/main/scala/pl/touk/nussknacker/ui/process/test/ScenarioTestService.scala (2 hunks)
designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala (2 hunks)
designer/server/src/test/scala/pl/touk/nussknacker/ui/validation/UIProcessValidatorSpec.scala (1 hunks)
docs-internal/api/nu-designer-openapi.yaml (1 hunks)
docs/MigrationGuide.md (2 hunks)
engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala (1 hunks)
engine/flink/components/base-unbounded/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/aggregates.scala (4 hunks)
engine/flink/components/base/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformer.scala (1 hunks)
engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/sink/TableTypeOutputValidator.scala (1 hunks)
engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/utils/ToTableTypeSchemaBasedEncoder.scala (3 hunks)
engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/DocumentationFunctions.scala (2 hunks)
engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/ExampleFunctions.scala (1 hunks)
engine/lite/components/base/src/main/scala/pl/touk/nussknacker/engine/lite/components/ForEachTransformer.scala (1 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/compile/nodecompilation/FragmentParameterValidator.scala (1 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/ClassDefinitionExtractor.scala (2 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodDefinition.scala (2 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala (2 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/language/dictWithLabel/DictKeyWithLabelExpressionParser.scala (2 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSuggester.scala (1 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionValidator.scala (1 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/Typer.scala (11 hunks)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/typer/MethodReferenceTyper.scala (1 hunks)
scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala (1 hunks)
utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/collection.scala (1 hunks)
utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/numeric.scala (1 hunks)
utils/json-utils/src/main/scala/pl/touk/nussknacker/engine/json/encode/JsonSchemaOutputValidator.scala (1 hunks)
utils/schemed-kafka-components-utils/src/main/scala/pl/touk/nussknacker/engine/schemedkafka/encode/AvroSchemaOutputValidator.scala (1 hunks)
💤 Files with no reviewable changes (1)
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/CanBeSubclassDeterminer.scala
🚧 Files skipped from review as they are similar to previous changes (32)
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminer.scala
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/NumberTypeUtils.scala
- components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/TypeConversionHandler.scala
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/AssignabilityDeterminerSpec.scala
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypedFromInstanceTest.scala
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultErrorMessagesSpec.scala
- components-api/src/test/scala/pl/touk/nussknacker/engine/api/typed/TypingResultSpec.scala
- designer/client/src/http/HttpService.ts
- designer/server/src/main/scala/pl/touk/nussknacker/ui/api/DictApiHttpService.scala
- designer/server/src/main/scala/pl/touk/nussknacker/ui/api/description/DictApiEndpoints.scala
- designer/server/src/main/scala/pl/touk/nussknacker/ui/process/test/ScenarioTestService.scala
- designer/server/src/test/scala/pl/touk/nussknacker/ui/api/DictApiHttpServiceSpec.scala
- designer/server/src/test/scala/pl/touk/nussknacker/ui/validation/UIProcessValidatorSpec.scala
- engine/flink/components/base-tests/src/test/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/AggregatesSpec.scala
- engine/flink/components/base-unbounded/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/aggregate/aggregates.scala
- engine/flink/components/base/src/main/scala/pl/touk/nussknacker/engine/flink/util/transformer/ForEachTransformer.scala
- engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/sink/TableTypeOutputValidator.scala
- engine/flink/components/table/src/main/scala/pl/touk/nussknacker/engine/flink/table/utils/ToTableTypeSchemaBasedEncoder.scala
- engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/DocumentationFunctions.scala
- engine/flink/management/dev-model/src/main/scala/pl/touk/nussknacker/engine/management/sample/global/ExampleFunctions.scala
- engine/lite/components/base/src/main/scala/pl/touk/nussknacker/engine/lite/components/ForEachTransformer.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/compile/nodecompilation/FragmentParameterValidator.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodDefinition.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/language/dictWithLabel/DictKeyWithLabelExpressionParser.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSuggester.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/SpelExpressionValidator.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/Typer.scala
- scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/spel/typer/MethodReferenceTyper.scala
- utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/collection.scala
- utils/default-helpers/src/main/scala/pl/touk/nussknacker/engine/util/functions/numeric.scala
- utils/json-utils/src/main/scala/pl/touk/nussknacker/engine/json/encode/JsonSchemaOutputValidator.scala
- utils/schemed-kafka-components-utils/src/main/scala/pl/touk/nussknacker/engine/schemedkafka/encode/AvroSchemaOutputValidator.scala
🧰 Additional context used
📓 Learnings (1)
scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala (1)
Learnt from: arkadius
PR: TouK/nussknacker#7174
File: scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala:84-84
Timestamp: 2024-11-19T22:01:38.151Z
Learning: In test classes such as `SpelExpressionSpec.scala` within the `scenario-compiler` module, binary compatibility issues are not a concern because they are used only internally. Therefore, changes to method signatures in these test classes are acceptable.
🪛 Markdownlint (0.35.0)
docs/MigrationGuide.md
87-87: Expected: 2; Actual: 4
Unordered list indentation
(MD007, ul-indent)
🔇 Additional comments (11)
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/MethodTypeInfoSubclassChecker.scala (4)
15-15: LGTM! The conversion check is correctly implemented.
The change from `canBeSubclassOf` to `canBeConvertedTo` aligns with the PR's objective to move from subclass checking to assignability checking.
Note: A previous review comment already suggested updating the error messages in `NotSubclassVarArgument` to reflect the new assignability semantics.
41-41: LGTM! The conversion check is correctly implemented.
The change from `canBeSubclassOf` to `canBeConvertedTo` aligns with the PR's objective to move from subclass checking to assignability checking.
Note: A previous review comment already suggested updating the error messages in `NotSubclassArgument` to reflect the new assignability semantics.
49-49: LGTM! The conversion check is correctly implemented.
The change from `canBeSubclassOf` to `canBeConvertedTo` aligns with the PR's objective to move from subclass checking to assignability checking.
Note: A previous review comment already suggested updating the error messages in `NotSubclassResult` to reflect the new assignability semantics.
Line range hint 1-100: Consider renaming the file and class to reflect its new responsibility
The class name `MethodTypeInfoSubclassChecker` no longer accurately reflects its responsibility since it now performs assignability checking rather than subclass checking. Consider renaming to something like `MethodTypeInfoAssignabilityChecker` to maintain consistency with the new approach.
Let's verify the impact of this rename:
scenario-compiler/src/main/scala/pl/touk/nussknacker/engine/definition/clazz/ClassDefinitionExtractor.scala (2)
144-144: LGTM! Appropriate change for handling generic return types
The change from `canBeSubclassOf` to `canBeConvertedTo` aligns with the PR's objective of improving type compatibility checks. This modification allows for more flexible type assignability determination when dealing with generic return types, which is particularly important for handling cases like `ChronoLocalDateTime<D>` and `LocalDateTime`.
Line range hint 276-285: Fix incorrect type comparison and update error message
- The condition `returnedResultType.canBeConvertedTo(returnedResultType)` compares the type with itself, which will always return `true`. This needs to be fixed.
- The error message still refers to "subclass", which is inconsistent with the new assignability-based approach.
Apply this diff to fix both issues:
- if (returnedResultType.canBeConvertedTo(returnedResultType)) {
+ if (returnedResultType.canBeConvertedTo(reflectionBasedDefinition.result)) {
returnedResultType
} else {
logger.warn(
s"ClassExtractionSettings defined typing function for class: ${clazz.getName}, member: ${member.getName} " +
- s"which returned result type $returnedResultType that can't be a subclass of type ${reflectionBasedDefinition.result} " +
+ s"which returned result type $returnedResultType that can't be converted to type ${reflectionBasedDefinition.result} " +
s"computed using reflection. Will be used type computed using reflection"
)
reflectionBasedDefinition.result
components-api/src/main/scala/pl/touk/nussknacker/engine/api/typed/typing.scala (2)
30-41: LGTM! Well-documented type conversion methods.
The implementation clearly separates loose and strict conversion logic with appropriate documentation. The method names and return types are well-chosen.
Let's verify the consistent usage of these new methods across the codebase:
✅ Verification successful
Based on the search results, I can now generate the final response. The results show:
- The old `canBeSubclassOf` method has been replaced with `canBeConvertedTo` and `canBeStrictlyConvertedTo` across the codebase
- The migration guide (docs/MigrationGuide.md) documents this change
- The new methods are being used consistently in various components for type checking and conversion validation
- The test coverage (TypingResultSpec.scala) is comprehensive, covering various type conversion scenarios
- No inconsistencies or issues found in the usage patterns
LGTM! The new type conversion methods are consistently used across the codebase.
The migration from `canBeSubclassOf` to `canBeConvertedTo` and `canBeStrictlyConvertedTo` has been properly implemented. The remaining references to `canBeSubclassOf` in the codebase are either in documentation (migration guide) or in TODO comments discussing historical context.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Check for any remaining uses of old canBeSubclassOf method and verify consistent usage of new methods
# Check for any remaining canBeSubclassOf usage
echo "Checking for any remaining canBeSubclassOf usage..."
rg "canBeSubclassOf"
# Check usage patterns of new methods
echo "Analyzing usage patterns of new conversion methods..."
rg -A 2 "canBeConvertedTo|canBeStrictlyConvertedTo"
Length of output: 55602
472-474: Verify pattern matching behavior with the new type conversion.
The update to use `canBeConvertedTo` is correct, but since this affects pattern matching, we should ensure it maintains expected behavior across different scenarios.
Let's analyze pattern matching usage:
✅ Verification successful
Based on the search results, I can now provide a conclusive response about the pattern matching behavior with the new type conversion:
Pattern matching behavior is correctly maintained with `canBeConvertedTo`
The change from the previous method to `canBeConvertedTo` is safe because:
- `CastTypedValue` is used in pattern matching within `Typer.scala` for type-safe operations, specifically in the `withTwoChildrenOfType` method
- The implementation maintains the same type safety guarantees by checking type compatibility before creating the `TypingResultTypedValue`
- The `canBeConvertedTo` method is consistently used throughout the codebase for type compatibility checks, as evidenced by its usage in similar pattern matching scenarios
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Find pattern matching usage of CastTypedValue to verify behavior
# Search for pattern matching usage
echo "Analyzing CastTypedValue pattern matching usage..."
ast-grep --pattern 'match {
case $value: CastTypedValue[$_] => $$$
}'
# Search for direct unapply calls
echo "Checking direct unapply usage..."
rg "CastTypedValue.*unapply"
Length of output: 290
Script:
#!/bin/bash
# Let's try a different approach to find CastTypedValue usage
# Search for CastTypedValue class references and usages
echo "Searching for CastTypedValue references..."
rg "CastTypedValue" -A 3 -B 3
# Look for files containing type conversion patterns
echo "Searching for type conversion patterns..."
rg "canBeConvertedTo|asInstanceOf" --type scala -A 2 -B 2
# Find test cases that might verify this behavior
echo "Looking for related test cases..."
fd "Test.*scala" --exec rg -l "CastTypedValue|canBeConvertedTo" {}
Length of output: 101425
scenario-compiler/src/test/scala/pl/touk/nussknacker/engine/spel/SpelExpressionSpec.scala (1)
2047-2047: LGTM! Good improvement to type compatibility checking
The change from `canBeSubclassOf` to `canBeConvertedTo` aligns well with the new assignability determination approach. This makes the type compatibility check more flexible and accurate, especially when dealing with collection type conversions.
docs/MigrationGuide.md (1)
65-69: LGTM! Clear documentation of typing API changes.
The documentation clearly explains the changes to typing-related APIs, which is crucial for users migrating between versions.
docs-internal/api/nu-designer-openapi.yaml (1)
5083-5083: Fix incomplete schema definition for expectedType
The `expectedType` property in `DictListRequestDto` is still defined as an empty object (`{}`). This is problematic as it:
- Doesn't properly define the type structure needed for dictionary type validation
- Could cause runtime issues in API implementations
- Contradicts the PR's focus on improving type handling
This issue was previously identified. The property should be properly defined with type information as suggested in the past review.
Refactored SubclassDeterminer to AssignabilityDeterminer and forked logic into "strict" and "loose" conversion logic. --------- Co-authored-by: Marcel Philipiak <[email protected]> Co-authored-by: Mateusz Słabek <[email protected]>
Describe your changes
DictApiHttpService sometimes returned improper matches, such as Long-typed dicts for an expected Integer type, while failing to return valid ones, such as Number-typed dicts for an expected Long type.
Fixed by refactoring SubclassDeterminer into AssignabilityDeterminer and forking the logic into "strict" and "loose" conversion modes.
Copied from #7115
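The described mismatch can be sketched in plain Java. The dict names, the widening table, and `dictsFor` below are illustrative assumptions, not the service's actual API; the point is that a strict assignability check stops a Long-valued dict from matching an expected Integer, while widening and supertype matches (Integer where Long or Number is expected) still succeed.

```java
import java.util.List;
import java.util.Map;

// Toy model of dict filtering by expected type, using strict assignability.
public class DictFiltering {

    static final List<Class<?>> WIDENING =
        List.of(Integer.class, Long.class, Double.class, Number.class);

    // Strict assignability: identity, assignment to a supertype, or widening.
    static boolean strictlyAssignable(Class<?> from, Class<?> to) {
        if (to.isAssignableFrom(from)) return true;
        int f = WIDENING.indexOf(from), t = WIDENING.indexOf(to);
        return f >= 0 && t >= 0 && f <= t;
    }

    // Returns the names of dicts whose value type fits the expected type.
    static List<String> dictsFor(Class<?> expectedType, Map<String, Class<?>> dicts) {
        return dicts.entrySet().stream()
            .filter(e -> strictlyAssignable(e.getValue(), expectedType))
            .map(Map.Entry::getKey)
            .sorted()
            .toList();
    }

    public static void main(String[] args) {
        Map<String, Class<?>> dicts =
            Map.of("intDict", Integer.class, "longDict", Long.class);
        System.out.println(dictsFor(Integer.class, dicts)); // [intDict] -- no Long dict
        System.out.println(dictsFor(Long.class, dicts));    // [intDict, longDict]
        System.out.println(dictsFor(Number.class, dicts));  // [intDict, longDict]
    }
}
```

With a loose check both directions would match, which is exactly the "Long dict offered for an Integer slot" symptom this PR fixes.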
Checklist before merge
Summary by CodeRabbit
Release Notes
New Features
- `AssignabilityDeterminer` for enhanced type assignability checks with loose and strict conversion modes.
Improvements
Bug Fixes
Documentation