
Upgrade to AWS SDK 2.0 #155

Closed
capmorganbih opened this issue Sep 6, 2019 · 33 comments

Comments

@capmorganbih

No description provided.

@capmorganbih capmorganbih changed the title Question: is moving to AWS SDK 2.x is planned ? Question: Is moving to AWS SDK 2.0 planned ? Sep 6, 2019
@artembilan
Member

It is, once all the dependencies we use are based on SDK v2.

@dmytrodanilenkov

@artembilan are there any blockers to implementing this migration to AWS SDK v2?

@artembilan
Member

Yes, this one is the biggest one: awspring/spring-cloud-aws#20

@Ravibs139

Schema Registry integration is available in KPL v0.14.2 or later and KCL v2.3 or later. Please plan to include the schema registry as part of the next major upgrade. https://docs.aws.amazon.com/glue/latest/dg/schema-registry-integrations.html

@maciejwalkowiak

maciejwalkowiak commented May 6, 2022

Just an update from the Spring Cloud AWS side: we are upgrading to AWS SDK v2. In the upcoming weeks we are releasing 3.0 M1 with SNS, SES, and S3 integrations (already merged to main), and later this year another milestone with DynamoDB and SQS. Once that's done we can prepare a PR for spring-cloud-integration-aws.

As far as I can see there are more blockers for migration though: awslabs/amazon-dynamodb-lock-client#48

@artembilan
Member

Sounds good, @maciejwalkowiak !

I think I will start looking into this next month; perhaps in the end we can align the migration to AWS SDK v2 with the upcoming major release of Spring Cloud 2022.0.0.

@jaychapani

Schema Registry integration is available in KPL v0.14.2 or later and with KCL v2.3 or later. Please plan to include schema registry as part of next major upgrade. https://docs.aws.amazon.com/glue/latest/dg/schema-registry-integrations.html

We are also in need of this feature. Is there any ETA for this implementation? @Ravibs139 - Did you find any workaround for this?

@artembilan
Member

I have just recently added Glue Schema support into KplMessageHandler: #206.

We will look into KCL v2.x in the near future.

@Ravibs139

Schema Registry integration is available in KPL v0.14.2 or later and with KCL v2.3 or later. Please plan to include schema registry as part of next major upgrade. https://docs.aws.amazon.com/glue/latest/dg/schema-registry-integrations.html

We are also in the need of this feature. Any ETA for this implementation? @Ravibs139 - Did you find any workaround to this?

@jaychapani I used a custom Spring MessageConverter for the Glue schema registry, and it made serialization/deserialization work with Glue. You need to implement your own converter for the Glue serde and register it as a Spring bean.

@jaychapani

@Ravibs139 - I would appreciate any example or pointer for the implementation.

@Ravibs139

Ravibs139 commented Jun 7, 2022

@jaychapani you can use this sample code and tailor it to your needs.

    @Bean
    public MessageConverter userMessageConverter() {
        return new GlueMessageConverter();
    }

    @Bean
    public GlueSchemaRegistryConfiguration schemaRegistryConfiguration() {
        // 'regionName' and 'registry' come from your own configuration
        GlueSchemaRegistryConfiguration configs = new GlueSchemaRegistryConfiguration(regionName);
        configs.setRegistryName(registry);
        // Optional setting to enable schema auto-registration
        configs.setSchemaAutoRegistrationEnabled(true);
        return configs;
    }

import com.amazonaws.services.schemaregistry.common.configs.GlueSchemaRegistryConfiguration;
import com.amazonaws.services.schemaregistry.deserializers.GlueSchemaRegistryDeserializerImpl;
import com.amazonaws.services.schemaregistry.serializers.GlueSchemaRegistrySerializerImpl;
import com.amazonaws.services.schemaregistry.utils.AVROUtils;
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.services.glue.model.DataFormat;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.stream.schema.avro.AvroSchemaMessageConverter;
import org.springframework.lang.Nullable;
import org.springframework.messaging.Message;
import org.springframework.messaging.MessageHeaders;
import org.springframework.messaging.converter.MessageConversionException;
import org.springframework.util.MimeType;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Collections;

public class GlueMessageConverter extends AvroSchemaMessageConverter {

    private static final String STREAM_NAME = "stream";

    @Autowired
    private GlueSchemaRegistryConfiguration glueSchemaRegistryConfig;

    public GlueMessageConverter() {
        super(Collections.singletonList(new MimeType("application", "avro")));
    }

    @Override
    protected Object convertFromInternal(Message<?> message, Class<?> targetClass, @Nullable Object conversionHint) {
        byte[] payload = (byte[]) message.getPayload();

        // The deserializer strips the schema registry header from the payload
        GlueSchemaRegistryDeserializerImpl glueSchemaRegistryDeserializer =
                new GlueSchemaRegistryDeserializerImpl(DefaultCredentialsProvider.builder().build(), glueSchemaRegistryConfig);

        byte[] record = glueSchemaRegistryDeserializer.getData(payload);

        com.amazonaws.services.schemaregistry.common.Schema awsSchema = glueSchemaRegistryDeserializer.getSchema(payload);
        Schema avroSchema = new Schema.Parser().parse(awsSchema.getSchemaDefinition());

        // Deserialize the Avro record using the schema resolved from the registry
        try {
            return avroSchemaServiceManager().readData(targetClass, record, avroSchema, avroSchema);
        }
        catch (IOException e) {
            throw new MessageConversionException(message, "Failed to read payload", e);
        }
    }

    @Override
    protected Object convertToInternal(Object payload, @Nullable MessageHeaders headers, @Nullable Object conversionHint) {
        byte[] recordAsBytes = convertRecordToBytes(payload);

        String schemaDefinition = AVROUtils.getInstance().getSchemaDefinition(payload);
        String schemaName = AVROUtils.getInstance().getSchema(payload).getFullName();

        // The serializer prepends the schema registry header to the record
        com.amazonaws.services.schemaregistry.common.Schema awsSchema =
                new com.amazonaws.services.schemaregistry.common.Schema(schemaDefinition, DataFormat.AVRO.name(), schemaName);
        GlueSchemaRegistrySerializerImpl glueSchemaRegistrySerializer =
                new GlueSchemaRegistrySerializerImpl(DefaultCredentialsProvider.builder().build(), glueSchemaRegistryConfig);
        return glueSchemaRegistrySerializer.encode(STREAM_NAME, awsSchema, recordAsBytes);
    }

    private byte[] convertRecordToBytes(final Object record) {
        ByteArrayOutputStream recordAsBytes = new ByteArrayOutputStream();
        try {
            Encoder encoder = EncoderFactory.get().directBinaryEncoder(recordAsBytes, null);
            GenericDatumWriter<Object> datumWriter = new GenericDatumWriter<>(AVROUtils.getInstance().getSchema(record));
            datumWriter.write(record, encoder);
            encoder.flush();
        }
        catch (IOException e) {
            throw new MessageConversionException("Failed to write payload", e);
        }
        return recordAsBytes.toByteArray();
    }

    @Override
    protected boolean supports(Class<?> aClass) {
        return true;
    }
}

@paparaju

paparaju commented Sep 8, 2022

@artembilan Is there any ETA on the AWS SDK 2.0 migration, or is this Spring Integration AWS work not a priority? We adopted the AWS Kinesis stream binder and it was helpful, but it is on the AWS 1.x SDK. We want to upgrade to AWS SDK 2.0, but this needs to be resolved first. We would consider other options if you are not planning to implement this in the near future, or we may consider contributing to the AWS SDK 2.0 upgrade. Please let me know.

@artembilan
Member

We are really short-handed at the moment and contributions are welcome! Otherwise I'll try to look into this again next month.

@maciejwalkowiak

maciejwalkowiak commented Sep 9, 2022

Spring Cloud AWS 3.0 M2, built on top of AWS SDK v2, is already released. Since we've changed a lot of things, it is possible that some missing functionality will be discovered during migration; we can likely add it then.

I've started to migrate Spring Integration AWS, but at this stage it's not easy even to get it to a state where it compiles (https://github.com/maciejwalkowiak/spring-integration-aws/tree/aws-sdk-v2).

From what I can see, there are other dependencies that have not yet been migrated to AWS SDK v2:

Unfortunately there seems to be no clear timeline when/if the migration will happen.

@artembilan do you consider releasing a version that uses both AWS SDK v1 and v2, or do you prefer to wait for all the dependencies to be migrated?

@artembilan
Member

Thank you, @maciejwalkowiak , for looking into this!

That's what I wanted to ask you: can we migrate services one by one?
For example, move everything you have already migrated in SC AWS to SDK v2, but keep v1 for those which have not migrated yet?
I don't think there is a problem for the AWS protocol itself: it is just a Java API concern.

To be short: can we have both SDKs as dependencies?
For example, we migrate SQS, SNS, and S3 to v2, but keep the DynamoDbLockRegistry on v1, since the mentioned lock-client has not migrated yet.
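For illustration, nothing in the build tooling prevents mixing the two: both SDKs use distinct group IDs and packages, so they can coexist on the classpath. A minimal hypothetical Maven fragment (artifact versions here are placeholders, not a recommendation):

```xml
<!-- AWS SDK v2 modules for the already-migrated integrations (versions illustrative) -->
<dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>sqs</artifactId>
    <version>2.17.290</version>
</dependency>
<dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>s3</artifactId>
    <version>2.17.290</version>
</dependency>
<!-- AWS SDK v1 kept only for the not-yet-migrated DynamoDB lock client -->
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-dynamodb</artifactId>
    <version>1.12.300</version>
</dependency>
```

The v1 classes live under com.amazonaws and the v2 classes under software.amazon.awssdk, so there are no class conflicts, only the extra dependency weight.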

@maciejwalkowiak

Yes, technically it is possible: both AWS SDK v1 and v2 can be on the classpath. This would require a bit of duplication in spring-integration-aws itself, and I'm not sure there's a chance to get it working with Spring Native, as only AWS SDK v2 is compatible with GraalVM native images AFAIK.

I'll see how far I can take my branch, but I only work on it in the meantime, so please don't rely on me for this upgrade.

@dmytrodanilenkov

dmytrodanilenkov commented Sep 9, 2022

I see in pom.xml on the master branch that the DynamoDB Lock Client is built on AWS SDK version 2.3.5. Sounds like AWS SDK 2.0. Am I wrong? What is the problem with releasing it?

@artembilan
Member

Looks like it. Thank you!
But asking here for a release of that library is the wrong place.
It is better to relay your question in the mentioned issue.

@saurabhygk

We are really short-handed at the moment and contribution is welcome ! Otherwise I’ll try to look into this again next month.

Hi @artembilan, is there anything I can contribute toward releasing AWS SDK 2 support with Spring Cloud? I am happy to help with anything.

@artembilan
Member

Hi @saurabhygk !

Not sure what scope you are talking about, but for Spring Cloud AWS it is better to ask @maciejwalkowiak since he is leading respective project: https://awspring.io/.

From the Spring Integration extension perspective, let's also ask @maciejwalkowiak to share his branch with us, so you may take it over for the remaining tasks.

Either way keep in mind those two blockers for us: #155 (comment).

@dmytrodanilenkov ,

any news about releasing amazon-dynamodb-lock-client with AWS SDK v2?

Thank you all!

@paparaju

paparaju commented Nov 3, 2022

We are on spring-cloud-stream-binder-kinesis version 2.2.0 (latest), and it is on KCL 1.x. We want to migrate to KCL 2.x, but it looks like the AWS SDK 2.0 migration is blocking this. Are there any alternatives? Otherwise we are missing out on the KCL 2.x features, so we want to consider other options. Can someone please help with my question above?

@artembilan
Member

This one is a huge blocker: awslabs/amazon-kinesis-producer#296, which is another part of the Spring Cloud Stream binding feature.

It would be a tremendous burden for us to support both SDKs on our side.

@chrylis
Contributor

chrylis commented Dec 1, 2022

I see that AWSpring is approaching a 3.0 release (it looks like 3.0.0.M3 is out, and M4 is in progress). Has work begun on a Spring Integration AWS 3, or is that being held until an AWSpring GA with a final API is published?

@artembilan
Member

@chrylis , see my comment: #155 (comment).

I don't know what else I can add: we are just blocked here and too short-handed to consider supporting both SDK versions at the same time.
Therefore Spring Integration AWS 3.0 is deferred until all the upstream blockers are solved.

Does it make sense to you?

@chrylis
Contributor

chrylis commented Dec 1, 2022

@artembilan I understand; I was checking to confirm that the situation is the same as earlier, and whether a 3.x existed somewhere or was blocked until AWSpring 3 is final. Thanks for the update.

@maciejwalkowiak

@artembilan DynamoDB Lock Client based on AWS SDK v2 got released: awslabs/amazon-dynamodb-lock-client#48 (comment)

@artembilan
Member

Thank you, @maciejwalkowiak !

Saw that, but you know we decided to go ahead with our own DynamoDB lock implementation: https://github.com/spring-projects/spring-integration-aws/blob/main/src/main/java/org/springframework/integration/aws/lock/DynamoDbLockRepository.java.

That AWS library does not do proper TTL and does not honor the Lock contract.
So, starting with our version 3.0.0, we are going to drop the amazon-dynamodb-lock-client dependency and go ahead with our own, more stable solution.

From now on only KPL is our bottleneck, although that one might deal with its own internal protocol (UserRecord), into which we can simply remap from the Kinesis v2 API.

So, if you have a PR about migrating this project to SC-AWS, as we chatted before, I'd be glad to review and merge it shortly 😄

@maciejwalkowiak

I didn't manage to get that far yet. Let me try again; I will post an update as soon as I have something ready.

@artembilan
Member

Heads up!

I have just pushed a migration to AWS SDK v2.
There are still some failing tests (they are @Disabled at the moment), but I wanted to share whatever compiles now, fully migrated per this long-standing request.

So, it turned out we use our own DynamoDB lock implementation now; therefore there is no need to wait for the respective library from AWS Labs.

The KPL stays independent and does not impact our usage of AWS SDK v2; we simply use its high-level API.
The KinesisProducerConfiguration can perhaps simply be adapted to its com.amazonaws.auth.AWSCredentialsProvider expectations from our auto-configured software.amazon.awssdk.auth.credentials.AwsCredentialsProvider.
What is strange, though, is that they already use AWS SDK v2 for GlueSchemaRegistry 🤷
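To sketch that adaptation idea: the v2 credentials provider can be wrapped behind the v1 provider contract that the KPL expects. The interfaces below are simplified local stand-ins (hypothetical, only so the sketch is self-contained); the real types live in software.amazon.awssdk.auth.credentials and com.amazonaws.auth and have the same method shapes.

```java
// Simplified stand-ins mirroring the real SDK interfaces (hypothetical, for illustration):
interface AwsCredentialsV2 { String accessKeyId(); String secretAccessKey(); }            // mirrors v2 AwsCredentials
interface AwsCredentialsProviderV2 { AwsCredentialsV2 resolveCredentials(); }             // mirrors v2 AwsCredentialsProvider
interface AWSCredentialsV1 { String getAWSAccessKeyId(); String getAWSSecretKey(); }      // mirrors v1 AWSCredentials
interface AWSCredentialsProviderV1 { AWSCredentialsV1 getCredentials(); void refresh(); } // mirrors v1 AWSCredentialsProvider

// Adapter: exposes a v2 provider through the v1 contract the KPL expects
class V2ToV1CredentialsAdapter implements AWSCredentialsProviderV1 {

    private final AwsCredentialsProviderV2 delegate;

    V2ToV1CredentialsAdapter(AwsCredentialsProviderV2 delegate) {
        this.delegate = delegate;
    }

    @Override
    public AWSCredentialsV1 getCredentials() {
        AwsCredentialsV2 credentials = this.delegate.resolveCredentials();
        return new AWSCredentialsV1() {
            public String getAWSAccessKeyId() { return credentials.accessKeyId(); }
            public String getAWSSecretKey() { return credentials.secretAccessKey(); }
        };
    }

    @Override
    public void refresh() {
        // no-op: v2 providers resolve fresh credentials on each resolveCredentials() call
    }
}

public class Main {
    public static void main(String[] args) {
        AwsCredentialsProviderV2 v2 = () -> new AwsCredentialsV2() {
            public String accessKeyId() { return "AKIA_TEST"; }
            public String secretAccessKey() { return "secret"; }
        };
        AWSCredentialsProviderV1 v1 = new V2ToV1CredentialsAdapter(v2);
        System.out.println(v1.getCredentials().getAWSAccessKeyId()); // prints AKIA_TEST
    }
}
```

With the real types, the same wrapper could be passed to KinesisProducerConfiguration.setCredentialsProvider(...), keeping the rest of the application on SDK v2 only.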

The DynamoDB Streams Kinesis Adapter is out of use already, since we can now easily assign a Kinesis Stream to a DynamoDB table with the respective configuration or API.

The KCL channel adapter now also supports Glue Schema deserialization.

I would like to get some feedback, and possibly some contributions to the tests, to make good progress with this project.

Thank you all for your support and patience! 😄

@maciejwalkowiak

Great news @artembilan! I am sorry I wasn't more helpful with this issue.

@artembilan
Member

artembilan commented Mar 16, 2023

OK. So, all the tests are now working after some cycles of bug fixes and migrating to Testcontainers.
Only one remains disabled: the KPL-based one. It looks like their native daemon still tries to connect to EC2 and doesn't see the credentials we take from our LocalStack container in Docker 😢.
Plus I faced this bug in the AWS SDK: aws/aws-sdk-java-v2#3839.

Therefore, if all is good with the Kinesis Binder migration, I'm going to release 3.0.0-M2 next week.

@javaboy79

Is there any update on a 3.0.0-M1 release? Did it/will it happen?

@artembilan
Member

The 3.0.0-M1 happened a long time ago, last fall.
It is not relevant any more. Here is a blog post about the new 3.0.0-M2: https://spring.io/blog/2023/03/27/spring-integration-for-aws-3-0-0-m2-and-spring-cloud-stream-kinesis-binder-4

Griffin1989106 added a commit to Griffin1989106/SpringWithAWS that referenced this issue Jul 22, 2024
Fixes spring-projects/spring-integration-aws#155

* Upgrade to the latest deps including Gradle
* Remove XML configuration support
* Make use of SC-AWS 3.0 SQS and SNS support in respective channel adapters