diff --git a/docs/qa/test_run_12_6_2022___functional_testing_mongo_db.md b/docs/qa/test_run_12_6_2022___functional_testing_mongo_db.md index 10aa24c..08038ef 100644 --- a/docs/qa/test_run_12_6_2022___functional_testing_mongo_db.md +++ b/docs/qa/test_run_12_6_2022___functional_testing_mongo_db.md @@ -4,285 +4,285 @@ **commit** 962b6c4345feaf9fa541e46e76c0772f1b7e422c -| ID | Title | Status | Comment | -| -------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| T3211709 | \----------------Required config------------------ | Passed | | -| T3211687 | The user can't create the MongoDB source connector without the ""type"" key -> the system returns an error | Passed | | -| T3211692 | The user can't create the MongoDB source connector with an empty (or an invalid format) in the ""type"" key -> the system returns an error | Passed | | -| T3211690 | Check that the user can create the MongoDB source connector only with ""type"": ""TYPE_SOURCE"" | Passed | | -| T3211688 | The user can't create the MongoDB source connector without the ""plugin"" key -> the system returns an error | Passed | | -| T3211691 | The user can't create the MongoDB source connector with an empty (or an invalid format) in the ""plugin"" key -> the system returns an error | Passed | | -| T3211693 | The user can create the MongoDB source connector with a valid plugin format | Passed | | -| T3211689 | The user can't create the MongoDB source connector without the ""pipelineId"" key -> 
the system returns an error | Passed | | -| T3211694 | The user can't create the MongoDB source connector with an empty (or an invalid ID) in the ""pipelineId"" key -> the system returns an error | Passed | | -| T3211695 | The user can create the MongoDB source connector with a valid pipeline ID | Passed | | -| T3211698 | The user can't create the MongoDB source connector without the ""name"" key -> the system returns an error | Passed | | -| T3211706 | Check that the MongoDB source connector can't be created without the ""db"" key -> the system returns an error \`""db"" config value must be set\` | Passed | | -| T3211707 | The user can't create the MongoDB source connector with an empty ""db"" key -> the system returns an error \`""db"" config value must be set\` | Passed | | -| T3211708 | The user can enter any characters, letters, and digits in the ""db"" key | Passed | | -| T3211711 | The user can't create the MongoDB source connector with a too-long name (\`validate:""required,max=64""\`) in the ""db"" key -> the system returns an error \`""db"" config value is too long\` | Passed | | -| T3211712 | The system validates the db name after starting the pipeline -> the error ""database \\""example_name\\"" doesn't exist"" occurs if the db name is invalid | Passed | | -| T3211713 | The user can create the MongoDB source connector with a valid db name (The name of a database the connector must work with.) 
| Passed | | -| T3211730 | Check that the MongoDB source connector can't be created without the ""collection"" key -> the system returns an error \`""collection"" config value must be set\` | Passed | | -| T3211731 | The user can't create the MongoDB source connector with an empty ""collection"" key -> the system returns an error \`""collection"" config value must be set\` | Passed | | -| T3211732 | The user can enter any characters, letters, and digits in the ""collection"" key | Passed | | -| T3211733 | The user can create the MongoDB source connector with unlimited characters in the ""collection"" key | Passed | | -| T3211734 | The system validates the collection name after starting the pipeline -> the error ""collection \\""example_collection\\"" doesn't exist"" occurs if the collection name is invalid | Passed | | -| T3211735 | The user can create the MongoDB source connector with a valid collection name (
The name of a collection the connector must read from.) | Passed | | -| T3211697 | The user can't create the MongoDB source connector when the Pipeline is running | Passed | | -| T3211710 | \--------------------- No required config ------------------- | Passed | | -| T3211686 | Check that the ""uri"" key isn't necessary and the user can create the MongoDB source connector without (or an empty) this key (By default is mongodb://localhost:27017) | Skipped | It isn't a connector issue

At the current time, we can't test this case

A known issue with the MongoDB Docker image is documented on the official website (see https://www.mongodb.com/docs/ and https://www.mongodb.com/compatibility/deploying-a-mongodb-cluster-with-docker) | -| T3211705 | The user can enter any characters, letters, and digits in the ""uri"" key | Passed | | -| T3211729 | The user can create the MongoDB source connector with unlimited characters in the ""uri"" key | Passed | | -| T3211704 | The system validates the URI for connecting to MongoDB after starting the pipeline -> the error occurs if the URI for connecting to MongoDB is invalid | Passed | | -| T3211696 | The user can create the MongoDB source connector with a valid URI to connect to the MongoDB (The URI can contain host names, IPv4/IPv6 literals, or an SRV record.) | Passed | | -| T3211855 | \------- batchSize config | Passed | | -| T3211699 | Check that the ""batchSize"" key isn't necessary and the user can create the MongoDB source connector without (or an empty) this key (By default the count of records in one batch is 1000) | Passed | | -| T3211700 | The user can't create the MongoDB source connector with an invalid value in the ""batchSize"" key (for example: text or a floating-point number) -> the system returns an error \`""batchSize"" config value must be int\` | Passed | | -| T3211702 | The user can't create the MongoDB source connector with a value of more than 100000 in the ""batchSize"" key -> the system returns an error ""\\""batchSize\\"" value must be less than or equal to 100000"" | Passed | | -| T3211703 | The user can't create the MongoDB source connector with a value of less than 1 in the ""batchSize"" key -> the system returns an error | Passed | | -| T3211701 | The user can create the MongoDB source connector with a valid number of records (for example 50) in the ""batchSize"" key | Passed | | -| T3211728 | The user can create the MongoDB source connector with config ""batchSize"":""+00010"" -> the system parses it as ""1"" | 
Passed | | -| T3211771 | \------- snapshot config | Passed | | -| T3211772 | Check that the ""snapshot"" key isn't necessary and the user can create the MongoDB source connector without (or an empty) this key (By default is true -> data that was added before starting the pipeline will be transferred) | Passed | | -| T3211773 | The user can't create the MongoDB source connector with invalid values in the ""snapshot"" key (for example, any text or digits except the valid values) -> the system returns an error | Passed | | -| T3211774 | The user can create the MongoDB source connector with a valid value in the ""snapshot"" key (""1"", ""t"", ""T"", ""true"", ""TRUE"", ""True"") -> data that was added before starting the pipeline will be transferred | Passed | | -| T3211775 | Create the MongoDB source with ""snapshot"":""false"" (also ""0"", ""f"", ""F"", ""false"", ""FALSE"", ""False"") -> data that was added before starting the pipeline will not be transferred (if data is added during a pause, it will be transferred after restarting) | Passed | | -| T3211854 | \------- orderingField config | Passed | | -| T3211849 | Check that the ""orderingField"" key isn't necessary and the user can create the MongoDB source connector without (or an empty) this key (By default is ""_id"") | Passed | | -| T3211850 | The user can enter any characters, letters, and digits in the ""orderingField"" key | Passed | | -| T3211851 | The user can create the MongoDB source connector with an unlimited number of characters in the ""orderingField"" config | Passed | | -| T3211852 | Create the MongoDB Source with an invalid field name to be used for ordering (for the snapshot) -> the system ignores the invalid name (There is no orderingField validation, as different collection documents can have different schemas) | Passed | | -| T3211853 | The user can create the MongoDB source connector with a valid name of the field used for ordering (the user can use any existing field 
from the table) | Passed | | -| T3211750 | \------- auth.username config | Passed | | -| T3211721 | Check that the ""auth.username"" key isn't necessary. The user can create the MongoDB source connector without (or an empty) this key if the MongoDB isn't used a username/password authentication | Passed | | -| T3211722 | Check that the MongoDB source connector can't be created without the ""auth.username"" key (Required) if the MongoDB has an auth.username/auth.password authentication enabled -> the system returns an error after starting the pipeline | Passed | | -| T3211723 | The user can enter any characters, letters, and digits in the ""auth.username"" key | Passed | | -| T3211724 | The user can create the MongoDB source connector with unlimited characters in the ""auth.username"" key | Passed | | -| T3211725 | The system validates the auth.username for the MongoDB account after starting the pipeline -> the error occurs if the auth.username for the MongoDB account is an invalid | Passed | | -| T3211726 | 1\. The MongoDB isn't used a auth.username/auth.password -> 2. Create the MongoDB source connector with value in the ""auth.username"" key-> 3. Start the pipeline -> the MongoDB system returns an error | Passed | | -| T3265892 | 1\. MongoDB isn't used a auth.username/auth.password -> and used certificate 2. Create the MongoDB source connector with value in the ""auth.username"" key-> 3. Start the pipeline -> the MongoDB system ignores the ""auth.username""; data is transferred | Passed | | -| T3211727 | The user can create the MongoDB source connector with a valid auth.username for MongoDB account (example: some_username) | Passed | | -| T3211749 | \------- auth.password config | Passed | | -| T3211718 | Check that the ""auth.password"" key isn't necessary. 
The user can create the MongoDB source connector without (or an empty) this key if the MongoDB isn't used a auth.username/auth.password | Passed | | -| T3211714 | Check that the MongoDB source connector can't be created without the ""auth.password"" key (Required) if the MongoDB has an auth.username/auth.password authentication enabled -> the system returns an error after starting the pipeline | Passed | | -| T3211715 | The user can enter any characters, letters, and digits in the ""auth.password"" key | Passed | | -| T3211719 | The user can create the MongoDB source connector with unlimited characters in the ""auth.password"" key | Passed | | -| T3211716 | The system validates the auth.password for MongoDB account after starting the pipeline -> the error occurs if the password for MongoDB account is an invalid | Passed | | -| T3211720 | 1\. The MongoDB isn't used a auth.username/auth.password -> 2, Create the MongoDB source connector with value in the ""auth.password"" key-> 3. Start the pipeline -> the MongoDB system returns an error | Passed | | -| T3265897 | 1\. MongoDB isn't used a auth.username/auth.password -> and used certificate 2. Create the MongoDB source connector with value in the ""auth.password"" key-> 3. 
Start the pipeline -> the MongoDB system ignores the ""auth.password""; data is transferred | Passed | | -| T3211717 | The user can create the MongoDB source connector with a valid auth.password for the MongoDB account (example: some_password) | Passed | | -| T3211748 | \------- auth.db config | Passed | | -| T3211740 | Check that the ""auth.db"" key isn't necessary and the user can create the MongoDB source connector without (or an empty) this key (By default is admin) | Passed | | -| T3255927 | The MongoDB source can't work without the ""auth.password""/""auth.username"" (or ""auth.tls.certificateKeyFile"") and ""auth.mechanism"" keys (Required) if a valid value is entered in the ""auth.db"" config -> the system returns an error after starting the pipeline | Passed | | -| T3211736 | The user can enter any characters, letters, and digits in the ""auth.db"" key | Passed | | -| T3211737 | The user can create the MongoDB source connector with unlimited characters in the ""auth.db"" key | Passed | | -| T3211738 | The system validates the auth.db name after starting the pipeline -> the system ignores this field and uses the default db name if the db name is invalid; data is transferred | Passed | | -| T3211739 | The user can create the MongoDB source connector with a valid auth.db name (The name of a database that contains the user's authentication data.) (We can use only the ""admin"" auth.db name for the free version) | Passed | | -| T3211747 | \------- auth.mechanism config | Passed | | -| T3211741 | Check that the ""auth.mechanism"" key isn't necessary. 
The user can create the MongoDB source connector without this key if the MongoDB doesn't use auth.tlsCAFile/auth.tlsCertificateKeyFile | Passed | | -| T3211746 | The MongoDB source can't work without the ""auth.mechanism"" key (Required; need to use MONGODB-X509) if the MongoDB has an auth.tlsCertificateKeyFile -> the system returns an error after starting the pipeline | Passed | | -| T3211861 | The user can create the MongoDB source without the ""auth.mechanism"" if MongoDB has auth.username/auth.password (The default mechanism depends on your MongoDB server version - https://www.mongodb.com/docs/drivers/go/current/fundamentals/auth/#default) | Passed | | -| T3211742 | The user can't create the MongoDB source connector with an invalid value in the ""auth.mechanism"" key (for example, any text or digits) -> the system returns an error | Passed | | -| T3211743 | The user can create the MongoDB source connector with a lowercase value (scram-sha-256, scram-sha-1, mongodb-cr, mongodb-aws, mongodb-x509) in the ""auth.mechanism"" key -> the system transforms the value to uppercase; data is transferred after starting | Passed | | -| T3211744 | 1. MongoDB doesn't use auth.username/auth.password (or auth.tlsCAFile/auth.tlsCertificateKeyFile) -> 2. Enter a valid value in the ""auth.mechanism"" config and create the MongoDB source connector -> the system returns an error after starting the pipeline | Passed | | -| T3211745 | The user can create the MongoDB source connector with a valid auth.mechanism (supported mechanisms: - password/username: SCRAM-SHA-256, SCRAM-SHA-1, MONGODB-CR, MONGODB-AWS; - certificates: MONGODB-X509) | Passed | | -| T3211751 | \------- auth.tlsCAFile config | Passed | | -| T3211752 | Check that the ""auth.tlsCAFile"" key isn't necessary. 
The user can create the MongoDB source connector without (or an empty) this key if the MongoDB doesn't use bundle of certificate authorities to trust when making a TLS connection | Passed | | -| T3211753 | Check that the Mongo source connector can't be worked without the ""auth.tlsCAFile"" key (Required) if the MongoDB uses a bundle of certificate authorities to trust when making a TLS connection -> the system returns an error after starting the pipeline | Skipped | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB | -| T3211757 | The user can't create the MongoDB source connector with an invalid path in the ""auth.tlsCAFile"" key -> the system returns an error | Passed | | -| T3211756 | The system returns an error after running the pipeline if the user created the MongoDB source connector with invalid auth.tlsCAFile file | Passed | | -| T3211760 | MongoDB doesn't use Certificates. Create the MongoDB source with the next configs: 1. Enter valid value in ""auth.tlsCAFile"" & ""auth.tlsCertificateKeyFile""-> 2. Enter ""auth.mechanism"":""MONGODB-X509"" -> the system returns an error after starting | Skipped | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB | -| T3211758 | The user can create the MongoDB source connector with a valid auth.tlsCAFile and file path | Skipped | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB | -| T3211761 | \------- auth.tlsCertificateKeyFile config | Passed | | -| T3211762 | Check that the ""auth.tlsCertificateKeyFile"" key isn't necessary. The user can create the MongoDB source connector without (or an empty) this key if the MongoDB doesn't use the client certificate file or the client private key file | Passed | | -| T3211763 | Check that the Mongo source connector can't be worked without the ""auth.tlsCertificateKeyFile"" key (Required) if the MongoDB uses the client certificate file or the client private key file-> the system returns an error after starting the pipeline | Passed | | -| T3211767 | The user can't create the MongoDB source connector with an invalid path in the ""auth.tlsCertificateKeyFile"" key -> the system returns an error | Passed | | -| T3211768 | The system returns an error after running the pipeline if the user created the MongoDB source connector with invalid auth.tlsCertificateKeyFile | Passed | | -| T3211769 | MongoDB doesn't use Certificates. Create the MongoDB source with the next configs: 1. Enter valid value in ""auth.tlsCAFile"" & ""auth.tlsCertificateKeyFile""-> 2. Enter ""auth.mechanism"":""MONGODB-X509"" -> the system returns an error after starting | Passed | | -| T3265985 | 1\. MongoDB is used a auth.username/auth.password -> and isn't used certificate 2. Create the MongoDB source with value in the ""auth.tls.certificateKeyFile""-> 3. 
Start the pipeline -> the MongoDB system ignores the certificate; data is transferred | Passed | | -| T3211770 | The user can create the MongoDB source connector with a valid auth.tlsCertificateKeyFile and file path | Passed | | -| T3211776 | \----------------Required config------------------ | Passed | | -| T3211777 | The user can't create the MongoDB destination connector without the ""type"" key -> the system returns an error | Passed | | -| T3211778 | The user can't create the MongoDB destination connector with an empty (or an invalid format) in the ""type"" key -> the system returns an error | Passed | | -| T3211779 | Check that the user can create the MongoDB destination connector only with ""type"": ""TYPE_DESTINATION | Passed | | -| T3211780 | The user can't create the MongoDB destination connector without the ""plugin"" key -> the system returns an error | Passed | | -| T3211781 | The user can't create the MongoDB destination connector with an empty (or an invalid format) in the ""plugin"" key -> the system returns an error | Passed | | -| T3211782 | The user can create the MongoDB destination connector with a valid the plugin format | Passed | | -| T3211783 | The user can't create the MongoDB destination connector without the ""pipelineId"" key -> the system returns an error | Passed | | -| T3211784 | The user can't create the MongoDB destination connector with an empty (or an invalid ID) in the ""pipelineId"" key -> the system returns an error | Passed | | -| T3211785 | The user can create the MongoDB destination connector with a valid pipeline ID | Passed | | -| T3211786 | The user can't create the MongoDB destination connector without the ""name"" key -> the system returns an error | Passed | | -| T3211787 | Check that the MongoDB destination connector can't be created without the ""db"" key -> the system returns an error \`""db"" config value must be set\` | Passed | | -| T3211788 | The user can't create the MongoDB destination connector with empty 
the ""db"" key -> the system returns an error \`""db"" config value must be set\` | Passed | | -| T3211789 | The user can enter any characters, letters, and digits in the ""db"" key | Passed | | -| T3211790 | The user can't create the MongoDB destination connector with the long name (\`validate:""required,max=64"") in the ""db"" key -> the system returns an error \`""db"" config value is too long\` | Passed | | -| T3211791 | The system validates the db name after starting the pipeline -> the error ""database \\""example_name\\"" doesn't exist"" occurs if the db name is an invalid | Passed | | -| T3211792 | The user can create the MingoDB destination connector with a valid db name (The name of a database the connector must work with.) | Passed | | -| T3211793 | Check that the MongoDB destination connector can't be created without the ""collection"" key -> the system returns an error \`""collection"" config value must be set\` | Passed | | -| T3211794 | The user can't create the MongoDB destination connector with empty the ""collection"" key -> the system returns an error \`""collection"" config value must be set\` | Passed | | -| T3211795 | The user can enter any characters, letters, and digits in the ""collection"" key | Passed | | -| T3211796 | The user can create the MongoDB destination connector with unlimited characters in the ""collection"" key | Passed | | -| T3211797 | The system validates the collection name after starting the pipeline -> the error ""collection \\""example_collection\\"" doesn't exist"""" occurs if the collection name is an invalid | Passed | | -| T3211798 | The user can create the MingoDB destination connector with a valid collection name (
The name of a collection the connector must write to.) | Passed | | -| T3211799 | The user can't create the MongoDB destination connector when the Pipeline is running | Passed | | -| T3211800 | \--------------------- No required config ------------------- | Passed | | -| T3211856 | Check that the ""uri"" key isn't necessary and the user can create the MongoDB destination connector without (or an empty) this key (By default is mongodb://localhost:27017) | Skipped | It isn't a connector issue

At the current time, we can't test this case

Know the issue with the MongoDB docker image that is documented on the official website (see https://www.mongodb.com/docs/ and https://www.mongodb.com/compatibility/deploying-a-mongodb-cluster-with-docker) | -| T3211857 | The user can enter any characters, letters, and digits in the ""uri"" key | Passed | | -| T3211858 | The user can create the MongoDB destination connector with unlimited characters in the ""uri"" key | Passed | | -| T3211859 | The system validates the URI to connection to MongoDB after starting the pipeline -> the error occurs if the URL to connection to MongoDB is an invalid | Passed | | -| T3211860 | The user can create the MongoDB destination connector with a valid URI to connect to the MongoDB (The URI can contain host names, IPv4/IPv6 literals, or an SRV record."") | Passed | | -| T3211801 | \------- auth.username config | Passed | | -| T3211802 | Check that the ""auth.username"" key isn't necessary. The user can create the MongoDB destination connector without (or an empty) this key if the MongoDB isn't used a username/password authentication | Passed | | -| T3211803 | Check that the MongoDB destination connector can't be created without the ""auth.username"" key (Required) if the MongoDB has an auth.username/auth.password authentication enabled -> the system returns an error after starting the pipeline | Passed | | -| T3211804 | The user can enter any characters, letters, and digits in the ""auth.username"" key | Passed | | -| T3211805 | The user can create the MongoDB destination connector with unlimited characters in the ""auth.username"" key | Passed | | -| T3211806 | The system validates the auth.username for the MongoDB account after starting the pipeline -> the error occurs if the auth.username for the MongoDB account is an invalid | Passed | | -| T3211807 | 1\. The MongoDB isn't used a auth.username/auth.password -> 2. Create the MongoDB destination connector with value in the ""auth.username"" key-> 3. 
Start the pipeline -> the MongoDB system returns an error | Passed | | -| T3265907 | 1\. MongoDB isn't used a auth.username/auth.password -> and used certificate 2. Create the MongoDB dest connector with value in the ""auth.username"" key-> 3. Start the pipeline -> the MongoDB system ignores the ""auth.username""; data is transferred | Passed | | -| T3211808 | The user can create the MongoDB destination connector with a valid auth.username for MongoDB account (example: some_username) | Passed | | -| T3211809 | \------- auth.password config | Passed | | -| T3211810 | Check that the ""auth.password"" key isn't necessary. The user can create the MongoDB destination connector without (or an empty) this key if the MongoDB isn't used a auth.username/auth.password | Passed | | -| T3211811 | Check that the MongoDB destination connector can't be created without the ""auth.password"" key (Required) if the MongoDB has an auth.username/auth.password authentication enabled -> the system returns an error after starting the pipeline | Passed | | -| T3211812 | The user can enter any characters, letters, and digits in the ""auth.password"" key | Passed | | -| T3211813 | The user can create the MongoDB destination connector with unlimited characters in the ""auth.password"" key | Passed | | -| T3211814 | The system validates the auth.password for MongoDB account after starting the pipeline -> the error occurs if the password for MongoDB account is an invalid | Passed | | -| T3211815 | 1\. The MongoDB isn't used a auth.username/auth.password -> 2, Create the MongoDB destination connector with value in the ""auth.password"" key-> 3. Start the pipeline -> the MongoDB system returns an error | Passed | | -| T3265902 | 1\. MongoDB isn't used a auth.username/auth.password -> and used certificate 2. Create the MongoDB dest connector with value in the ""auth.password"" key-> 3. 
Start the pipeline -> the MongoDB system ignores the ""auth.password""; data is transferred | Passed | | -| T3211816 | The user can create the MongoDB destination connector with a valid auth.password for the MongoDB account (example: some_password) | Passed | | -| T3211817 | \------- auth.db config | Passed | | -| T3211818 | Check that the ""auth.db"" key isn't necessary and the user can create the MongoDB destination connector without (or an empty) this key (By default is admin) | Passed | | -| T3255932 | The MongoDB source can't be worked without ""auth.password""/""auth.username"" (or""auth.tls.certificateKeyFile""), ""auth.mechanism"" keys (Required) if enter a valid value in the ""auth.db"" config-> the system returns an error after starting the pipeline | Passed | | -| T3211819 | The user can enter any characters, letters, and digits in the ""auth.db"" key | Passed | | -| T3211820 | The user can create the MongoDB destination connector with unlimited characters in the \`""auth.db"" key | Passed | | -| T3211821 | The system validates the auth.db name after starting the pipeline -> the system ignores this field and uses the default db name if the db name is an invalid; data is transferred | Passed | | -| T3211822 | The user can create the MongoDB destination connector with a valid auth.db name (
The name of a database that contains the user's authentication data.) | Passed | | -| T3211823 | \------- auth.mechanism config | Passed | | -| T3211824 | Check that the ""auth.mechanism"" key isn't necessary. The user can create the MongoDB destination connector without this key if the MongoDB doesn't use auth.tlsCAFile/auth.tlsCertificateKeyFile | Passed | | -| T3211862 | The MongoDB destination can't work without the ""auth.mechanism"" key (Required; need to use MONGODB-X509) if the MongoDB has an auth.tlsCertificateKeyFile -> the system returns an error after starting the pipeline | Passed | | -| T3211863 | The user can create the MongoDB dest without the ""auth.mechanism"" if MongoDB has auth.username/auth.password (The default mechanism depends on your MongoDB server version - https://www.mongodb.com/docs/drivers/go/current/fundamentals/auth/#default) | Passed | | -| T3211825 | The user can't create the MongoDB destination connector with an invalid value in the ""auth.mechanism"" key (for example, any text or digits) -> the system returns an error | Passed | | -| T3211826 | The user can create the MongoDB Destination with a lowercase value (scram-sha-256, scram-sha-1, mongodb-cr, mongodb-aws, mongodb-x509) in the ""auth.mechanism"" key -> the system transforms the value to uppercase; data is transferred after starting | Passed | | -| T3211827 | 1. MongoDB doesn't use auth.username/auth.password (or auth.tlsCAFile/auth.tlsCertificateKeyFile) -> 2. Enter a valid value in the ""auth.mechanism"" config and create the MongoDB destination -> the system returns an error after starting the pipeline | Passed | | -| T3211828 | The user can create the MongoDB destination connector with a valid auth.mechanism (supported mechanisms: - password/username: SCRAM-SHA-256, SCRAM-SHA-1, MONGODB-CR, MONGODB-AWS; - certificates: MONGODB-X509) | Passed | | -| T3211829 | \------- auth.tlsCAFile config | Passed | | -| T3211830 | Check that the ""auth.tlsCAFile"" key isn't necessary. 
The user can create the MongoDB destination connector without (or an empty) this key if the MongoDB doesn't use a bundle of certificate authorities to trust when making a TLS connection | Passed | | -| T3211831 | Check that the Mongo Dest connector can't be worked without the ""auth.tlsCAFile"" key (Required) if the MongoDB uses a bundle of certificate authorities to trust when making a TLS connection -> the system returns an error after starting the pipeline | Skipped | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB

| -| T3211835 | The user can't create the MongoDB destination connector with an invalid path in the ""auth.tlsCAFile"" key -> the system returns an error | Passed | | -| T3211836 | The system returns an error after running the pipeline if the user created the MongoDB destination connector with invalid auth.tlsCAFile file | Passed | | -| T3211837 | MongoDB doesn't use Certificates. Create the MongoDB destination with the configs: 1. Enter valid value in ""auth.tlsCAFile"" & ""auth.tlsCertificateKeyFile""-> 2. Enter ""auth.mechanism"":MONGODB-X509"" -> the system returns an error after starting | Skipped | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB | -| T3211838 | The user can create the MongoDB destination connector with a valid auth.tlsCAFile and file path | Skipped | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB | -| T3211839 | \------- auth.tlsCertificateKeyFile config | Passed | | -| T3211840 | Check that the ""auth.tlsCertificateKeyFile"" key isn't necessary. The user can create the MongoDB destination connector without (or an empty) this key if the MongoDB doesn't use the client certificate file or the client private key file | Passed | | -| T3211841 | Check that the Mongo destination connector can't work without the ""auth.tlsCertificateKeyFile"" key (Required) if the MongoDB uses the client certificate file or the client private key file -> the system returns an error after starting the pipeline | Passed | | -| T3211845 | The user can't create the MongoDB destination connector with an invalid path in the ""auth.tlsCertificateKeyFile"" key -> the system returns an error | Passed | | -| T3211846 | The system returns an error after running the pipeline if the user created the MongoDB destination connector with an invalid auth.tlsCertificateKeyFile | Passed | | -| T3211847 | MongoDB doesn't use Certificates. Create the MongoDB destination with the following configs: 1. Enter a valid value in ""auth.tlsCAFile"" & ""auth.tlsCertificateKeyFile"" -> 2. Enter ""auth.mechanism"":""MONGODB-X509"" -> the system returns an error after starting | Passed | | -| T3265980 | 1\. MongoDB uses auth.username/auth.password and doesn't use a certificate -> 2. Create the MongoDB dest with a value in the ""auth.tls.certificateKeyFile"" -> 3. 
Start the pipeline -> the MongoDB system ignores the certificate; data is transferred | Passed | | -| T3211848 | The user can create the MongoDB destination connector with a valid auth.tlsCertificateKeyFile and file path | Passed | | -| T3211865 | The data from the MongoDB source connector is transferred to the Postgres Destination connector | Passed | | -| T3211866 | The data from the MongoDB source connector is transferred to the Materialize Destination connector | Passed | | -| T3211889 | The data from the MongoDB source connector is transferred to the NATS Pub/Sub Destination connector | Passed | | -| T3211893 | The data from the MongoDB source connector is transferred to the Vitess Destination connector | Passed | | -| T3211907 | The data from the MongoDB source connector is transferred to the Clickhouse Destination connector | Passed | | -| T3211908 | The data from the MongoDB source connector is transferred to the HubSpot Destination connector | Passed | | -| T3211909 | The data from the MongoDB source connector is transferred to the SQL Server Destination connector | Passed | | -| T3211965 | The data from the MongoDB source connector is transferred to the MongoDB Destination connector | Passed | | -| T3211867 | Data from two different MongoDB source connectors are transferred to the Postgres Destination connector | Passed | | -| T3211868 | Data from two different MongoDB source connectors are transferred to the Materialize Destination connector | Passed | | -| T3211890 | Data from two different MongoDB source connectors are transferred to the NATS PubSub Destination connector | Passed | | -| T3211894 | Data from two different MongoDB source connectors are transferred to the Vitess Destination connector | Passed | | -| T3211910 | Data from two different MongoDB source connectors are transferred to the Clickhouse Destination connector | Passed | | -| T3211911 | Data from two different MongoDB source connectors are transferred to the SQL Server Destination 
connector | Passed | | -| T3211966 | Data from two different MongoDB source connectors are transferred to the MongoDB Destination connector | Passed | | -| T3211869 | The user can transfer data using one pipeline from two (or more) different MongoDB sources to two (or more) Destinations: Materialize, S3, File, Kafka, NATS Pub/Sub, NATS JS, GCP Pub/Sub, and others | Passed | | -| T3211870 | Try to transfer data from two MongoDB Sources with the same config to the Postgres Destination -> data is transferred and isn't duplicated | Passed | | -| T3211871 | Try to transfer data from two MongoDB Sources with the same config to the Materialize Destination -> data is transferred and duplicated | Passed | | -| T3211891 | Try to transfer data from two MongoDB Sources with the same config to the NATS PubSub Destination -> data is transferred and duplicated | Passed | | -| T3211884 | Try to transfer data from two MongoDB Sources with the same config to the SQL Server Destination -> the data is recorded and duplicated (an error occurs if the SQL Server table has a primary key) | Passed | | -| T3211895 | Try to transfer data from two MongoDB Sources with the same config to the Vitess Destination -> data is recorded and duplicated (data isn't duplicated if a primary key is used) | Passed | | -| T3211912 | Try to transfer data from two MongoDB Sources with the same config to the Clickhouse Destination -> data is transferred and duplicated in the Clickhouse table (data isn't duplicated if a primary key is used in the Clickhouse table) | Passed | | -| T3211967 | Try to transfer data from two MongoDB Sources with the same config to the MongoDB Destination -> data is transferred and duplicated (it is transferred once and an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211913 | Try to transfer data from two MongoDB Sources with the same config to the HubSpot Destination -> data is transferred (data is recorded once and an error occurs
for resource ""cms.blogs.authors"") | Passed | | -| T3211872 | 1\. Stop the pipeline and remove the MongoDB Source -> 2. Create a MongoDB Source with the same config as in step 1 -> 3. Start the pipeline -> data is recorded to the destination (add.info: data in Postgres, Vitess, and DB2 is refreshed if a primary key is used) | Passed | | -| T3211873 | 1\. Create one MongoDB Source -> 2. Create two Postgres Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated (data isn't duplicated if a primary key is used) | Passed | | -| T3211874 | 1\. Create one MongoDB Source -> 2. Create two Materialize Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated | Passed | | -| T3211892 | 1\. Create one MongoDB Source -> 2. Create two NATS PubSub Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated | Passed | | -| T3211896 | 1\. Create one MongoDB Source -> 2. Create two Vitess Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated (data isn't duplicated if a primary key is used) | Passed | | -| T3211914 | 1\. Create one MongoDB Source -> 2. Create two SQL Server Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated (an error occurs if the SQL Server table has a primary key) | Passed | | -| T3211915 | 1\. Create one MongoDB Source -> 2. Create two HubSpot Destination connectors with the same config -> 3. Start the pipeline -> data is recorded and duplicated (data is recorded once and an error occurs for resource ""cms.blogs.authors"") | Passed | | -| T3211916 | 1\. Create one MongoDB Source -> 2. Create two Clickhouse Destination connectors with the same config -> 3. Start the pipeline -> data is transferred and duplicated in the Clickhouse table | Passed | | -| T3211968 | 1\. Create one MongoDB Source -> 2. 
Create two MongoDB Destination connectors with the same config -> 3. Start the pipeline -> data is transferred and duplicated (it is transferred once; an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211875 | 1\. Create a MongoDB Source -> 2. Create a Destination -> 3. Start the pipeline -> the data is recorded to the Destination -> 4. Stop the pipeline and add a new Destination -> 5. Start the pipeline -> the previous data isn't recorded to the new Destination | Passed | | -| T3211876 | The new data that is added to the MongoDB table is transferred to the Destination while the pipeline is running (or after restarting the pipeline) | Passed | | -| T3211877 | Check that an error occurs after trying to transfer data from the MongoDB table which isn't used in the Vitess table | Passed | | -| T3211885 | Check that an error occurs after trying to transfer data from the MongoDB table which isn't used in the Postgres table | Passed | | -| T3211886 | Check that an error occurs after trying to transfer data from the MongoDB table which isn't used in the Materialize table | Passed | | -| T3211917 | Check that an error occurs after trying to transfer data from the MongoDB table which isn't used in the SQL Server table | Passed | | -| T3211918 | Check that an error occurs after trying to transfer data from the MongoDB table which isn't used in the Clickhouse table | Passed | | -| T3211919 | Check that an error occurs after trying to transfer data from the MongoDB table which isn't used in the HubSpot table | Passed | | -| T3211969 | Check that an error occurs after trying to transfer data from the MongoDB table which isn't used in the MongoDB table | Passed | | -| T3211878 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data is removed from the Postgres table | Passed | | -| T3211887 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data is removed from the Materialize table | Passed | | -| T3211898 | Remove data in the MongoDB table 
while the pipeline is running (or after restarting the pipeline) -> data is removed from the Vitess table | Passed | | -| T3211920 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data is removed from the SQL Server table | Passed | | -| T3211921 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data isn't removed from the Clickhouse table ( https://clickhouse.com/docs/en/engines/table-engines/#log) | Passed | | -| T3211923 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data is removed from the Clickhouse table | Passed | | -| T3211905 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data isn't removed from HubSpot; the data that was removed from MongoDB is duplicated | Passed | | -| T3211970 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data is removed from the MongoDB table | Passed | | -| T3211879 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the Postgres table | Passed | | -| T3211888 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the Materialize table | Passed | | -| T3211899 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the Vitess table (updating doesn't work for a Vitess table that doesn't use a primary key; updated data is recorded as new and the previous data isn't updated) | Passed | | -| T3211906 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the SQL Server table | Passed | | -| T3211925 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the Clickhouse table | Passed | | -| T3211926 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't updated in the Clickhouse table ( https://clickhouse.com/docs/en/engines/table-engines/#log) | Passed | | -| 
T3211927 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't updated in HubSpot; the old data that was updated in MongoDB is duplicated | Passed | | -| T3211971 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the MongoDB table | Passed | | -| T3211864 | 1\. The user creates the MongoDB source and destination connectors with the identical config -> 2. Start the pipeline -> data isn't transferred; an error ""E11000 duplicate key error"" occurs | Passed | | -| T3211880 | The user can use the MongoDB Source connector with the identical config in two (or more) pipelines at one time | Passed | | -| T3211897 | \-------- the MongoDB Source connector with non-required config ---------------------- | Passed | | -| T3211881 | 1\. Create the MongoDB Source connector with {""batchSize"":""1""} -> 2. Create the Destination connector -> 3. Start the pipeline -> data is transferred | Passed | | -| T3211882 | 1\. Create the MongoDB Source connector with {""batchSize"":""100000""} -> 2. Create the Destination connector -> 3. Start the pipeline -> data is transferred | Passed | | -| T3211900 | 1\. Create the Mongo source with {""OrderingColumn"":""id""} -> 2. Start the pipeline -> data is transferred by ""id"" ordering (2;4;15;4) -> 3. Add data ""id"":""1"" in Mongo -> data id:15; 1 is transferred -> 4. Add data with ""id"":""16"" in Mongo -> data id:16 is transferred | Passed | | -| T3211972 | 1\. Create the MongoDB source connector with {""snapshot"":""true""} -> 2. Create the Destination connector -> 3. Start the pipeline -> data that was added before starting the pipeline will be transferred | Passed | | -| T3211973 | 1\. Create the MongoDB source connector with {""snapshot"":""false""} -> 2. Create the Destination connector -> 3. 
start the pipeline -> data that was added before starting the pipeline will not be transferred | Passed | | -| T3211928 | The data from the Stripe source connector is transferred to the MongoDB Destination connector | Passed | | -| T3211929 | The data from the Snowflake source connector is transferred to the MongoDB Destination connector | Passed | | -| T3211930 | The data from the NATS PubSub source connector is transferred to the MongoDB Destination connector | Passed | | -| T3211931 | The data from the Oracle source connector is transferred to the MongoDB Destination connector | Passed | | -| T3211954 | The data from the Vitess source connector is transferred to the MongoDB Destination connector | Passed | | -| T3211957 | The data from the SQL Server source connector is transferred to the MongoDB Destination connector | Passed | | -| T3211958 | The data from the HubSpot source connector is transferred to the MongoDB Destination connector | Passed | | -| T3211959 | The data from the Clickhouse source connector is transferred to the MongoDB Destination connector | Passed | | -| T3218670 | The data from the MongoDB source connector is transferred to the MongoDB Destination connector | Passed | | -| T3211932 | Data from two different Stripe source connectors are transferred to the MongoDB Destination connector | Passed | | -| T3211933 | Data from two different Snowflake source connectors are transferred to the MongoDB Destination connector | Passed | | -| T3211951 | Data from two different NATS PubSub source connectors are transferred to the MongoDB Destination connector | Passed | | -| T3211955 | Data from two different Oracle source connectors are transferred to the MongoDB Destination connector | Passed | | -| T3211952 | Data from two different Vitess source connectors are transferred to the MongoDB Destination connector | Passed | | -| T3223164 | Data from two different Clickhouse source connectors are transferred to the MongoDB Destination connector | Passed | 
| -| T3211960 | Data from two different SQL Server source connectors are transferred to the MongoDB Destination connector | Passed | | -| T3211961 | Data from two different MongoDB source connectors are transferred to the MongoDB Destination connector | Passed | | -| T3211934 | Transfer data from the Stripe Sources to two MongoDB Destinations with the same config -> data is transferred and duplicated (it is transferred once; an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211935 | Transfer data from the Snowflake Sources to two MongoDB Destinations with the same config -> data is transferred and duplicated (it is transferred once; an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211936 | Transfer data from the NATS PubSub Sources to two MongoDB Destinations with the same config -> data is transferred and duplicated (it is transferred once; an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211953 | Transfer data from the Vitess Sources to two MongoDB Destinations with the same config -> data is transferred and duplicated (it is transferred once; an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211956 | Transfer data from the Oracle Sources to two MongoDB Destinations with the same config -> data is transferred and duplicated (it is transferred once; an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211962 | Transfer data from the Clickhouse Sources to two MongoDB Destinations with the same config -> data is transferred and duplicated (it is transferred once; an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211963 | Transfer data from the HubSpot Sources to two MongoDB Destinations with the same config -> data is transferred and duplicated (it is transferred once; an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211964 | Transfer data 
from the SQL Server Sources to two MongoDB Destinations with the same config -> data is transferred and duplicated (it is transferred once; an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211974 | Transfer data from the MongoDB Sources to two MongoDB Destinations with the same config -> data is transferred and duplicated (it is transferred once and an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211937 | Try to transfer data from two Source connectors with the same config to the MongoDB Destination -> data is transferred and duplicated (it is transferred once; an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211938 | 1\. Create the Source and MongoDB Dest -> 2. Start the pipeline -> the data is recorded to the MongoDB Dest -> 3. Stop the pipeline and add a new MongoDB Dest -> 4. Start the pipeline -> previous data isn't recorded to the new MongoDB Destination | Passed | | -| T3211939 | 1\. Stop the pipeline and remove the Source -> 2. Create the Source with the same config as in the 1st step -> 3. 
Start the pipeline -> data is duplicated to the MongoDB Dest (it isn't transferred and an error occurs if the ""_id"" field is transferred in the payload) | Passed | | -| T3211940 | The user can transfer data using one pipeline from two (or more) different sources: Stripe, S3, File, Kafka, NATS JS, NATS P/S, HTTP, GCP P/S to two (or more) different MongoDB Destinations | Passed | | -| T3211941 | The new data that is added to the Source connector is transferred to the MongoDB Destination while the pipeline is running (or after restarting the pipeline) | Passed | | -| T3211942 | Check that an error occurs after trying to transfer data from the Source connector which isn't used in the MongoDB table | Passed | | -| T3211943 | Remove data in Stripe (or Vitess) while the pipeline is running (or after restarting the pipeline) -> data is removed from the MongoDB table | Passed | | -| T3211944 | Update data in Stripe (or Vitess) while the pipeline is running (or after restarting the pipeline) -> data is updated in the MongoDB table | Passed | | -| T3211945 | Check that data is transferred from different resources of the Stripe connector (resources list: https://github.com/ConduitIO/conduit-connector-stripe/tree/main/models/resources) to the MongoDB Destination | Passed | | -| T3211946 | Transfer data with column names in upper case (example: {""CITY"":""Paris""}) -> the data is transferred to the MongoDB Destination (PAY ATTENTION: column names must be the same case in Source and MongoDB Destination or an error will occur) | Passed | | -| T3211947 | Transfer data with column names in lower case (example: {""city"":""Paris""}) -> the data is transferred to the MongoDB Destination (PAY ATTENTION: column names must be the same case in Source and MongoDB Destination or an error will occur) | Passed | | -| T3211948 | Transfer data with column names in camel case (example: {""CiTy"":""Paris""}) -> the data is transferred to the MongoDB Destination (PAY ATTENTION: column names must be the same case in 
Source and MongoDB Destination or an error will occur) | Passed | | -| T3211950 | 1\. Create the MongoDB source connector -> 2. Create the MongoDB destination connector with the same config as the MongoDB source -> 3. Start the pipeline -> data isn't transmitted; an error ""E11000 duplicate key error"" occurs | Passed | | -| T3211949 | The user can use the MongoDB Destination connector with the identical config in two (or more) pipelines at one time | Passed | | +| ID | Title | Comment | Status | +| -------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------- | +| T3211709 | \----------------Required config------------------ | | Passed | +| T3211687 | The user can't create the MongoDB source connector without the ""type"" key -> the system returns an error | | Passed | +| T3211692 | The user can't create the MongoDB source connector with an empty (or invalid-format) value in the ""type"" key -> the system returns an error | | Passed | +| T3211690 | Check that the user can create the MongoDB source connector only with ""type"": ""TYPE_SOURCE"" | | Passed | +| T3211688 | The user can't create the MongoDB source connector without the ""plugin"" key -> the system returns an error | | Passed | +| T3211691 | The user can't create the MongoDB source connector with an empty (or invalid-format) value in the ""plugin"" key -> the system returns an error | | Passed | +| T3211693 | The user can create the MongoDB source connector with a valid plugin format | | Passed | +| 
T3211689 | The user can't create the MongoDB source connector without the ""pipelineId"" key -> the system returns an error | | Passed | +| T3211694 | The user can't create the MongoDB source connector with an empty (or invalid) ID in the ""pipelineId"" key -> the system returns an error | | Passed | +| T3211695 | The user can create the MongoDB source connector with a valid pipeline ID | | Passed | +| T3211698 | The user can't create the MongoDB source connector without the ""name"" key -> the system returns an error | | Passed | +| T3211706 | Check that the MongoDB source connector can't be created without the ""db"" key -> the system returns an error \`""db"" config value must be set\` | | Passed | +| T3211707 | The user can't create the MongoDB source connector with an empty ""db"" key -> the system returns an error \`""db"" config value must be set\` | | Passed | +| T3211708 | The user can enter any characters, letters, and digits in the ""db"" key | | Passed | +| T3211711 | The user can't create the MongoDB source connector with a too-long name (\`validate:""required,max=64""\`) in the ""db"" key -> the system returns an error \`""db"" config value is too long\` | | Passed | +| T3211712 | The system validates the db name after starting the pipeline -> the error ""database \\""example_name\\"" doesn't exist"" occurs if the db name is invalid | | Passed | +| T3211713 | The user can create the MongoDB source connector with a valid db name (The name of a database the connector must work with.) 
| | Passed | +| T3211730 | Check that the MongoDB source connector can't be created without the ""collection"" key -> the system returns an error \`""collection"" config value must be set\` | | Passed | +| T3211731 | The user can't create the MongoDB source connector with an empty ""collection"" key -> the system returns an error \`""collection"" config value must be set\` | | Passed | +| T3211732 | The user can enter any characters, letters, and digits in the ""collection"" key | | Passed | +| T3211733 | The user can create the MongoDB source connector with unlimited characters in the ""collection"" key | | Passed | +| T3211734 | The system validates the collection name after starting the pipeline -> the error ""collection \\""example_collection\\"" doesn't exist"" occurs if the collection name is invalid | | Passed | +| T3211735 | The user can create the MongoDB source connector with a valid collection name (
The name of a collection the connector must read from.) | | Passed | +| T3211697 | The user can't create the MongoDB source connector when the pipeline is running | | Passed | +| T3211710 | \--------------------- Non-required config ------------------- | | Passed | +| T3211686 | Check that the ""uri"" key isn't necessary and the user can create the MongoDB source connector without this key (or with an empty value) (the default is mongodb://localhost:27017) | It isn't a connector issue

At the current time, we can't test this case

There is a known issue with the MongoDB Docker image that is documented on the official website (see https://www.mongodb.com/docs/ and https://www.mongodb.com/compatibility/deploying-a-mongodb-cluster-with-docker) | Skipped | +| T3211705 | The user can enter any characters, letters, and digits in the ""uri"" key | | Passed | +| T3211729 | The user can create the MongoDB source connector with unlimited characters in the ""uri"" key | | Passed | +| T3211704 | The system validates the URI used to connect to MongoDB after starting the pipeline -> an error occurs if the URI is invalid | | Passed | +| T3211696 | The user can create the MongoDB source connector with a valid URI to connect to MongoDB (The URI can contain host names, IPv4/IPv6 literals, or an SRV record.) | | Passed | +| T3211855 | \------- batchSize config | | Passed | +| T3211699 | Check that the ""batchSize"" key isn't necessary and the user can create the MongoDB source connector without this key (or with an empty value) (by default, the number of records in one batch is 1000) | | Passed | +| T3211700 | The user can't create the MongoDB source connector with an invalid value in the ""batchSize"" key (for example, text or a floating-point number) -> the system returns an error \`""batchSize"" config value must be int\` | | Passed | +| T3211702 | The user can't create the MongoDB source connector with a value of more than 100000 in the ""batchSize"" key -> the system returns an error ""\\""batchSize\\"" value must be less than or equal to 100000"" | | Passed | +| T3211703 | The user can't create the MongoDB source connector with a value of less than 1 in the ""batchSize"" key -> the system returns an error | | Passed | +| T3211701 | The user can create the MongoDB source connector with a valid number of records (for example, 50) in the ""batchSize"" key | | Passed | +| T3211728 | The user can create the MongoDB source connector with config ""batchSize"":""+00010"" -> the system parses it as 
""1"" | | Passed | +| T3211771 | \------- snapshot config | | Passed | +| T3211772 | Check that the ""snapshot"" key isn't necessary and the user can create the MongoDB source connector without this key (or with an empty value) (the default is true -> data that was added before starting the pipeline will be transferred) | | Passed | +| T3211773 | The user can't create the MongoDB source connector with invalid values in the ""snapshot"" key (for example, any text or digits except the valid values) -> the system returns an error | | Passed | +| T3211774 | The user can create the MongoDB source connector with a valid value in the ""snapshot"" key (""1"", ""t"", ""T"", ""true"", ""TRUE"", ""True"") -> data that was added before starting the pipeline will be transferred | | Passed | +| T3211775 | Create the MongoDB source with ""snapshot"":""false"" (also ""0"", ""f"", ""F"", ""false"", ""FALSE"", ""False"") -> data that was added before starting the pipeline will not be transferred (if data is added during a pause, it will be transferred after restarting) | | Passed | +| T3211854 | \------- orderingField config | | Passed | +| T3211849 | Check that the ""orderingField"" key isn't necessary and the user can create the MongoDB source connector without this key (or with an empty value) (the default is ""_id"") | | Passed | +| T3211850 | The user can enter any characters, letters, and digits in the ""orderingField"" key | | Passed | +| T3211851 | The user can create the MongoDB source connector with an unlimited number of characters in the ""orderingField"" config | | Passed | +| T3211852 | Create the MongoDB Source with an invalid field name that will be used for ordering (for the snapshot) -> the system ignores the invalid name (There is no orderingField validation, as different collection documents can have different schemas) | | Passed | +| T3211853 | The user can create the MongoDB source connector with a valid name of the field which is used for ordering (the user can use any 
existing field from the table) | | Passed | +| T3211750 | \------- auth.username config | | Passed | +| T3211721 | Check that the ""auth.username"" key isn't necessary. The user can create the MongoDB source connector without this key (or with an empty value) if MongoDB doesn't use username/password authentication | | Passed | +| T3211722 | Check that the MongoDB source connector can't be created without the ""auth.username"" key (Required) if the MongoDB has auth.username/auth.password authentication enabled -> the system returns an error after starting the pipeline | | Passed | +| T3211723 | The user can enter any characters, letters, and digits in the ""auth.username"" key | | Passed | +| T3211724 | The user can create the MongoDB source connector with unlimited characters in the ""auth.username"" key | | Passed | +| T3211725 | The system validates the auth.username for the MongoDB account after starting the pipeline -> an error occurs if the auth.username for the MongoDB account is invalid | | Passed | +| T3211726 | 1\. MongoDB doesn't use auth.username/auth.password -> 2. Create the MongoDB source connector with a value in the ""auth.username"" key -> 3. Start the pipeline -> the MongoDB system returns an error | | Passed | +| T3265892 | 1\. MongoDB doesn't use auth.username/auth.password and uses a certificate -> 2. Create the MongoDB source connector with a value in the ""auth.username"" key -> 3. Start the pipeline -> the MongoDB system ignores the ""auth.username""; data is transferred | | Passed | +| T3211727 | The user can create the MongoDB source connector with a valid auth.username for the MongoDB account (example: some_username) | | Passed | +| T3211749 | \------- auth.password config | | Passed | +| T3211718 | Check that the ""auth.password"" key isn't necessary. 
The user can create the MongoDB source connector without this key (or with an empty value) if MongoDB doesn't use auth.username/auth.password | | Passed | +| T3211714 | Check that the MongoDB source connector can't be created without the ""auth.password"" key (Required) if the MongoDB has auth.username/auth.password authentication enabled -> the system returns an error after starting the pipeline | | Passed | +| T3211715 | The user can enter any characters, letters, and digits in the ""auth.password"" key | | Passed | +| T3211719 | The user can create the MongoDB source connector with unlimited characters in the ""auth.password"" key | | Passed | +| T3211716 | The system validates the auth.password for the MongoDB account after starting the pipeline -> an error occurs if the password for the MongoDB account is invalid | | Passed | +| T3211720 | 1\. MongoDB doesn't use auth.username/auth.password -> 2. Create the MongoDB source connector with a value in the ""auth.password"" key -> 3. Start the pipeline -> the MongoDB system returns an error | | Passed | +| T3265897 | 1\. MongoDB doesn't use auth.username/auth.password and uses a certificate -> 2. Create the MongoDB source connector with a value in the ""auth.password"" key -> 3. 
Start the pipeline -> the MongoDB system ignores the ""auth.password""; data is transferred | | Passed | +| T3211717 | The user can create the MongoDB source connector with a valid auth.password for the MongoDB account (example: some_password) | | Passed | +| T3211748 | \------- auth.db config | | Passed | +| T3211740 | Check that the ""auth.db"" key isn't necessary and the user can create the MongoDB source connector without this key (or with an empty value) (the default is admin) | | Passed | +| T3255927 | The MongoDB source can't work without the ""auth.password""/""auth.username"" (or ""auth.tls.certificateKeyFile"") and ""auth.mechanism"" keys (Required) if a valid value is entered in the ""auth.db"" config -> the system returns an error after starting the pipeline | | Passed | +| T3211736 | The user can enter any characters, letters, and digits in the ""auth.db"" key | | Passed | +| T3211737 | The user can create the MongoDB source connector with unlimited characters in the ""auth.db"" key | | Passed | +| T3211738 | The system validates the auth.db name after starting the pipeline -> the system ignores this field and uses the default db name if the db name is invalid; data is transferred | | Passed | +| T3211739 | The user can create the MongoDB source connector with a valid auth.db name (The name of a database that contains the user's authentication data.) (We can use only the ""admin"" auth.db name for the free version) | | Passed | +| T3211747 | \------- auth.mechanism config | | Passed | +| T3211741 | Check that the ""auth.mechanism"" key isn't necessary. 
The user can create the MongoDB source connector without this key if the MongoDB doesn't use auth.tlsCAFile/auth.tlsCertificateKeyFile | | Passed | +| T3211746 | The MongoDB source can't work without the ""auth.mechanism"" key (Required; need to use MONGODB-X509) if the MongoDB has an auth.tlsCertificateKeyFile -> the system returns an error after starting the pipeline | | Passed | +| T3211861 | The user can create the MongoDB source without the ""auth.mechanism"" if MongoDB has auth.username/auth.password (The default mechanism depends on your MongoDB server version - https://www.mongodb.com/docs/drivers/go/current/fundamentals/auth/#default) | | Passed | +| T3211742 | The user can't create the MongoDB source connector with an invalid value in the ""auth.mechanism"" key (for example, any text or digits) -> the system returns an error | | Passed | +| T3211743 | The user can create the MongoDB source connector with a lowercase value (scram-sha-256, scram-sha-1, mongodb-cr, mongodb-aws, mongodb-x509) in the ""auth.mechanism"" key -> the system transforms the value to uppercase; data is transferred after starting | | Passed | +| T3211744 | 1\. MongoDB doesn't use auth.username/auth.password (or auth.tlsCAFile/auth.tlsCertificateKeyFile) -> 2. Enter a valid value in the ""auth.mechanism"" config and create the MongoDB source connector -> the system returns an error after starting the pipeline | | Passed | +| T3211745 | The user can create the MongoDB source connector with a valid auth.mechanism (supported mechanisms: - password/username: SCRAM-SHA-256, SCRAM-SHA-1, MONGODB-CR, MONGODB-AWS; - certificates: MONGODB-X509) | | Passed | +| T3211751 | \------- auth.tlsCAFile config | | Passed | +| T3211752 | Check that the ""auth.tlsCAFile"" key isn't necessary. 
The user can create the MongoDB source connector without this key (or with an empty value) if the MongoDB doesn't use a bundle of certificate authorities to trust when making a TLS connection | | Passed | +| T3211753 | Check that the Mongo source connector can't work without the ""auth.tlsCAFile"" key (Required) if the MongoDB uses a bundle of certificate authorities to trust when making a TLS connection -> the system returns an error after starting the pipeline | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB | Skipped | +| T3211757 | The user can't create the MongoDB source connector with an invalid path in the ""auth.tlsCAFile"" key -> the system returns an error | | Passed | +| T3211756 | The system returns an error after running the pipeline if the user created the MongoDB source connector with an invalid auth.tlsCAFile file | | Passed | +| T3211760 | MongoDB doesn't use certificates. Create the MongoDB source with the following configs: 1. Enter a valid value in ""auth.tlsCAFile"" & ""auth.tlsCertificateKeyFile"" -> 2. Enter ""auth.mechanism"": ""MONGODB-X509"" -> the system returns an error after starting | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB | Skipped | +| T3211758 | The user can create the MongoDB source connector with a valid auth.tlsCAFile and file path | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB | Skipped | +| T3211761 | \------- auth.tlsCertificateKeyFile config | | Passed | +| T3211762 | Check that the ""auth.tlsCertificateKeyFile"" key isn't necessary. The user can create the MongoDB source connector without this key (or with an empty value) if the MongoDB doesn't use the client certificate file or the client private key file | | Passed | +| T3211763 | Check that the Mongo source connector can't work without the ""auth.tlsCertificateKeyFile"" key (Required) if the MongoDB uses the client certificate file or the client private key file -> the system returns an error after starting the pipeline | | Passed | +| T3211767 | The user can't create the MongoDB source connector with an invalid path in the ""auth.tlsCertificateKeyFile"" key -> the system returns an error | | Passed | +| T3211768 | The system returns an error after running the pipeline if the user created the MongoDB source connector with an invalid auth.tlsCertificateKeyFile | | Passed | +| T3211769 | MongoDB doesn't use certificates. Create the MongoDB source with the following configs: 1. Enter a valid value in ""auth.tlsCAFile"" & ""auth.tlsCertificateKeyFile"" -> 2. Enter ""auth.mechanism"": ""MONGODB-X509"" -> the system returns an error after starting | | Passed | +| T3265985 | 1\. MongoDB uses auth.username/auth.password and doesn't use a certificate -> 2. Create the MongoDB source with a value in the ""auth.tls.certificateKeyFile"" -> 3. 
Start the pipeline -> the MongoDB system ignores the certificate; data is transferred | | Passed | +| T3211770 | The user can create the MongoDB source connector with a valid auth.tlsCertificateKeyFile and file path | | Passed | +| T3211776 | \----------------Required config------------------ | | Passed | +| T3211777 | The user can't create the MongoDB destination connector without the ""type"" key -> the system returns an error | | Passed | +| T3211778 | The user can't create the MongoDB destination connector with an empty (or an invalid format) in the ""type"" key -> the system returns an error | | Passed | +| T3211779 | Check that the user can create the MongoDB destination connector only with ""type"": ""TYPE_DESTINATION"" | | Passed | +| T3211780 | The user can't create the MongoDB destination connector without the ""plugin"" key -> the system returns an error | | Passed | +| T3211781 | The user can't create the MongoDB destination connector with an empty (or an invalid format) in the ""plugin"" key -> the system returns an error | | Passed | +| T3211782 | The user can create the MongoDB destination connector with a valid plugin format | | Passed | +| T3211783 | The user can't create the MongoDB destination connector without the ""pipelineId"" key -> the system returns an error | | Passed | +| T3211784 | The user can't create the MongoDB destination connector with an empty (or an invalid ID) in the ""pipelineId"" key -> the system returns an error | | Passed | +| T3211785 | The user can create the MongoDB destination connector with a valid pipeline ID | | Passed | +| T3211786 | The user can't create the MongoDB destination connector without the ""name"" key -> the system returns an error | | Passed | +| T3211787 | Check that the MongoDB destination connector can't be created without the ""db"" key -> the system returns an error \`""db"" config value must be set\` | | Passed | +| T3211788 | The user can't create the MongoDB destination connector with an empty ""db"" key -> the system returns an error \`""db"" config value must be set\` | | Passed | +| T3211789 | The user can enter any characters, letters, and digits in the ""db"" key | | Passed | +| T3211790 | The user can't create the MongoDB destination connector with a long name (\`validate:""required,max=64""\`) in the ""db"" key -> the system returns an error \`""db"" config value is too long\` | | Passed | +| T3211791 | The system validates the db name after starting the pipeline -> the error ""database \\""example_name\\"" doesn't exist"" occurs if the db name is invalid | | Passed | +| T3211792 | The user can create the MongoDB destination connector with a valid db name (The name of a database the connector must work with.) | | Passed | +| T3211793 | Check that the MongoDB destination connector can't be created without the ""collection"" key -> the system returns an error \`""collection"" config value must be set\` | | Passed | +| T3211794 | The user can't create the MongoDB destination connector with an empty ""collection"" key -> the system returns an error \`""collection"" config value must be set\` | | Passed | +| T3211795 | The user can enter any characters, letters, and digits in the ""collection"" key | | Passed | +| T3211796 | The user can create the MongoDB destination connector with unlimited characters in the ""collection"" key | | Passed | +| T3211797 | The system validates the collection name after starting the pipeline -> the error ""collection \\""example_collection\\"" doesn't exist"" occurs if the collection name is invalid | | Passed | +| T3211798 | The user can create the MongoDB destination connector with a valid collection name (
The name of a collection the connector must write to.) | | Passed | +| T3211799 | The user can't create the MongoDB destination connector when the Pipeline is running | | Passed | +| T3211800 | \--------------------- Not required config ------------------- | | Passed | +| T3211856 | Check that the ""uri"" key isn't necessary and the user can create the MongoDB destination connector without this key (or with an empty value) (the default is mongodb://localhost:27017) | It isn't a connector issue

At the current time, we can't test this case

There is a known issue with the MongoDB docker image that is documented on the official website (see https://www.mongodb.com/docs/ and https://www.mongodb.com/compatibility/deploying-a-mongodb-cluster-with-docker) | Skipped | +| T3211857 | The user can enter any characters, letters, and digits in the ""uri"" key | | Passed | +| T3211858 | The user can create the MongoDB destination connector with unlimited characters in the ""uri"" key | | Passed | +| T3211859 | The system validates the URI used to connect to MongoDB after starting the pipeline -> the error occurs if the URI used to connect to MongoDB is invalid | | Passed | +| T3211860 | The user can create the MongoDB destination connector with a valid URI to connect to MongoDB (The URI can contain host names, IPv4/IPv6 literals, or an SRV record.) | | Passed | +| T3211801 | \------- auth.username config | | Passed | +| T3211802 | Check that the ""auth.username"" key isn't necessary. The user can create the MongoDB destination connector without this key (or with an empty value) if the MongoDB doesn't use username/password authentication | | Passed | +| T3211803 | Check that the MongoDB destination connector can't be created without the ""auth.username"" key (Required) if the MongoDB has auth.username/auth.password authentication enabled -> the system returns an error after starting the pipeline | | Passed | +| T3211804 | The user can enter any characters, letters, and digits in the ""auth.username"" key | | Passed | +| T3211805 | The user can create the MongoDB destination connector with unlimited characters in the ""auth.username"" key | | Passed | +| T3211806 | The system validates the auth.username for the MongoDB account after starting the pipeline -> the error occurs if the auth.username for the MongoDB account is invalid | | Passed | +| T3211807 | 1\. The MongoDB doesn't use auth.username/auth.password -> 2. Create the MongoDB destination connector with a value in the ""auth.username"" key -> 3. 
Start the pipeline -> the MongoDB system returns an error | | Passed | +| T3265907 | 1\. MongoDB doesn't use auth.username/auth.password but uses a certificate -> 2. Create the MongoDB dest connector with a value in the ""auth.username"" key -> 3. Start the pipeline -> the MongoDB system ignores the ""auth.username""; data is transferred | | Passed | +| T3211808 | The user can create the MongoDB destination connector with a valid auth.username for the MongoDB account (example: some_username) | | Passed | +| T3211809 | \------- auth.password config | | Passed | +| T3211810 | Check that the ""auth.password"" key isn't necessary. The user can create the MongoDB destination connector without this key (or with an empty value) if the MongoDB doesn't use auth.username/auth.password | | Passed | +| T3211811 | Check that the MongoDB destination connector can't be created without the ""auth.password"" key (Required) if the MongoDB has auth.username/auth.password authentication enabled -> the system returns an error after starting the pipeline | | Passed | +| T3211812 | The user can enter any characters, letters, and digits in the ""auth.password"" key | | Passed | +| T3211813 | The user can create the MongoDB destination connector with unlimited characters in the ""auth.password"" key | | Passed | +| T3211814 | The system validates the auth.password for the MongoDB account after starting the pipeline -> the error occurs if the password for the MongoDB account is invalid | | Passed | +| T3211815 | 1\. The MongoDB doesn't use auth.username/auth.password -> 2. Create the MongoDB destination connector with a value in the ""auth.password"" key -> 3. Start the pipeline -> the MongoDB system returns an error | | Passed | +| T3265902 | 1\. MongoDB doesn't use auth.username/auth.password but uses a certificate -> 2. Create the MongoDB dest connector with a value in the ""auth.password"" key -> 3. 
Start the pipeline -> the MongoDB system ignores the ""auth.password""; data is transferred | | Passed | +| T3211816 | The user can create the MongoDB destination connector with a valid auth.password for the MongoDB account (example: some_password) | | Passed | +| T3211817 | \------- auth.db config | | Passed | +| T3211818 | Check that the ""auth.db"" key isn't necessary and the user can create the MongoDB destination connector without this key (or with an empty value) (the default is admin) | | Passed | +| T3255932 | The MongoDB destination can't work without the ""auth.password""/""auth.username"" (or ""auth.tls.certificateKeyFile""), ""auth.mechanism"" keys (Required) if a valid value is entered in the ""auth.db"" config -> the system returns an error after starting the pipeline | | Passed | +| T3211819 | The user can enter any characters, letters, and digits in the ""auth.db"" key | | Passed | +| T3211820 | The user can create the MongoDB destination connector with unlimited characters in the ""auth.db"" key | | Passed | +| T3211821 | The system validates the auth.db name after starting the pipeline -> the system ignores this field and uses the default db name if the db name is invalid; data is transferred | | Passed | +| T3211822 | The user can create the MongoDB destination connector with a valid auth.db name (
The name of a database that contains the user's authentication data.) | | Passed | +| T3211823 | \------- auth.mechanism config | | Passed | +| T3211824 | Check that the ""auth.mechanism"" key isn't necessary. The user can create the MongoDB destination connector without this key if the MongoDB doesn't use auth.tlsCAFile/auth.tlsCertificateKeyFile | | Passed | +| T3211862 | The MongoDB destination can't work without the ""auth.mechanism"" key (Required; need to use MONGODB-X509) if the MongoDB has an auth.tlsCertificateKeyFile -> the system returns an error after starting the pipeline | | Passed | +| T3211863 | The user can create the MongoDB dest without the ""auth.mechanism"" if MongoDB has auth.username/auth.password (The default mechanism depends on your MongoDB server version - https://www.mongodb.com/docs/drivers/go/current/fundamentals/auth/#default) | | Passed | +| T3211825 | The user can't create the MongoDB destination connector with an invalid value in the ""auth.mechanism"" key (for example, any text or digits) -> the system returns an error | | Passed | +| T3211826 | The user can create the MongoDB Destination with a lowercase value (scram-sha-256, scram-sha-1, mongodb-cr, mongodb-aws, mongodb-x509) in the ""auth.mechanism"" key -> the system transforms the value to uppercase; data is transferred after starting | | Passed | +| T3211827 | 1\. MongoDB doesn't use auth.username/auth.password (or auth.tlsCAFile/auth.tlsCertificateKeyFile) -> 2. Enter a valid value in the ""auth.mechanism"" config and create the MongoDB destination -> the system returns an error after starting the pipeline | | Passed | +| T3211828 | The user can create the MongoDB destination connector with a valid auth.mechanism (supported mechanisms: - password/username: SCRAM-SHA-256, SCRAM-SHA-1, MONGODB-CR, MONGODB-AWS; - certificates: MONGODB-X509) | | Passed | +| T3211829 | \------- auth.tlsCAFile config | | Passed | +| T3211830 | Check that the ""auth.tlsCAFile"" key isn't necessary. 
The user can create the MongoDB destination connector without this key (or with an empty value) if the MongoDB doesn't use a bundle of certificate authorities to trust when making a TLS connection | | Passed | +| T3211831 | Check that the Mongo Dest connector can't work without the ""auth.tlsCAFile"" key (Required) if the MongoDB uses a bundle of certificate authorities to trust when making a TLS connection -> the system returns an error after starting the pipeline | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB

| Skipped | +| T3211835 | The user can't create the MongoDB destination connector with an invalid path in the ""auth.tlsCAFile"" key -> the system returns an error | | Passed | +| T3211836 | The system returns an error after running the pipeline if the user created the MongoDB destination connector with an invalid auth.tlsCAFile file | | Passed | +| T3211837 | MongoDB doesn't use certificates. Create the MongoDB destination with the following configs: 1. Enter a valid value in ""auth.tlsCAFile"" & ""auth.tlsCertificateKeyFile"" -> 2. Enter ""auth.mechanism"": ""MONGODB-X509"" -> the system returns an error after starting | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB | Skipped | +| T3211838 | The user can create the MongoDB destination connector with a valid auth.tlsCAFile and file path | We can't check this case in the current version of MongoDB

The certificates are used in the old version of MongoDB | Skipped | +| T3211839 | \------- auth.tlsCertificateKeyFile config | | Passed | +| T3211840 | Check that the ""auth.tlsCertificateKeyFile"" key isn't necessary. The user can create the MongoDB destination connector without this key (or with an empty value) if the MongoDB doesn't use the client certificate file or the client private key file | | Passed | +| T3211841 | Check that the Mongo destination connector can't work without the ""auth.tlsCertificateKeyFile"" key (Required) if the MongoDB uses the client certificate file or the client private key file -> the system returns an error after starting the pipeline | | Passed | +| T3211845 | The user can't create the MongoDB destination connector with an invalid path in the ""auth.tlsCertificateKeyFile"" key -> the system returns an error | | Passed | +| T3211846 | The system returns an error after running the pipeline if the user created the MongoDB destination connector with an invalid auth.tlsCertificateKeyFile | | Passed | +| T3211847 | MongoDB doesn't use certificates. Create the MongoDB destination with the following configs: 1. Enter a valid value in ""auth.tlsCAFile"" & ""auth.tlsCertificateKeyFile"" -> 2. Enter ""auth.mechanism"": ""MONGODB-X509"" -> the system returns an error after starting | | Passed | +| T3265980 | 1\. MongoDB uses auth.username/auth.password and doesn't use a certificate -> 2. Create the MongoDB dest with a value in the ""auth.tls.certificateKeyFile"" -> 3. 
Start the pipeline -> the MongoDB system ignores the certificate; data is transferred | | Passed | +| T3211848 | The user can create the MongoDB destination connector with a valid auth.tlsCertificateKeyFile and file path | | Passed | +| T3211865 | The data from the MongoDB source connector is transferred to the Postgres Destination connector | | Passed | +| T3211866 | The data from the MongoDB source connector is transferred to the Materialize Destination connector | | Passed | +| T3211889 | The data from the MongoDB source connector is transferred to the NATS Pub/Sub Destination connector | | Passed | +| T3211893 | The data from the MongoDB source connector is transferred to the Vitess Destination connector | | Passed | +| T3211907 | The data from the MongoDB source connector is transferred to the Clickhouse Destination connector | | Passed | +| T3211908 | The data from the MongoDB source connector is transferred to the HubSpot Destination connector | | Passed | +| T3211909 | The data from the MongoDB source connector is transferred to the SQL Server Destination connector | | Passed | +| T3211965 | The data from the MongoDB source connector is transferred to the MongoDB Destination connector | | Passed | +| T3211867 | Data from two different MongoDB source connectors are transferred to the Postgres Destination connector | | Passed | +| T3211868 | Data from two different MongoDB source connectors are transferred to the Materialize Destination connector | | Passed | +| T3211890 | Data from two different MongoDB source connectors are transferred to the NATS PubSub Destination connector | | Passed | +| T3211894 | Data from two different MongoDB source connectors are transferred to the Vitess Destination connector | | Passed | +| T3211910 | Data from two different MongoDB source connectors are transferred to the Clickhouse Destination connector | | Passed | +| T3211911 | Data from two different MongoDB source connectors are transferred to the SQL Server Destination 
connector | | Passed | +| T3211966 | Data from two different MongoDB source connectors are transferred to the MongoDB Destination connector | | Passed | +| T3211869 | The user can transfer data using one pipeline from two (or more) different MongoDB sources to two (or more) Destinations: Materialize, S3, File, Kafka, NATS Pub/Sub, NATS JS, GCP Pub/Sub, and others | | Passed | +| T3211870 | Try to transfer data from two MongoDB Sources with the same config to the Postgres Destination -> data is transferred and isn't duplicated | | Passed | +| T3211871 | Try to transfer data from two MongoDB Sources with the same config to the Materialize Destination -> data is transferred and duplicated | | Passed | +| T3211891 | Try to transfer data from two MongoDB Sources with the same config to the NATS PubSub Destination -> data is transferred and duplicated | | Passed | +| T3211884 | Try to transfer data from two MongoDB Sources with the same config to the SQL Server Destination -> the data is recorded and duplicated (an error occurs if the SQL Server table has a primary key) | | Passed | +| T3211895 | Try to transfer data from two MongoDB Sources with the same config to the Vitess Destination -> data is recorded and is duplicated (data isn't duplicated if a primary key is used) | | Passed | +| T3211912 | Try to transfer data from two MongoDB Sources with the same config to the Clickhouse Destination -> data is transferred and is duplicated in the Clickhouse table (data isn't duplicated if the primary key is used in the Clickhouse table) | | Passed | +| T3211967 | Try to transfer data from two MongoDB Sources with the same config to the MongoDB Destination -> data is transferred and duplicated (is transferred once and an error occurs if the ""_id"" field is transferred in the payload) | | Passed | +| T3211913 | Try to transfer data from two MongoDB Sources with the same config to the HubSpot Destination -> data is transferred (data is recorded once and an error occurs 
for resource ""cms.blogs.authors"") | | Passed | +| T3211872 | 1\. Stop the pipeline and remove the MongoDB Source -> 2. Create a MongoDB Source with the same config from step 1 -> 3. Start the pipeline -> data is recorded to the destination (add.info: data in Postgres, Vitess, and DB2 is refreshed if a primary key is used) | | Passed | +| T3211873 | 1\. Create one MongoDB Source -> 2. Create two Postgres Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated (data isn't duplicated if a primary key is used) | | Passed | +| T3211874 | 1\. Create one MongoDB Source -> 2. Create two Materialize Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated | | Passed | +| T3211892 | 1\. Create one MongoDB Source -> 2. Create two NATS PubSub Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated | | Passed | +| T3211896 | 1\. Create one MongoDB Source -> 2. Create two Vitess Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated (data isn't duplicated if a primary key is used) | | Passed | +| T3211914 | 1\. Create one MongoDB Source -> 2. Create two SQL Server Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated (an error occurs if the SQL Server table has a primary key) | | Passed | +| T3211915 | 1\. Create one MongoDB Source -> 2. Create two HubSpot Destination connectors with the same config -> 3. Start the pipeline -> data is recorded and duplicated (data is recorded once and an error occurs for resource ""cms.blogs.authors"") | | Passed | +| T3211916 | 1\. Create one MongoDB Source -> 2. Create two Clickhouse Destination connectors with the same config -> 3. Start the pipeline -> data is transferred and is duplicated in the Clickhouse table | | Passed | +| T3211968 | 1\. Create one MongoDB Source -> 2. 
Create two MongoDB Destination connectors with the same config -> 3. Start the pipeline -> data is transferred and duplicated (is transferred once; an error occurs if the ""_id"" field is transferred in the payload) | | Passed | +| T3211875 | 1\. Create a MongoDB Source -> 2. Create a Destination -> 3. Start the pipeline -> the data is recorded to the Destination -> 4. Stop the pipeline and add a new Destination -> 5. Start the pipeline -> the previous data isn't recorded to the new Destination | | Passed | +| T3211876 | The new data that is added to the MongoDB table is transferred to the Destination while the pipeline is running (or after restarting the pipeline) | | Passed | +| T3211877 | Check that an error occurs after trying to transfer data from the MongoDB table that isn't used in the Vitess table | | Passed | +| T3211885 | Check that an error occurs after trying to transfer data from the MongoDB table that isn't used in the Postgres table | | Passed | +| T3211886 | Check that an error occurs after trying to transfer data from the MongoDB table that isn't used in the Materialize table | | Passed | +| T3211917 | Check that an error occurs after trying to transfer data from the MongoDB table that isn't used in the SQL Server table | | Passed | +| T3211918 | Check that an error occurs after trying to transfer data from the MongoDB table that isn't used in the Clickhouse table | | Passed | +| T3211919 | Check that an error occurs after trying to transfer data from the MongoDB table that isn't used in the HubSpot table | | Passed | +| T3211969 | Check that an error occurs after trying to transfer data from the MongoDB table that isn't used in the MongoDB table | | Passed | +| T3211878 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data is removed from the Postgres table | | Passed | +| T3211887 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data is removed from the Materialize table | | Passed | +| T3211898 | Remove data in the MongoDB table 
while the pipeline is running (or after restarting the pipeline) -> data is removed from the Vitess table | | Passed | +| T3211920 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data is removed from the SQL Server table | | Passed | +| T3211921 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data isn't removed from the Clickhouse table ( https://clickhouse.com/docs/en/engines/table-engines/#log) | | Passed | +| T3211923 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data is removed from the Clickhouse table | | Passed | +| T3211905 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data isn't removed from HubSpot; the data that was removed from MongoDB is duplicated | | Passed | +| T3211970 | Remove data in the MongoDB table while the pipeline is running (or after restarting the pipeline) -> data is removed from the MongoDB table | | Passed | +| T3211879 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the Postgres table | | Passed | +| T3211888 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the Materialize table | | Passed | +| T3211899 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the Vitess table (update doesn't work for a Vitess table that doesn't use a primary key; updated data is recorded as new and previous data isn't updated) | | Passed | +| T3211906 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the SQL Server table | | Passed | +| T3211925 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the Clickhouse table | | Passed | +| T3211926 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't updated in the Clickhouse table ( https://clickhouse.com/docs/en/engines/table-engines/#log) | | Passed | +| 
T3211927 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't updated in HubSpot; the old data that was updated in MongoDB is duplicated | | Passed | +| T3211971 | Update data in MongoDB while the pipeline is running (or after restarting the pipeline) -> data is updated in the MongoDB table | | Passed | +| T3211864 | 1\. The user creates the MongoDB source and destination connectors with the identical config -> 2. Start the pipeline -> data isn't transferred; an error ""E11000 duplicate key error"" occurs | | Passed | +| T3211880 | The user can use the MongoDB Source connector with the identical config in two (or more) pipelines at one time | | Passed | +| T3211897 | \-------- the MongoDB Source connector with not required config ---------------------- | | Passed | +| T3211881 | 1\. Create the MongoDB Source connector with {""batchSize"":""1""} -> 2. Create the Destination connector -> 3. Start the pipeline -> data is transferred | | Passed | +| T3211882 | 1\. Create the MongoDB Source connector with {""batchSize"":""100000""} -> 2. Create the Destination connector -> 3. Start the pipeline -> data is transferred | | Passed | +| T3211900 | 1\. Create the Mongo source with {""OrderingColumn"":""id""} -> 2. Start the pipeline -> data is transferred by ""id"" ordering (2;4;15;4) -> 3. Add data ""id"":""1"" in Mongo -> data id:15; 1 is transferred -> 4. Add data with ""id"":""16"" in Mongo -> data id:16 transfers | | Passed | +| T3211972 | 1\. Create the MongoDB source connector with {""snapshot"":""true""} -> 2. Create the Destination connector -> 3. Start the pipeline -> data that was added before starting the pipeline will be transferred | | Passed | +| T3211973 | 1\. Create the MongoDB source connector with {""snapshot"":""false""} -> 2. Create the Destination connector -> 3. 
start the pipeline -> data that was added before starting the pipeline will not be transferred | | Passed |
+| T3211928 | The data from the Stripe source connector is transferred to the MongoDB Destination connector | | Passed |
+| T3211929 | The data from the Snowflake source connector is transferred to the MongoDB Destination connector | | Passed |
+| T3211930 | The data from the NATS PubSub source connector is transferred to the MongoDB Destination connector | | Passed |
+| T3211931 | The data from the Oracle source connector is transferred to the MongoDB Destination connector | | Passed |
+| T3211954 | The data from the Vitess source connector is transferred to the MongoDB Destination connector | | Passed |
+| T3211957 | The data from the SQL Server source connector is transferred to the MongoDB Destination connector | | Passed |
+| T3211958 | The data from the HubSpot source connector is transferred to the MongoDB Destination connector | | Passed |
+| T3211959 | The data from the Clickhouse source connector is transferred to the MongoDB Destination connector | | Passed |
+| T3218670 | The data from the MongoDB source connector is transferred to the MongoDB Destination connector | | Passed |
+| T3211932 | Data from two different Stripe source connectors are transferred to the MongoDB Destination connector | | Passed |
+| T3211933 | Data from two different Snowflake source connectors are transferred to the MongoDB Destination connector | | Passed |
+| T3211951 | Data from two different NATS PubSub source connectors are transferred to the MongoDB Destination connector | | Passed |
+| T3211955 | Data from two different Oracle source connectors are transferred to the MongoDB Destination connector | | Passed |
+| T3211952 | Data from two different Vitess source connectors are transferred to the MongoDB Destination connector | | Passed |
+| T3223164 | Data from two different Clickhouse source connectors are transferred to the MongoDB Destination connector | | Passed |
+| T3211960 | Data from two different SQL Server source connectors are transferred to the MongoDB Destination connector | | Passed |
+| T3211961 | Data from two different MongoDB source connectors are transferred to the MongoDB Destination connector | | Passed |
+| T3211934 | Transfer data from the Stripe Source to two MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | | Passed |
+| T3211935 | Transfer data from the Snowflake Source to two MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | | Passed |
+| T3211936 | Transfer data from the NATS PubSub Source to two MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | | Passed |
+| T3211953 | Transfer data from the Vitess Source to two MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | | Passed |
+| T3211956 | Transfer data from the Oracle Source to two MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | | Passed |
+| T3211962 | Transfer data from the Clickhouse Source to two MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | | Passed |
+| T3211963 | Transfer data from the HubSpot Source to two MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | | Passed |
+| T3211964 | Transfer data from the SQL Server Source to two MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | | Passed |
+| T3211974 | Transfer data from the MongoDB Source to two MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once and an error occurs if the "_id" field is transferred in the payload) | | Passed |
+| T3211937 | Try to transfer data from two Source connectors with the same config to the MongoDB Destination -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | | Passed |
+| T3211938 | 1\. Create the Source and MongoDB Dest -> 2. Start the pipeline -> the data is recorded to MongoDB Dest -> 3. Stop the pipeline and add the new MongoDB Dest -> 4. Start the pipeline -> previous data isn't recorded to the new MongoDB Destination | | Passed |
+| T3211939 | 1\. Stop the pipeline and remove the Source -> 2. Create the Source with the same config from the 1st step -> 3. Start the pipeline -> data is duplicated to MongoDB Dest (isn't transferred and an error occurs if the "_id" field is transferred in the payload) | | Passed |
+| T3211940 | The user can transfer data using one pipeline from two (or more) different sources: Stripe, S3, File, Kafka, NATS JS, NATS P/S, HTTP, GCP P/S to two (or more) different MongoDB Destinations | | Passed |
+| T3211941 | The new data that is added to the Source connector is transferred to the MongoDB Destination while the pipeline is running (or after restarting the pipeline) | | Passed |
+| T3211942 | Check that an error occurs after trying to transfer data from the Source connector which isn't used in the MongoDB table | | Passed |
+| T3211943 | Remove data in Stripe (or Vitess) while the pipeline is running (or after restarting the pipeline) -> data is removed from the MongoDB table | | Passed |
+| T3211944 | Update data in Stripe (or Vitess) while the pipeline is running (or after restarting the pipeline) -> data is updated in the MongoDB table | | Passed |
+| T3211945 | Check that data is transferred from different resources of the Stripe connector (resources list https://github.com/ConduitIO/conduit-connector-stripe/tree/main/models/resources) to the MongoDB Destination | | Passed |
+| T3211946 | Transfer data with column names in the upper case (example: {"CITY":"Paris"}) -> the data is transferred to the MongoDB Destination (PAY ATTENTION: column names must be the same case in Source and MongoDB Destination or an error will occur) | | Passed |
+| T3211947 | Transfer data with column names in the lower case (example: {"city":"Paris"}) -> the data is transferred to the MongoDB Destination (PAY ATTENTION: column names must be the same case in Source and MongoDB Destination or an error will occur) | | Passed |
+| T3211948 | Transfer data with column names in the camel case (example: {"CiTy":"Paris"}) -> the data is transferred to the MongoDB Destination (PAY ATTENTION: column names must be the same case in Source and MongoDB Destination or an error will occur) | | Passed |
+| T3211950 | 1\. Create the MongoDB source connector -> 2. Create the MongoDB destination connector with the same config as the MongoDB source -> 3. Start the pipeline -> data isn't transmitted; an error "E11000 duplicate key error" occurs | | Passed |
+| T3211949 | The user can use the MongoDB Destination connector with the identical config in two (or more) pipelines at one time | | Passed |
diff --git a/docs/qa/test_run_2_8_2023___functional_testing_azure_cosmos_db_for_mongodb_source_destination_connector.md b/docs/qa/test_run_2_8_2023___functional_testing_azure_cosmos_db_for_mongodb_source_destination_connector.md
new file mode 100644
index 0000000..fefb4b5
--- /dev/null
+++ b/docs/qa/test_run_2_8_2023___functional_testing_azure_cosmos_db_for_mongodb_source_destination_connector.md
@@ -0,0 +1,149 @@
+**Test Run 2/8/2023 - Functional testing Azure Cosmos DB for MongoDB Source/Destination connector**
+
+**branch:** https://github.com/conduitio-labs/conduit-connector-mongo/tree/destination
+
+**commit** 4736288cfafd27f173cec45fded2453d6e76cc7a
+
+| ID | Title | Status | Comment |
+| -------- | ----- | ------- | ------------------------------------------ |
+| T3433115 | \----------------Required config------------------ | Passed | |
+| T3433094 | The user can't create the Azure Cosmos DB for MongoDB source connector without the "type" key -> the system returns an error | Passed | |
+| T3433099 | The user can't create the Azure Cosmos DB for MongoDB source connector with an empty (or an invalid format) value in the "type" key -> the system returns an error | Passed | |
+| T3433097 | Check that the user can create the Azure Cosmos DB for MongoDB source connector only with "type": "TYPE_SOURCE" | Passed | |
+| T3433095 | The user can't create the Azure Cosmos DB for MongoDB source connector without the "plugin" key -> the system returns an error | Passed | |
+| T3433098 | The user can't create the Azure Cosmos DB for MongoDB source connector with an empty (or an invalid format) value in the "plugin" key -> the system returns an error | Passed | |
+| T3433100 | The user can create the Azure Cosmos DB for MongoDB source connector with a valid plugin format | Passed | |
+| T3433096 | The user can't create the Azure Cosmos DB for MongoDB source connector without the "pipelineId" key -> the system returns an error | Passed | |
+| T3433101 | The user can't create the Azure Cosmos DB for MongoDB source connector with an empty (or an invalid ID) value in the "pipelineId" key -> the system returns an error | Passed | |
+| T3433102 | The user can create the Azure Cosmos DB for MongoDB source connector with a valid pipeline ID | Passed | |
+| T3433112 | Check that the Azure Cosmos DB for MongoDB source connector can't be created without the "db" key -> the system returns an error \`"db" config value must be set\` | Passed | |
+| T3433113 | The user can't create the Azure Cosmos DB for MongoDB source connector with an empty "db" key -> the system returns an error \`"db" config value must be set\` | Passed | |
+| T3433114 | The user can enter any characters, letters, and digits in the "db" key | Passed | |
+| T3433117 | The user can't create the Azure Cosmos DB for MongoDB source connector with a long name (\`validate:"required,max=64"\`) in the "db" key -> the system returns an error \`"db" config value is too long\` | Passed | |
+| T3433118 | The system validates the db name after starting the pipeline -> the error "database \\"example_name\\" doesn't exist" occurs if the db name is invalid | Passed | |
+| T3433119 | The user can create the Azure Cosmos DB for MongoDB source connector with a valid db name (The name of a database the connector must work with.) | Passed | |
+| T3433122 | Check that the Azure Cosmos DB for MongoDB source connector can't be created without the "collection" key -> the system returns an error \`"collection" config value must be set\` | Passed | |
+| T3433123 | The user can't create the Azure Cosmos DB for MongoDB source connector with an empty "collection" key -> the system returns an error \`"collection" config value must be set\` | Passed | |
+| T3433124 | The user can enter any characters, letters, and digits in the "collection" key | Passed | |
+| T3433125 | The user can create the Azure Cosmos DB for MongoDB source connector with unlimited characters in the "collection" key | Passed | |
+| T3433126 | The system validates the collection name after starting the pipeline -> the error "collection \\"example_collection\\" doesn't exist" occurs if the collection name is invalid | Passed | |
+| T3433127 | The user can create the Azure Cosmos DB for MongoDB source connector with a valid collection name (The name of a collection the connector must read from.) | Passed | |
+| T3433104 | The user can't create the Azure Cosmos DB for MongoDB source connector when the Pipeline is running | Passed | |
+| T3433116 | \--------------------- No required config ------------------- | Passed | |
+| T3433093 | Check that the "uri" key isn't necessary and the user can create the Azure Cosmos DB for MongoDB source connector without (or with an empty) this key (the default is mongodb://localhost:27017) | Skipped | Can't be reproduced on Version Mongo 6.0.1 |
+| T3433111 | The user can enter any characters, letters, and digits in the "uri" key | Passed | |
+| T3433121 | The user can create the Azure Cosmos DB for MongoDB source connector with unlimited characters in the "uri" key | Passed | |
+| T3433110 | The system validates the URI to connect to Azure Cosmos DB for MongoDB after starting the pipeline -> an error occurs if the URI to connect to Azure Cosmos DB for MongoDB is invalid | Passed | |
+| T3433103 | The user can create the Azure Cosmos DB for MongoDB source connector with a valid URI to connect to the Azure Cosmos DB for MongoDB (The URI can contain host names, IPv4/IPv6 literals, or an SRV record.) | Passed | |
+| T3433139 | \------- batchSize config | Passed | |
+| T3433105 | Check that the "batchSize" key isn't necessary and the user can create the Azure Cosmos DB for MongoDB source connector without (or with an empty) this key (by default the count of records in one batch is 1000) | Passed | |
+| T3433106 | The user can't create the Azure Cosmos DB for MongoDB source connector with an invalid value in the "batchSize" key (for example: text, a floating-point number) -> the system returns an error \`"batchSize" config value must be int\` | Passed | |
+| T3433108 | The user can't create the Azure Cosmos DB for MongoDB source connector with a value of more than 100000 in the "batchSize" key -> the system returns an error "\\"batchSize\\" value must be less than or equal to 100000" | Passed | |
+| T3433109 | The user can't create the Azure Cosmos DB for MongoDB source connector with a value of less than 1 in the "batchSize" key -> the system returns an error | Passed | |
+| T3433107 | The user can create the Azure Cosmos DB for MongoDB source connector with a valid number of records (for example 50) in the "batchSize" key | Passed | |
+| T3433120 | The user can create the Azure Cosmos DB for MongoDB source connector with config "batchSize":"+00010" -> the system parses it as "1" | Passed | |
+| T3433128 | \------- snapshot config | Passed | |
+| T3433129 | Check that the "snapshot" key isn't necessary and the user can create the Azure Cosmos DB for MongoDB source connector without (or with an empty) this key (by default it is true -> data that was added before starting the pipeline will be transferred) | Passed | |
+| T3433130 | The user can't create the Azure Cosmos DB for MongoDB source connector with invalid values in the "snapshot" key (for example any text or digits, except for valid values) -> the system returns an error | Passed | |
+| T3433131 | The user can create the Azure Cosmos DB for MongoDB source connector with a valid value in the "snapshot" key ("1", "t", "T", "true", "TRUE", "True") -> data that was added before starting the pipeline will be transferred | Passed | |
+| T3433132 | Create Azure Cosmos DB for MongoDB source with "snapshot":"false" ("0","f","F","false","FALSE","False") -> data that was added before starting the pipeline will not be transferred (if data is added during a pause, it will be transferred after restarting) | Passed | |
+| T3433138 | \------- orderingField config | Passed | |
+| T3433133 | Check that the "orderingField" key isn't necessary and the user can create the Azure Cosmos DB for MongoDB source connector without (or with an empty) this key (by default it is "_id") | Passed | |
+| T3433134 | The user can enter any characters, letters, and digits in the "orderingField" key | Passed | |
+| T3433135 | The user can create the Azure Cosmos DB for MongoDB source connector with an unlimited number of characters in the "orderingField" config | Passed | |
+| T3433136 | Create Azure Cosmos DB for MongoDB Source with an invalid field name that will be used for ordering -> the system ignores the invalid name (There is no orderingField validation, as documents in different collections can have different schemas) | Passed | |
+| T3433137 | The user can create the Azure Cosmos DB for MongoDB source connector with a valid name of the field used for ordering (the user can use any existing field; ATTENTION: the field must use a unique index in MongoDB) | Passed | |
+| T3433140 | \----------------Required config------------------ | Passed | |
+| T3433141 | The user can't create the Azure Cosmos DB for MongoDB destination connector without the "type" key -> the system returns an error | Passed | |
+| T3433142 | The user can't create the Azure Cosmos DB for MongoDB destination connector with an empty (or an invalid format) value in the "type" key -> the system returns an error | Passed | |
+| T3433143 | Check that the user can create the Azure Cosmos DB for MongoDB destination connector only with "type": "TYPE_DESTINATION" | Passed | |
+| T3433144 | The user can't create the Azure Cosmos DB for MongoDB destination connector without the "plugin" key -> the system returns an error | Passed | |
+| T3433145 | The user can't create the Azure Cosmos DB for MongoDB destination connector with an empty (or an invalid format) value in the "plugin" key -> the system returns an error | Passed | |
+| T3433146 | The user can create the Azure Cosmos DB for MongoDB destination connector with a valid plugin format | Passed | |
+| T3433147 | The user can't create the Azure Cosmos DB for MongoDB destination connector without the "pipelineId" key -> the system returns an error | Passed | |
+| T3433148 | The user can't create the Azure Cosmos DB for MongoDB destination connector with an empty (or an invalid ID) value in the "pipelineId" key -> the system returns an error | Passed | |
+| T3433149 | The user can create the Azure Cosmos DB for MongoDB destination connector with a valid pipeline ID | Passed | |
+| T3433151 | Check that the Azure Cosmos DB for MongoDB destination connector can't be created without the "db" key -> the system returns an error \`"db" config value must be set\` | Passed | |
+| T3433152 | The user can't create the Azure Cosmos DB for MongoDB destination connector with an empty "db" key -> the system returns an error \`"db" config value must be set\` | Passed | |
+| T3433153 | The user can enter any characters, letters, and digits in the "db" key | Passed | |
+| T3433154 | The user can't create the Azure Cosmos DB for MongoDB destination connector with a long name (\`validate:"required,max=64"\`) in the "db" key -> the system returns an error \`"db" config value is too long\` | Passed | |
+| T3433155 | The system validates the db name after starting the pipeline -> the error "database \\"example_name\\" doesn't exist" occurs if the db name is invalid | Passed | |
+| T3433156 | The user can create the Azure Cosmos DB for MongoDB destination connector with a valid db name (The name of a database the connector must work with.) | Passed | |
+| T3433157 | Check that the Azure Cosmos DB for MongoDB destination connector can't be created without the "collection" key -> the system returns an error \`"collection" config value must be set\` | Passed | |
+| T3433158 | The user can't create the Azure Cosmos DB for MongoDB destination connector with an empty "collection" key -> the system returns an error \`"collection" config value must be set\` | Passed | |
+| T3433159 | The user can enter any characters, letters, and digits in the "collection" key | Passed | |
+| T3433160 | The user can create the Azure Cosmos DB for MongoDB destination connector with unlimited characters in the "collection" key | Passed | |
+| T3433161 | The system validates the collection name after starting the pipeline -> the error "collection \\"example_collection\\" doesn't exist" occurs if the collection name is invalid | Passed | |
+| T3433162 | The user can create the Azure Cosmos DB for MongoDB destination connector with a valid collection name (The name of a collection the connector must write to.) | Passed | |
+| T3433163 | The user can't create the Azure Cosmos DB for MongoDB destination connector when the Pipeline is running | Passed | |
+| T3433164 | \--------------------- No required config ------------------- | Passed | |
+| T3433165 | Check that the "uri" key isn't necessary and the user can create the Azure Cosmos DB for MongoDB destination connector without (or with an empty) this key (the default is mongodb://localhost:27017) | Skipped | Can't be reproduced on Version Mongo 6.0.1 |
+| T3433166 | The user can enter any characters, letters, and digits in the "uri" key | Passed | |
+| T3433167 | The user can create the Azure Cosmos DB for MongoDB destination connector with unlimited characters in the "uri" key | Passed | |
+| T3433168 | The system validates the URI to connect to MongoDB after starting the pipeline -> an error occurs if the URI to connect to MongoDB is invalid | Passed | |
+| T3433169 | The user can create the Azure Cosmos DB for MongoDB destination connector with a valid URI to connect to the MongoDB (The URI can contain host names, IPv4/IPv6 literals, or an SRV record.) | Passed | |
+| T3447653 | The data from the Azure Cosmos DB for MongoDB source connector is transferred to the Materialize Destination connector | Passed | |
+| T3433177 | The data from the Azure Cosmos DB for MongoDB source connector is transferred to the NATS Pub/Sub Destination connector | Passed | |
+| T3447660 | The data from the Azure Cosmos DB for MongoDB source connector is transferred to the MongoDB Destination connector | Passed | |
+| T3433206 | The data from the Azure Cosmos DB for MongoDB source connector is transferred to the SAP HANA Destination connector | Passed | |
+| T3433207 | The data from the Azure Cosmos DB for MongoDB source connector is transferred to the Neo4j Destination connector | Passed | |
+| T3447654 | Data from two different Azure Cosmos DB for MongoDB source connectors are transferred to the Materialize Destination connector | Passed | |
+| T3433178 | Data from two different Azure Cosmos DB for MongoDB source connectors are transferred to the NATS PubSub Destination connector | Passed | |
+| T3447661 | Data from two different Azure Cosmos DB for MongoDB source connectors are transferred to the MongoDB Destination connector | Passed | |
+| T3433208 | Data from two different Azure Cosmos DB for MongoDB source connectors are transferred to the SAP HANA Destination connector | Passed | |
+| T3433209 | Data from two different Azure Cosmos DB for MongoDB source connectors are transferred to the Neo4j Destination connector | Passed | |
+| T3433170 | The user can transfer data using one pipeline from two (or more) different MongoDB sources to two (or more) Destinations: Materialize, S3, File, Kafka, NATS Pub/Sub, NATS JS, GCP Pub/Sub, and others | Passed | |
+| T3447655 | Try to transfer data from two Azure Cosmos DB for MongoDB Sources with the same config to the Materialize Destination -> data is transferred and duplicated | Passed | |
+| T3433179 | Try to transfer data from two Azure Cosmos DB for MongoDB Sources with the same config to the NATS PubSub Destination -> data is transferred and duplicated | Passed | |
+| T3447662 | Try to transfer data from two Azure Cosmos DB for MongoDB Sources with the same config to the MongoDB Destination -> data is transferred and duplicated (is transferred once and an error occurs if the "_id" field is transferred in the payload) | Passed | |
+| T3433210 | Try to transfer data from two Azure Cosmos DB for MongoDB Sources with the same config to the Neo4j Destination -> data is transferred and duplicated | Passed | |
+| T3433211 | Try to transfer data from two Azure Cosmos DB for MongoDB Sources with the same config to the SAP HANA Destination -> data is recorded and duplicated (data isn't duplicated if a primary key is used) | Passed | |
+| T3433171 | The user can edit the Azure Cosmos DB for MongoDB Source connector (for example: change the value in the "collection" key) and start the pipeline -> new source data is transferred to the destination | Passed | |
+| T3433172 | 1\. Stop the pipeline and remove the Azure Cosmos DB for MongoDB Source -> 2. Create the Source with the same configs from the 1st step -> 3. Start the pipeline -> data is recorded to the dest (add. info: data in Postgres, Vitess, and DB2 is refreshed if a primary key is used) | Passed | |
+| T3447656 | 1\. Create one Azure Cosmos DB for MongoDB Source -> 2. Create two Materialize Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated | Passed | |
+| T3433180 | 1\. Create one Azure Cosmos DB for MongoDB Source -> 2. Create two NATS PubSub Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated | Passed | |
+| T3447663 | 1\. Create one Azure Cosmos DB for MongoDB Source -> 2. Create two MongoDB Destinations with the same config -> 3. Start the pipeline -> data is transferred and duplicated (transferred once; an error occurs if the "_id" field is transferred in the payload) | Passed | |
+| T3433212 | 1\. Create one Azure Cosmos DB for MongoDB Source -> 2. Create two Neo4j Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated | Passed | |
+| T3433213 | 1\. Create one Azure Cosmos DB for MongoDB Source -> 2. Create two SAP HANA Destination connectors with the same config -> 3. Start the pipeline -> the data is recorded and duplicated (data isn't duplicated if a primary key is used) | Passed | |
+| T3433173 | The new data that is added to the Azure Cosmos DB for MongoDB table is transferred to the Destination while the pipeline is running (or after restarting the pipeline) | Passed | |
+| T3447657 | Check that an error occurs after trying to transfer data from the Azure Cosmos DB for MongoDB which isn't used in the Materialize table | Passed | |
+| T3433183 | Check that an error occurs after trying to transfer data from the Azure Cosmos DB for MongoDB which isn't used in the SAP HANA table | Passed | |
+| T3447658 | Remove data in Azure Cosmos DB for MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't removed from the Materialize table | Passed | |
+| T3447664 | Remove data in Azure Cosmos DB for MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't removed from MongoDB | Passed | |
+| T3433202 | Remove data in Azure Cosmos DB for MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't removed from the SAP HANA table | Passed | |
+| T3433203 | Remove data in Azure Cosmos DB for MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't removed from Neo4j | Passed | |
+| T3447659 | Update data in Azure Cosmos DB for MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't updated in the Materialize table | Passed | |
+| T3447665 | Update data in Azure Cosmos DB for MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't updated in MongoDB | Passed | |
+| T3433204 | Update data in Azure Cosmos DB for MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't updated in the SAP HANA table | Passed | |
+| T3433205 | Update data in Azure Cosmos DB for MongoDB while the pipeline is running (or after restarting the pipeline) -> data isn't updated in Neo4j | Passed | |
+| T3433174 | The user can use the Azure Cosmos DB for MongoDB Source connector with the identical config in two (or more) pipelines at one time | Passed | |
+| T3433181 | \-------- the Azure Cosmos DB for MongoDB Source connector with not required config ---------------------- | Passed | |
+| T3433175 | 1\. Create the Azure Cosmos DB for MongoDB Source connector with {"batchSize":"1"} -> 2. Create the Destination connector -> 3. Start the pipeline -> data is transferred | Passed | |
+| T3433176 | 1\. Create the Azure Cosmos DB for MongoDB Source connector with {"batchSize":"100000"} -> 2. Create the Destination connector -> 3. Start the pipeline -> data is transferred | Passed | |
+| T3433182 | 1\. Create Azure Cosmos DB for MongoDB source with {"orderingField":"id"} -> 2. Start the pipeline -> data is recorded by "id" ordering (2;4;15) -> 3. Add data "id":"1" in Mongo -> data "id":"1" isn't recorded -> 4. Add data "id":"16" -> data "id":"16" is recorded | Passed | |
+| T3433184 | 1\. Create the Azure Cosmos DB for MongoDB source connector with {"snapshot":"true"} -> 2. Create the Destination connector -> 3. Start the pipeline -> data that was added before starting the pipeline will be transferred | Passed | |
+| T3433185 | 1\. Create the Azure Cosmos DB for MongoDB source connector with {"snapshot":"false"} -> 2. Create the Destination connector -> 3. Start the pipeline -> data that was added before starting the pipeline will not be transferred | Passed | |
+| T3433187 | The data from the Stripe source connector is transferred to the Azure Cosmos DB for MongoDB Destination connector | Passed | |
+| T3433188 | The data from the NATS PubSub source connector is transferred to the Azure Cosmos DB for MongoDB Destination connector | Passed | |
+| T3447676 | The data from the MongoDB source connector is transferred to the Azure Cosmos DB for MongoDB Destination connector | Passed | |
+| T3447678 | The data from the Neo4j source connector is transferred to the Azure Cosmos DB for MongoDB Destination connector | Passed | |
+| T3433214 | The data from the SAP HANA source connector is transferred to the Azure Cosmos DB for MongoDB Destination connector | Passed | |
+| T3433189 | Data from two different Stripe source connectors are transferred to the Azure Cosmos DB for MongoDB Destination connector | Passed | |
+| T3433198 | Data from two different NATS PubSub source connectors are transferred to the Azure Cosmos DB for MongoDB Destination connector | Passed | |
+| T3447675 | Data from two different MongoDB source connectors are transferred to the Azure Cosmos DB for MongoDB Destination connector | Passed | |
+| T3447679 | Data from two different Neo4j source connectors are transferred to the Azure Cosmos DB for MongoDB Destination connector | Passed | |
+| T3433215 | Data from two different SAP HANA source connectors are transferred to the Azure Cosmos DB for MongoDB Destination connector | Passed | |
+| T3433190 | Transfer data from the Stripe Source to two Azure Cosmos DB for MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | Passed | |
+| T3433191 | Transfer data from the NATS PubSub Source to two Azure Cosmos DB for MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | Passed | |
+| T3447677 | Transfer data from the MongoDB Source to two Azure Cosmos DB for MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once and an error occurs if the "_id" field is transferred in the payload) | Passed | |
+| T3433216 | Transfer data from the SAP HANA Source to two Azure Cosmos DB for MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | Passed | |
+| T3447680 | Transfer data from the Neo4j Source to two Azure Cosmos DB for MongoDB Destinations with the same config -> data is transferred and duplicated (is transferred once and an error occurs if the "_id" field is transferred in the payload) | Passed | |
+| T3447672 | Try to transfer data from two Source connectors with the same config to the Azure Cosmos DB for MongoDB Destination -> data is transferred and duplicated (is transferred once; an error occurs if the "_id" field is transferred in the payload) | Passed | |
+| T3447673 | 1\. Create the Source and Azure Cosmos DB for MongoDB Dest -> 2. Start the pipeline -> the data is recorded to Dest -> 3. Stop the pipeline and add a new Azure Cosmos DB for MongoDB Dest -> 4. Start the pipeline -> previous data isn't recorded to the new Azure Cosmos DB Dest | Passed | |
+| T3433192 | The user can edit the Azure Cosmos DB for MongoDB Dest (example: change the name in the "collection" key) and start the pipeline -> new source data is transferred to the new MongoDB collection; data isn't transferred to the previous MongoDB collection | Passed | |
+| T3433193 | 1\. Stop the pipeline and remove the Source -> 2. Create the Source with the same configs from the 1st step -> 3. Start the pipeline -> data is duplicated to Azure Cosmos DB for MongoDB Dest (isn't transferred and an error occurs if the "_id" field is transferred in the payload) | Passed | |
+| T3447674 | The user can transfer data using one pipeline from two (or more) different sources: Stripe, S3, File, Kafka, NATS JS, NATS P/S, HTTP, GCP P/S to two (or more) different Azure Cosmos DB for MongoDB Destinations | Passed | |
+| T3433194 | The new data that is added to the Source connector is transferred to the Azure Cosmos DB for MongoDB Destination while the pipeline is running (or after restarting the pipeline) | Passed | |
+| T3433195 | Remove data in Stripe (or Vitess) while the pipeline is running (or after restarting the pipeline) -> data is removed from the Azure Cosmos DB for MongoDB | Passed | |
+| T3433196 | Update data in Stripe (or Vitess) while the pipeline is running (or after restarting the pipeline) -> data is updated in the Azure Cosmos DB for MongoDB | Passed | |
+| T3433197 | The user can use the Azure Cosmos DB for MongoDB Destination connector with the identical config in two (or more) pipelines at one time | Passed | |
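Most of the source-connector config rows above exercise the same validation rules: "db" and "collection" are required, "db" is capped at 64 characters, "batchSize" must be an integer between 1 and 100000, "snapshot" accepts only the Go-style boolean literals listed in T3433131/T3433132, and "uri", "batchSize", "snapshot", and "orderingField" have defaults. As a compact cross-reference for these test cases, here is a minimal Python sketch of those rules; the function name and exact error strings are illustrative (the connector itself is written in Go), and only the key names, limits, and defaults quoted in the test titles are taken from this report.

```python
# Illustrative sketch of the config-validation rules exercised by the
# tests above; NOT the connector's actual implementation.
TRUE_VALUES = {"1", "t", "T", "true", "TRUE", "True"}      # per T3433131
FALSE_VALUES = {"0", "f", "F", "false", "FALSE", "False"}  # per T3433132

def validate_source_config(cfg: dict) -> dict:
    """Validate a source-connector config dict and apply defaults."""
    out = {
        # Defaults per T3433093, T3433133.
        "uri": cfg.get("uri") or "mongodb://localhost:27017",
        "orderingField": cfg.get("orderingField") or "_id",
    }

    db = cfg.get("db") or ""
    if not db:
        raise ValueError('"db" config value must be set')       # T3433112/T3433113
    if len(db) > 64:
        raise ValueError('"db" config value is too long')       # T3433117
    out["db"] = db

    collection = cfg.get("collection") or ""
    if not collection:
        raise ValueError('"collection" config value must be set')  # T3433122/T3433123
    out["collection"] = collection

    raw = cfg.get("batchSize") or "1000"                        # default per T3433105
    try:
        batch = int(raw)  # int() accepts signed, zero-padded strings like "+00010"
    except (TypeError, ValueError):
        raise ValueError('"batchSize" config value must be int')   # T3433106
    if batch > 100000:
        raise ValueError('"batchSize" value must be less than or equal to 100000')  # T3433108
    if batch < 1:
        raise ValueError('"batchSize" value must be greater than or equal to 1')    # T3433109
    out["batchSize"] = batch

    snap = cfg.get("snapshot") or "true"                        # default per T3433129
    if snap in TRUE_VALUES:
        out["snapshot"] = True
    elif snap in FALSE_VALUES:
        out["snapshot"] = False
    else:
        raise ValueError('"snapshot" config value must be a boolean literal')  # T3433130
    return out
```

For example, `validate_source_config({"db": "qa", "collection": "users"})` succeeds with all defaults applied, while omitting `"db"` or passing `"batchSize": "1.5"` raises the corresponding error, mirroring the Passed rows above.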