diff --git a/cloudsql-postgresql-plugin/src/e2e-test/features/sink/DesignTimeWithValidation.feature b/cloudsql-postgresql-plugin/src/e2e-test/features/sink/DesignTimeWithValidation.feature
index a1e74b079..7a0b62e5b 100644
--- a/cloudsql-postgresql-plugin/src/e2e-test/features/sink/DesignTimeWithValidation.feature
+++ b/cloudsql-postgresql-plugin/src/e2e-test/features/sink/DesignTimeWithValidation.feature
@@ -25,6 +25,7 @@ Feature: CloudSQL-PostgreSQL Sink - Verify CloudSQL-postgreSQL Sink Plugin Error
     Then Click on the Validate button
     Then Verify mandatory property error for below listed properties:
       | jdbcPluginName |
+      | connectionName |
       | referenceName |
       | database |
       | tableName |
@@ -141,3 +142,135 @@ Feature: CloudSQL-PostgreSQL Sink - Verify CloudSQL-postgreSQL Sink Plugin Error
     Then Replace input plugin property: "tableName" with value: "targetTable"
     Then Click on the Validate button
     Then Verify that the Plugin Property: "user" is displaying an in-line error message: "errorMessageBlankUsername"
+
+  @Sink_Required
+  Scenario: To verify CloudSQLPostgreSQL sink plugin validation error message with invalid private connection name
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Sink"
+    Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Select radio button plugin property: "instanceType" with value: "private"
+    Then Replace input plugin property: "connectionName" with value: "invalidConnectionName"
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "targetRef"
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Replace input plugin property: "tableName" with value: "targetTable"
+    Then Click on the Validate button
+    Then Verify that the Plugin Property: "connectionName" is displaying an in-line error message: "errorMessagePrivateConnectionName"
+
+  @CLOUDSQLPOSTGRESQL_SOURCE_TEST @CLOUDSQLPOSTGRESQL_TARGET_TEST @Sink_Required
+  Scenario: To verify CloudSQLPostgreSQL sink plugin validation error message with blank password
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Sink"
+    Then Connect plugins: "CloudSQL PostgreSQL" and "CloudSQL PostgreSQL2" to establish connection
+    Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Select radio button plugin property: "instanceType" with value: "public"
+    Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "sourceRef"
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
+    Then Click on the Get Schema button
+    Then Verify the Output Schema matches the Expected Schema: "datatypesSchema"
+    Then Validate "CloudSQL PostgreSQL" plugin properties
+    Then Close the Plugin Properties page
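+    # The sink configuration below deliberately omits the password so that validation surfaces the blank-password error on the header.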
+    Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL2"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Select radio button plugin property: "instanceType" with value: "public"
+    Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "targetRef"
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Replace input plugin property: "tableName" with value: "targetTable"
+    Then Click on the Validate button
+    Then Verify that the Plugin is displaying an error message: "errorMessageWithBlankPassword" on the header
+
+  @CLOUDSQLPOSTGRESQL_SOURCE_TEST @CLOUDSQLPOSTGRESQL_TARGET_TEST @Sink_Required
+  Scenario Outline: To verify CloudSQLPostgreSQL sink plugin validation error message for update, upsert operation name and table key
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Sink"
+    Then Connect plugins: "CloudSQL PostgreSQL" and "CloudSQL PostgreSQL2" to establish connection
+    Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Select radio button plugin property: "instanceType" with value: "public"
+    Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "sourceRef"
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
+    Then Click on the Get Schema button
+    Then Verify the Output Schema matches the Expected Schema: "datatypesSchema"
+    Then Validate "CloudSQL PostgreSQL" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL2"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Select radio button plugin property: "instanceType" with value: "public"
+    Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Replace input plugin property: "tableName" with value: "targetTable"
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "targetRef"
+    Then Replace input plugin property: "dbSchemaName" with value: "schema"
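+    # <options> is filled in from the Examples table; since no relation table key is provided, both fields are expected to show the in-line error.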
+    Then Select radio button plugin property: "operationName" with value: "<options>"
+    Then Click on the Validate button
+    Then Verify that the Plugin Property: "operationName" is displaying an in-line error message: "errorMessageUpdateUpsertOperationName"
+    Then Verify that the Plugin Property: "relationTableKey" is displaying an in-line error message: "errorMessageUpdateUpsertOperationName"
+    Examples:
+      | options |
+      | upsert |
+      | update |
+
+  @CLOUDSQLPOSTGRESQL_SOURCE_TEST @CLOUDSQLPOSTGRESQL_TARGET_TEST @Sink_Required
+  Scenario Outline: To verify pipeline preview fails with an invalid table key
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Sink"
+    Then Connect plugins: "CloudSQL PostgreSQL" and "CloudSQL PostgreSQL2" to establish connection
+    Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Select radio button plugin property: "instanceType" with value: "public"
+    Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "sourceRef"
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
+    Then Click on the Get Schema button
+    Then Verify the Output Schema matches the Expected Schema: "datatypesSchema"
+    Then Validate "CloudSQL PostgreSQL" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL2"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Select radio button plugin property: "instanceType" with value: "public"
+    Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Replace input plugin property: "tableName" with value: "targetTable"
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "targetRef"
+    Then Replace input plugin property: "dbSchemaName" with value: "schema"
+    Then Select radio button plugin property: "operationName" with value: "<options>"
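+    # The table key added below does not exist in the target table, so the pipeline is expected to fail at preview run rather than at validation.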
+    Then Click on the Add Button of the property: "relationTableKey" with value:
+      | invalidCloudSQLPostgreSQLTableKey |
+    Then Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Verify the preview of pipeline is "Failed"
+    Examples:
+      | options |
+      | upsert |
+      | update |
diff --git a/cloudsql-postgresql-plugin/src/e2e-test/features/sink/RunTime.feature b/cloudsql-postgresql-plugin/src/e2e-test/features/sink/RunTime.feature
index 5ba1b9fab..d0b2fab32 100644
--- a/cloudsql-postgresql-plugin/src/e2e-test/features/sink/RunTime.feature
+++ b/cloudsql-postgresql-plugin/src/e2e-test/features/sink/RunTime.feature
@@ -144,3 +144,104 @@ Feature: CloudSQL-PostgreSQL sink - Verify data transfer from BigQuery source to
     Then Open and capture logs
     Then Verify the pipeline status is "Succeeded"
     Then Validate the values of records transferred to target CloudPostgreSQL table is equal to the values from BigQuery table
+
+  @BQ_SOURCE_TEST @CLOUDSQLPOSTGRESQL_TEST_TABLE @Sink_Required
+  Scenario Outline: To verify data is getting transferred from BigQuery to CloudSQLPostgreSQL successfully using upsert, update operation with table key
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "CloudSQL PostgreSQL" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Verify the Output Schema matches the Expected Schema: "bqOutputMultipleDatatypesSchema"
+    Then Validate "BigQuery" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Select radio button plugin property: "instanceType" with value: "public"
+    Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Replace input plugin property: "tableName" with value: "targetTable"
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "targetRef"
+    Then Replace input plugin property: "dbSchemaName" with value: "schema"
+    Then Select radio button plugin property: "operationName" with value: "<options>"
+    Then Click on the Add Button of the property: "relationTableKey" with value:
+      | CloudSQLPostgreSQLTableKey |
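+    # For upsert and update operations, the relation table key names the column(s) used to match incoming records against existing rows in the target table.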
+    Then Validate "CloudSQL PostgreSQL" plugin properties
+    Then Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Verify the preview of pipeline is "success"
+    Then Click on the Preview Data link on the Sink plugin node: "CloudSQLPostgreSQL"
+    Then Close the preview data
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Validate the values of records transferred to target CloudPostgreSQL table is equal to the values from BigQuery table
+    Examples:
+      | options |
+      | upsert |
+      | update |
+
+  @CLOUDSQLPOSTGRESQL_SOURCE_TEST @CLOUDSQLPOSTGRESQL_TARGET_TEST @Sink_Required
+  Scenario Outline: To verify data is getting transferred from CloudSQLPostgreSQL source to CloudSQLPostgreSQL sink with different isolation levels
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Sink"
+    Then Connect plugins: "CloudSQL PostgreSQL" and "CloudSQL PostgreSQL2" to establish connection
+    Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Select radio button plugin property: "instanceType" with value: "public"
+    Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "sourceRef"
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
+    Then Click on the Get Schema button
+    Then Verify the Output Schema matches the Expected Schema: "datatypesSchema"
+    Then Validate "CloudSQL PostgreSQL" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL2"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Select radio button plugin property: "instanceType" with value: "public"
+    Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Replace input plugin property: "tableName" with value: "targetTable"
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "targetRef"
+    Then Replace input plugin property: "dbSchemaName" with value: "schema"
+    Then Select dropdown plugin property: "transactionIsolationLevel" with option value: "<TransactionIsolationLevels>"
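+    # <TransactionIsolationLevels> comes from the Examples table; it sets the isolation level the sink uses for its database transactions.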
+    Then Validate "CloudSQL PostgreSQL2" plugin properties
+    Then Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Verify the preview of pipeline is "success"
+    Then Click on the Preview Data link on the Sink plugin node: "CloudSQLPostgreSQL"
+    Then Close the preview data
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Validate the values of records transferred to target table is equal to the values from source table
+    Examples:
+      | TransactionIsolationLevels |
+      | TRANSACTION_REPEATABLE_READ |
+      | TRANSACTION_READ_UNCOMMITTED |
+      | TRANSACTION_READ_COMMITTED |
+      | TRANSACTION_SERIALIZABLE |
diff --git a/cloudsql-postgresql-plugin/src/e2e-test/features/sink/RunTimeMacro.feature b/cloudsql-postgresql-plugin/src/e2e-test/features/sink/RunTimeMacro.feature
index 336b8d85d..ba96f65d8 100644
--- a/cloudsql-postgresql-plugin/src/e2e-test/features/sink/RunTimeMacro.feature
+++ b/cloudsql-postgresql-plugin/src/e2e-test/features/sink/RunTimeMacro.feature
@@ -132,3 +132,60 @@ Feature: CloudSQL-PostgreSQL sink - Verify data transfer to PostgreSQL sink with
     Then Verify the pipeline status is "Succeeded"
     Then Close the pipeline logs
     Then Validate the values of records transferred to target CloudPostgreSQL table is equal to the values from BigQuery table
+
+  @BQ_SOURCE_TEST @CLOUDSQLPOSTGRESQL_TEST_TABLE @Sink_Required
+  Scenario: To verify data is getting transferred from BigQuery to CloudSQLPostgreSQL successfully when macro enabled
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "CloudSQL PostgreSQL" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Verify the Output Schema matches the Expected Schema: "bqOutputMultipleDatatypesSchema"
+    Then Validate "BigQuery" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Select radio button plugin property: "instanceType" with value: "public"
+    Then Click on the Macro button of Property: "connectionName" and set the value to: "cloudSQLPostgreSQLConnectionName"
+    Then Click on the Macro button of Property: "database" and set the value to: "cloudSQLPostgreSQLDatabaseName"
+    Then Replace input plugin property: "tableName" with value: "targetTable"
+    Then Replace input plugin property: "dbSchemaName" with value: "schema"
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Click on the Macro button of Property: "operationName" and set the value to: "CloudSQLPostgreSQLOperationName"
+    Then Click on the Macro button of Property: "relationTableKey" and set the value to: "CloudSQLPostgreSQLTableKey"
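+    # Macro-enabled properties are not resolved at design time; their values come from the runtime arguments entered before the preview and deployed runs below.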
+    Then Enter input plugin property: "referenceName" with value: "targetRef"
+    Then Validate "CloudSQL PostgreSQL" plugin properties
+    Then Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Enter runtime argument value "databaseName" for key "cloudSQLPostgreSQLDatabaseName"
+    Then Enter runtime argument value from environment variable "connectionName" for key "cloudSQLPostgreSQLConnectionName"
+    Then Enter runtime argument value "operationName" for key "CloudSQLPostgreSQLOperationName"
+    Then Enter runtime argument value "relationTableKey" for key "CloudSQLPostgreSQLTableKey"
+    Then Run the preview of pipeline with runtime arguments
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Enter runtime argument value "databaseName" for key "cloudSQLPostgreSQLDatabaseName"
+    Then Enter runtime argument value from environment variable "connectionName" for key "cloudSQLPostgreSQLConnectionName"
+    Then Enter runtime argument value "operationName" for key "CloudSQLPostgreSQLOperationName"
+    Then Enter runtime argument value "relationTableKey" for key "CloudSQLPostgreSQLTableKey"
+    Then Run the Pipeline in Runtime with runtime arguments
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Close the pipeline logs
+    Then Validate the values of records transferred to target CloudPostgreSQL table is equal to the values from BigQuery table
diff --git a/cloudsql-postgresql-plugin/src/e2e-test/resources/errorMessage.properties b/cloudsql-postgresql-plugin/src/e2e-test/resources/errorMessage.properties
index f780ab75f..7e9cd2337 100644
--- a/cloudsql-postgresql-plugin/src/e2e-test/resources/errorMessage.properties
+++ b/cloudsql-postgresql-plugin/src/e2e-test/resources/errorMessage.properties
@@ -21,3 +21,5 @@ errorMessageConnectionName=Connection Name must be in the format :