Merged
@@ -24,6 +24,7 @@ Feature: CloudSQL-postgreSQL source - Verify CloudSQL-postgreSQL source plugin
Then Click on the Validate button
Then Verify mandatory property error for below listed properties:
| jdbcPluginName |
| connectionName |
| database |
| referenceName |
| importQuery |
@@ -228,3 +229,19 @@ Feature: CloudSQL-postgreSQL source - Verify CloudSQL-postgreSQL source plugin
Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
Then Click on the Validate button
Then Verify that the Plugin is displaying an error message: "errorMessageInvalidPassword" on the header

Scenario: To verify CloudSQLPostgreSQL source plugin validation error message with invalid connection name for private instance
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Source"
Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Select radio button plugin property: "instanceType" with value: "private"
Then Replace input plugin property: "connectionName" with value: "invalidConnectionName"
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Enter input plugin property: "referenceName" with value: "sourceRef"
Then Replace input plugin property: "database" with value: "databaseName"
Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
Then Click on the Validate button
Then Verify that the Plugin Property: "connectionName" is displaying an in-line error message: "errorMessagePrivateConnectionName"
@@ -64,7 +64,7 @@ Feature: CloudSQL-PostGreSQL Source - Run Time scenarios
Then Validate the values of records transferred to target Big Query table is equal to the values from source table

@CLOUDSQLPOSTGRESQL_SOURCE_TEST @BQ_SINK_TEST @PLUGIN-1526
Scenario: To verify data is getting transferred from PostgreSQL source to BigQuery sink successfully when connection arguments are set
Scenario: To verify data is getting transferred from CloudSQLPostgreSQL source to BigQuery sink successfully when connection arguments are set
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Source"
@@ -188,11 +188,106 @@ Feature: CloudSQL-PostGreSQL Source - Run Time scenarios
Then Close the Plugin Properties page
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Wait till pipeline preview is in running state and check if any error occurs
Then Verify the preview run status of pipeline in the logs is "failed"

@CLOUDSQLPOSTGRESQL_SOURCE_TEST @CLOUDSQLPOSTGRESQL_TARGET_TEST
Scenario: To verify data is getting transferred from PostgreSQL to PostgreSQL successfully with supported datatypes
Scenario: To verify data is getting transferred from CloudSQLPostgreSQL to CloudSQLPostgreSQL successfully with supported datatypes
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Sink"
Then Connect plugins: "CloudSQL PostgreSQL" and "CloudSQL PostgreSQL2" to establish connection
Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Select radio button plugin property: "instanceType" with value: "public"
Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Enter input plugin property: "referenceName" with value: "sourceRef"
Then Replace input plugin property: "database" with value: "databaseName"
Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "datatypesSchema"
Then Validate "CloudSQL PostgreSQL" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL2"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Select radio button plugin property: "instanceType" with value: "public"
Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
Then Replace input plugin property: "database" with value: "databaseName"
Then Replace input plugin property: "tableName" with value: "targetTable"
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Enter input plugin property: "referenceName" with value: "targetRef"
Then Replace input plugin property: "dbSchemaName" with value: "schema"
Then Validate "CloudSQL PostgreSQL2" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Preview and run the pipeline
Then Verify the preview of pipeline is "success"
Then Click on the Preview Data link on the Sink plugin node: "CloudSQLPostgreSQL"
Then Close the preview data
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Validate the values of records transferred to target table is equal to the values from source table

@CLOUDSQLPOSTGRESQL_SOURCE_TEST @CLOUDSQLPOSTGRESQL_TARGET_TEST @CONNECTION @Source_Required
Scenario: To verify data is getting transferred from CloudSQLPostgreSQL to CloudSQLPostgreSQL successfully with use connection
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Sink"
Then Connect plugins: "CloudSQL PostgreSQL" and "CloudSQL PostgreSQL2" to establish connection
Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL"
And Click plugin property: "switch-useConnection"
And Click on the Browse Connections button
And Click on the Add Connection button
Then Click plugin property: "connector-CloudSQLPostgreSQL"
And Enter input plugin property: "name" with value: "connection.name"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Replace input plugin property: "database" with value: "databaseName"
Then Select radio button plugin property: "instanceType" with value: "public"
Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Click on the Test Connection button
And Verify the test connection is successful
Then Click on the Create button
Then Select connection: "connection.name"
Then Enter input plugin property: "referenceName" with value: "sourceRef"
Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "datatypesSchema"
Then Validate "CloudSQL PostgreSQL" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL2"
And Click plugin property: "switch-useConnection"
And Click on the Browse Connections button
Then Select connection: "connection.name"
Then Enter input plugin property: "referenceName" with value: "targetRef"
Then Replace input plugin property: "tableName" with value: "targetTable"
Then Replace input plugin property: "dbSchemaName" with value: "schema"
Then Validate "CloudSQL PostgreSQL2" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Preview and run the pipeline
Then Verify the preview of pipeline is "success"
And Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Validate the values of records transferred to target table is equal to the values from source table

@CLOUDSQLPOSTGRESQL_SOURCE_TEST @CLOUDSQLPOSTGRESQL_TARGET_TEST @Source_Required
Scenario: To verify data is getting transferred from CloudSQLPostgreSQL to CloudSQLPostgreSQL successfully with bounding query
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Source"
@@ -206,6 +301,7 @@ Feature: CloudSQL-PostGreSQL Source - Run Time scenarios
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Enter input plugin property: "referenceName" with value: "sourceRef"
Then Enter textarea plugin property: "boundingQuery" with value: "boundingQuery"
Then Replace input plugin property: "database" with value: "databaseName"
Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
Then Click on the Get Schema button
@@ -332,3 +332,66 @@ Feature: CloudSQL-PostGreSQL source - Verify CloudSQL-PostGreSQL plugin data tra
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate the values of records transferred to target Big Query table is equal to the values from source table

@CLOUDSQLPOSTGRESQL_SOURCE_TEST @CLOUDSQLPOSTGRESQL_TARGET_TEST @Source_Required
Scenario: To verify data is getting transferred from CloudSQLPostgreSQL to CloudSQLPostgreSQL successfully when macro enabled
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "CloudSQL PostgreSQL" from the plugins list as: "Sink"
Then Connect plugins: "CloudSQL PostgreSQL" and "CloudSQL PostgreSQL2" to establish connection
Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Select radio button plugin property: "instanceType" with value: "public"
Then Click on the Macro button of Property: "connectionName" and set the value to: "cloudSQLPostgreSQLConnectionName"
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Click on the Macro button of Property: "connectionArguments" and set the value to: "connArgumentsSource"
Then Click on the Macro button of Property: "database" and set the value to: "cloudSQLPostgreSQLdatabaseName"
Then Click on the Macro button of Property: "importQuery" and set the value in textarea: "cloudSQLPostgreSQLImportQuery"
Then Enter input plugin property: "referenceName" with value: "sourceRef"
Then Click on the Macro button of Property: "boundingQuery" and set the value in textarea: "cloudSQLPostgreSQLBoundingQuery"
Then Validate "CloudSQL PostgreSQL" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "CloudSQL PostgreSQL2"
Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
Then Select radio button plugin property: "instanceType" with value: "public"
Then Replace input plugin property: "connectionName" with value: "connectionName" for Credentials and Authorization related fields
Then Replace input plugin property: "database" with value: "databaseName"
Then Replace input plugin property: "tableName" with value: "targetTable"
Then Replace input plugin property: "dbSchemaName" with value: "schema"
Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
Then Click on the Macro button of Property: "connectionArguments" and set the value to: "connArgumentsSink"
Then Enter input plugin property: "referenceName" with value: "targetRef"
Then Validate "CloudSQL PostgreSQL2" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Preview and run the pipeline
Then Enter runtime argument value "connectionArguments" for key "connArgumentsSource"
Then Enter runtime argument value "connectionArguments" for key "connArgumentsSink"
Then Enter runtime argument value "selectQuery" for key "cloudSQLPostgreSQLImportQuery"
Then Enter runtime argument value "databaseName" for key "cloudSQLPostgreSQLdatabaseName"
Then Enter runtime argument value "boundingQuery" for key "cloudSQLPostgreSQLBoundingQuery"
Then Enter runtime argument value from environment variable "connectionName" for key "cloudSQLPostgreSQLConnectionName"
Then Run the preview of pipeline with runtime arguments
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Enter runtime argument value "connectionArguments" for key "connArgumentsSource"
Then Enter runtime argument value "connectionArguments" for key "connArgumentsSink"
Then Enter runtime argument value "selectQuery" for key "cloudSQLPostgreSQLImportQuery"
Then Enter runtime argument value "databaseName" for key "cloudSQLPostgreSQLdatabaseName"
Then Enter runtime argument value "boundingQuery" for key "cloudSQLPostgreSQLBoundingQuery"
Then Enter runtime argument value from environment variable "connectionName" for key "cloudSQLPostgreSQLConnectionName"
Then Run the Pipeline in Runtime with runtime arguments
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate the values of records transferred to target table is equal to the values from source table
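
The macro-enabled scenario above works because properties set via the Macro button are stored as `${key}` tokens and are only substituted from the runtime arguments when the preview or deployed run starts. A toy resolver illustrating that substitution, assuming simple non-nested `${...}` macros (this is an illustrative sketch, not CDAP's actual macro evaluator):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MacroDemo {
    private static final Pattern MACRO = Pattern.compile("\\$\\{([^}]+)}");

    // Replace every ${key} token in a property value with the matching
    // runtime argument; fail fast if an argument is missing.
    static String resolve(String property, Map<String, String> runtimeArgs) {
        Matcher m = MACRO.matcher(property);
        StringBuilder out = new StringBuilder();
        while (m.find()) {
            String value = runtimeArgs.get(m.group(1));
            if (value == null) {
                throw new IllegalArgumentException("No runtime argument for macro: " + m.group(1));
            }
            // quoteReplacement keeps literal $ and \ in values (e.g. "$CONDITIONS") intact
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        String resolved = resolve("${cloudSQLPostgreSQLImportQuery}",
            Map.of("cloudSQLPostgreSQLImportQuery", "select * from s.t WHERE $CONDITIONS"));
        System.out.println(resolved);
    }
}
```

This is why the scenario enters the same argument values twice: once for the preview run and once for the deployed run, since each run resolves its macros independently.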
@@ -17,6 +17,8 @@
package io.cdap.plugin.common.stepsdesign;

import com.google.cloud.bigquery.BigQueryException;
import io.cdap.e2e.pages.actions.CdfConnectionActions;
import io.cdap.e2e.pages.actions.CdfPluginPropertiesActions;
import io.cdap.e2e.utils.BigQueryClient;
import io.cdap.e2e.utils.PluginPropertyUtils;
import io.cdap.plugin.CloudSqlPostgreSqlClient;
@@ -49,8 +51,10 @@ public static void setTableName() {
PluginPropertyUtils.addPluginProp("sourceTable", sourceTableName);
PluginPropertyUtils.addPluginProp("targetTable", targetTableName);
String schema = PluginPropertyUtils.pluginProp("schema");
PluginPropertyUtils.addPluginProp("selectQuery",
String.format("select * from %s.%s", schema, sourceTableName));
PluginPropertyUtils.addPluginProp("selectQuery", String.format("select * from %s.%s"
+ " WHERE $CONDITIONS", schema, sourceTableName));
PluginPropertyUtils.addPluginProp("boundingQuery", String.format("select MIN(id),MAX(id)"
+ " from %s.%s", schema, sourceTableName));
}
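
The new `WHERE $CONDITIONS` clause in `selectQuery`, together with a `boundingQuery` returning `MIN(id)` and `MAX(id)`, follows the Sqoop-style split pattern used by JDBC batch sources: the bounding query yields the range of the split column, and `$CONDITIONS` is replaced with a per-split range predicate. A minimal sketch of that substitution, assuming a numeric split column (class and method names here are illustrative, not the plugin's internals):

```java
import java.util.ArrayList;
import java.util.List;

public class ConditionSplitter {
    // Given bounding-query results (min, max) and a split count, produce
    // one concrete query per split by replacing the $CONDITIONS token.
    static List<String> buildSplitQueries(String importQuery, String splitColumn,
                                          long min, long max, int numSplits) {
        List<String> queries = new ArrayList<>();
        long range = max - min + 1;
        long chunk = (range + numSplits - 1) / numSplits; // ceiling division
        for (int i = 0; i < numSplits; i++) {
            long lo = min + (long) i * chunk;
            if (lo > max) {
                break; // fewer splits than requested when the range is small
            }
            long hi = Math.min(lo + chunk - 1, max);
            String predicate = splitColumn + " >= " + lo + " AND " + splitColumn + " <= " + hi;
            queries.add(importQuery.replace("$CONDITIONS", predicate));
        }
        return queries;
    }

    public static void main(String[] args) {
        buildSplitQueries("select * from schema.table WHERE $CONDITIONS", "id", 1, 100, 4)
            .forEach(System.out::println);
    }
}
```

For ids 1..100 and four splits, the first generated query ends in `WHERE id >= 1 AND id <= 25`; without `$CONDITIONS` in the import query, every split would read the full table.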

@Before(order = 2, value = "@CLOUDSQLPOSTGRESQL_SOURCE_TEST")
@@ -177,4 +181,25 @@ private static void createSourceBQTableWithQueries(String bqCreateTableQueryFile
PluginPropertyUtils.addPluginProp("bqSourceTable", bqSourceTable);
BeforeActions.scenario.write("BQ Source Table " + bqSourceTable + " created successfully");
}

@Before(order = 1, value = "@CONNECTION")
public static void setNewConnectionName() {
String connectionName = "CloudSQLPostgreSQL" + RandomStringUtils.randomAlphanumeric(10);
PluginPropertyUtils.addPluginProp("connection.name", connectionName);
BeforeActions.scenario.write("New Connection name: " + connectionName);
}

private static void deleteConnection(String connectionType, String connectionName) throws IOException {
CdfConnectionActions.openWranglerConnectionsPage();
CdfConnectionActions.expandConnections(connectionType);
CdfConnectionActions.openConnectionActionMenu(connectionType, connectionName);
CdfConnectionActions.selectConnectionAction(connectionType, connectionName, "Delete");
CdfPluginPropertiesActions.clickPluginPropertyButton("Delete");
}

@After(order = 1, value = "@CONNECTION")
public static void deleteTestConnection() throws IOException {
deleteConnection("CloudSQLPostgreSQL", "connection.name");
PluginPropertyUtils.removePluginProp("connection.name");
}
}