Commit 4f077f0

addition of new sysadmin scenarios
1 parent d1d9ff4 commit 4f077f0

8 files changed

Lines changed: 322 additions & 0 deletions


Lines changed: 162 additions & 0 deletions
@@ -0,0 +1,162 @@
#
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
#

@Sysadmin
Feature: Sysadmin - Validate system admin page runtime scenarios

  @Sysadmin
  Scenario: To verify user should be able to create a Namespace successfully in System Admin
    Given Open Datafusion Project to configure pipeline
    When Open "System Admin" menu
    Then Click on the Configuration link on the System admin page
    Then Click on Create New Namespace button
    Then Enter the New Namespace Name with value: "namespaceName"
    Then Enter the Namespace Description with value: "validNamespaceDescription"
    Then Click on: "Finish" button in the properties
    Then Verify the namespace created success message displayed on confirmation window
    Then Verify the created namespace: "namespaceName" is displayed in Namespace tab

  @SysAdminRequired
  Scenario: To verify user should be able to add a secure key from Make HTTP Calls successfully with a PUT call
    Given Open Datafusion Project to configure pipeline
    When Open "System Admin" menu
    Then Click on the Configuration link on the System admin page
    Then Click on Make HTTP calls from the System admin configuration page
    Then Select request dropdown property with option value: "httpPutMethod"
    Then Enter input plugin property: "requestPath" with value: "secureKey"
    Then Enter textarea plugin property: "requestBody" with value: "bodyValue"
    Then Click on send button
    Then Verify the status code for success response

  @SysAdminRequired
  Scenario: To verify user should be able to fetch a secure key from Make HTTP Calls successfully with a GET call
    Given Open Datafusion Project to configure pipeline
    When Open "System Admin" menu
    Then Click on the Configuration link on the System admin page
    Then Click on Make HTTP calls from the System admin configuration page
    Then Select request dropdown property with option value: "httpGetMethod"
    Then Enter input plugin property: "requestPath" with value: "secureKey"
    Then Click on send button
    Then Verify the status code for success response

  @SysAdminRequired
  Scenario: To verify user should be able to delete a secure key from Make HTTP Calls successfully with a DELETE call
    Given Open Datafusion Project to configure pipeline
    When Open "System Admin" menu
    Then Click on the Configuration link on the System admin page
    Then Click on Make HTTP calls from the System admin configuration page
    Then Select request dropdown property with option value: "httpDeleteMethod"
    Then Enter input plugin property: "requestPath" with value: "secureKey"
    Then Click on send button
    Then Verify the status code for success response

  @BQ_SOURCE_TEST @BQ_SINK_TEST @SysAdminRequired
  Scenario: To verify user should be able to run a pipeline successfully using the System preferences created
    Given Open Datafusion Project to configure pipeline
    When Open "System Admin" menu
    Then Click on the Configuration link on the System admin page
    Then Select "systemPreferences" option from Configuration page
    Then Click on edit system preferences
    Then Set system preferences with key: "keyValue" and value: "systemPreferences2"
    Then Click on the Save & Close preferences button
    Then Click on the Hamburger menu on the left panel
    Then Select navigation item: "studio" from the Hamburger menu list
    When Select plugin: "BigQuery" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
    Then Navigate to the properties page of plugin: "BigQuery"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Click on the Macro button of Property: "projectId" and set the value to: "projectId"
    Then Click on the Macro button of Property: "datasetProjectId" and set the value to: "datasetProjectId"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqSourceTable"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Click on the Macro button of Property: "projectId" and set the value to: "projectId"
    Then Click on the Macro button of Property: "datasetProjectId" and set the value to: "datasetProjectId"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqSourceTable"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"

  @BQ_SOURCE_TEST @BQ_SINK_TEST @SysAdminRequired
  Scenario: To verify user should be able to run a pipeline successfully using the Namespace preferences created
    Given Open Datafusion Project to configure pipeline
    When Open "System Admin" menu
    Then Click on the Configuration link on the System admin page
    Then Click on Create New Namespace button
    Then Enter the New Namespace Name with value: "sampleNamespaceName"
    Then Enter the Namespace Description with value: "validNamespaceDescription"
    Then Click on: "Finish" button in the properties
    Then Verify the namespace created success message displayed on confirmation window
    Then Click on the switch to namespace button
    Then Click on the Hamburger menu on the left panel
    Then Select navigation item: "namespaceAdmin" from the Hamburger menu list
    Then Click "preferences" tab from Configuration page for "sampleNamespaceName" Namespace
    Then Click on edit namespace preferences to set namespace preferences
    Then Set system preferences with key: "keyValue" and value: "systemPreferences1"
    Then Click on the Save & Close preferences button
    Then Click on the Hamburger menu on the left panel
    Then Select navigation item: "studio" from the Hamburger menu list
    When Select plugin: "BigQuery" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
    Then Navigate to the properties page of plugin: "BigQuery"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Click on the Macro button of Property: "projectId" and set the value to: "projectId"
    Then Click on the Macro button of Property: "datasetProjectId" and set the value to: "datasetProjectId"
    Then Click on the Macro button of Property: "dataset" and set the value to: "dataset"
    Then Enter input plugin property: "table" with value: "bqSourceTable"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Click on the Macro button of Property: "projectId" and set the value to: "projectId"
    Then Click on the Macro button of Property: "datasetProjectId" and set the value to: "datasetProjectId"
    Then Click on the Macro button of Property: "dataset" and set the value to: "dataset"
    Then Enter input plugin property: "table" with value: "bqSourceTable"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    When Open "System Admin" menu
    Then Click on the Configuration link on the System admin page
    Then Select "systemPreferences" option from Configuration page
    Then Click on edit system preferences
    Then Delete the preferences
    Then Delete the preferences
    Then Click on the Save & Close preferences button
    Then Click on the Hamburger menu on the left panel
    Then Select navigation item: "namespaceAdmin" from the Hamburger menu list
    Then Click "preferences" tab from Configuration page for "sampleNamespaceName" Namespace
    Then Click on edit namespace preferences to set namespace preferences
    Then Delete the preferences
    Then Click on the Save & Close preferences button
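The three secure-key scenarios above exercise CDAP's secure-keys REST endpoint through the "Make HTTP calls" page, using the `secureKey` request path and `bodyValue` payload defined in pluginParameters.properties. As a rough sketch of the equivalent requests built outside the UI (the base URL, port, and key name here are assumptions for illustration, not values taken from this commit):

```java
import java.net.URI;
import java.net.http.HttpRequest;

/** Sketch of the secure-key HTTP calls the scenarios above issue via the UI. */
public class SecureKeyRequests {

  // Assumed local CDAP router address; adjust for your environment.
  private static final String BASE = "http://localhost:11015/v3/";
  // Matches the secureKey entry in pluginParameters.properties.
  private static final String SECURE_KEY_PATH = "namespaces/default/securekeys/mytestkey";

  /** PUT creates the key; the body corresponds to the bodyValue property. */
  public static HttpRequest putKey(String jsonBody) {
    return HttpRequest.newBuilder(URI.create(BASE + SECURE_KEY_PATH))
        .header("Content-Type", "application/json")
        .PUT(HttpRequest.BodyPublishers.ofString(jsonBody))
        .build();
  }

  /** GET fetches the stored key data. */
  public static HttpRequest getKey() {
    return HttpRequest.newBuilder(URI.create(BASE + SECURE_KEY_PATH)).GET().build();
  }

  /** DELETE removes the key. */
  public static HttpRequest deleteKey() {
    return HttpRequest.newBuilder(URI.create(BASE + SECURE_KEY_PATH)).DELETE().build();
  }
}
```

Each scenario then checks only the response status code, mirroring the "Verify the status code for success response" step.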
Lines changed: 36 additions & 0 deletions
@@ -0,0 +1,36 @@
1+
/*
2+
* Copyright © 2023 Cask Data, Inc.
3+
*
4+
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
5+
* use this file except in compliance with the License. You may obtain a copy of
6+
* the License at
7+
*
8+
* http://www.apache.org/licenses/LICENSE-2.0
9+
*
10+
* Unless required by applicable law or agreed to in writing, software
11+
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
12+
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
13+
* License for the specific language governing permissions and limitations under
14+
* the License.
15+
*/
16+
package io.cdap.cdap.common.common;
17+
18+
import io.cucumber.junit.Cucumber;
19+
import io.cucumber.junit.CucumberOptions;
20+
import org.junit.runner.RunWith;
21+
22+
/**
23+
* Test Runner to execute required test cases. Add @SysAdminRequired tag on the scenario.
24+
*/
25+
@RunWith(Cucumber.class)
26+
@CucumberOptions(
27+
features = {"src/e2e-test/features"},
28+
glue = {"stepsdesign", "io.cdap.cdap.common.stepsdesign"},
29+
tags = {"@SysAdminRequired"},
30+
plugin = {"pretty", "html:target/cucumber-html-report/required",
31+
"json:target/cucumber-reports/cucumber-required.json",
32+
"junit:target/cucumber-reports/cucumber-required.xml"},
33+
monochrome = true
34+
)
35+
public class TestRunnerRequired {
36+
}
Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@
1+
/**
2+
* Package contains the common runners.
3+
*/
4+
package io.cdap.cdap.common.common;
Lines changed: 94 additions & 0 deletions
@@ -0,0 +1,94 @@
1+
/*
2+
* Copyright © 2023 Cask Data, Inc.
3+
*
4+
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
5+
* use this file except in compliance with the License. You may obtain a copy of
6+
* the License at
7+
*
8+
* http://www.apache.org/licenses/LICENSE-2.0
9+
*
10+
* Unless required by applicable law or agreed to in writing, software
11+
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
12+
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
13+
* License for the specific language governing permissions and limitations under
14+
* the License.
15+
*/
16+
package io.cdap.cdap.common.stepsdesign;
17+
18+
import com.google.cloud.bigquery.BigQueryException;
19+
import io.cdap.e2e.utils.BigQueryClient;
20+
import io.cdap.e2e.utils.PluginPropertyUtils;
21+
import io.cucumber.java.After;
22+
import io.cucumber.java.Before;
23+
import java.io.IOException;
24+
import java.util.UUID;
25+
import org.apache.commons.lang3.StringUtils;
26+
import org.junit.Assert;
27+
import stepsdesign.BeforeActions;
28+
29+
/**
30+
* GCP test hooks.
31+
*/
32+
public class TestSetupHooks {
33+
34+
public static String bqTargetTable = StringUtils.EMPTY;
35+
public static String bqSourceTable = StringUtils.EMPTY;
36+
public static String datasetName = PluginPropertyUtils.pluginProp("dataset");
37+
38+
@Before(order = 1, value = "@BQ_SINK_TEST")
39+
public static void setTempTargetBQTableName() {
40+
bqTargetTable = "E2E_TARGET_" + UUID.randomUUID().toString().replaceAll("-", "_");
41+
PluginPropertyUtils.addPluginProp("bqTargetTable", bqTargetTable);
42+
BeforeActions.scenario.write("BQ Target table name - " + bqTargetTable);
43+
}
44+
45+
@After(order = 1, value = "@BQ_SINK_TEST")
46+
public static void deleteTempTargetBQTable() throws IOException, InterruptedException {
47+
try {
48+
BigQueryClient.dropBqQuery(bqTargetTable);
49+
PluginPropertyUtils.removePluginProp("bqTargetTable");
50+
BeforeActions.scenario.write("BQ Target table - " + bqTargetTable + " deleted successfully");
51+
bqTargetTable = StringUtils.EMPTY;
52+
} catch (BigQueryException e) {
53+
if (e.getMessage().contains("Not found: Table")) {
54+
BeforeActions.scenario.write("BQ Target Table " + bqTargetTable + " does not exist");
55+
} else {
56+
Assert.fail(e.getMessage());
57+
}
58+
}
59+
}
60+
61+
/**
62+
* Create BigQuery table with 3 columns (Id - Int, Value - Int, UID - string) containing random testdata.
63+
* Sample row:
64+
* Id | Value | UID
65+
* 22 | 968 | 245308db-6088-4db2-a933-f0eea650846a
66+
*/
67+
@Before(order = 1, value = "@BQ_SOURCE_TEST")
68+
public static void createTempSourceBQTable() throws IOException, InterruptedException {
69+
bqSourceTable = "E2E_SOURCE_" + UUID.randomUUID().toString().replaceAll("-", "_");
70+
StringBuilder records = new StringBuilder(StringUtils.EMPTY);
71+
for (int index = 2; index <= 25; index++) {
72+
records.append(" (").append(index).append(", ").append((int) (Math.random() * 1000 + 1)).append(", '")
73+
.append(UUID.randomUUID()).append("'), ");
74+
}
75+
BigQueryClient.getSoleQueryResult("create table `" + datasetName + "." + bqSourceTable + "` as " +
76+
"SELECT * FROM UNNEST([ " +
77+
" STRUCT(1 AS Id, " + ((int) (Math.random() * 1000 + 1)) + " as Value, " +
78+
"'" + UUID.randomUUID() + "' as UID), " +
79+
records +
80+
" (26, " + ((int) (Math.random() * 1000 + 1)) + ", " +
81+
"'" + UUID.randomUUID() + "') " +
82+
"])");
83+
PluginPropertyUtils.addPluginProp("bqSourceTable", bqSourceTable);
84+
BeforeActions.scenario.write("BQ source Table " + bqSourceTable + " created successfully");
85+
}
86+
87+
@After(order = 1, value = "@BQ_SOURCE_TEST")
88+
public static void deleteTempSourceBQTable() throws IOException, InterruptedException {
89+
BigQueryClient.dropBqQuery(bqSourceTable);
90+
PluginPropertyUtils.removePluginProp("bqSourceTable");
91+
BeforeActions.scenario.write("BQ source Table " + bqSourceTable + " deleted successfully");
92+
bqSourceTable = StringUtils.EMPTY;
93+
}
94+
}
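The createTempSourceBQTable hook above builds a `create table ... as SELECT * FROM UNNEST([...])` statement by string concatenation, with the first `STRUCT(...)` row supplying the column aliases. The same construction can be factored into a pure helper, sketched below; `buildCreateTableSql` is an illustrative name, not part of this commit:

```java
import java.util.UUID;

/** Illustrative refactor of the inline SQL building in createTempSourceBQTable. */
public class SourceTableSql {

  /** Builds the CREATE TABLE AS SELECT statement with rows Id 1..rowCount. */
  public static String buildCreateTableSql(String dataset, String table, int rowCount) {
    StringBuilder sql = new StringBuilder("create table `")
        .append(dataset).append('.').append(table).append("` as SELECT * FROM UNNEST([ ");
    for (int id = 1; id <= rowCount; id++) {
      int value = (int) (Math.random() * 1000 + 1);
      if (id == 1) {
        // The first row carries the aliases that name the table's columns.
        sql.append("STRUCT(").append(id).append(" AS Id, ").append(value)
           .append(" as Value, '").append(UUID.randomUUID()).append("' as UID)");
      } else {
        // Subsequent rows are positional tuples matching the STRUCT's shape.
        sql.append(", (").append(id).append(", ").append(value).append(", '")
           .append(UUID.randomUUID()).append("')");
      }
    }
    return sql.append(" ])").toString();
  }
}
```

A helper like this keeps the random-value and UUID generation in one place and makes the generated SQL easy to assert on in isolation, without a BigQuery connection.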
Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@
1+
/**
2+
* Package contains the stepDesign for the common features.
3+
*/
4+
package io.cdap.cdap.common.stepsdesign;
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
validationResetSuccessMessage=Reset Successful
errorInvalidClusterName=Unable to get credentials from the environment. Please explicitly set the account key.
errorInvalidProfileName=Invalid profile ID: 6*&gjh879. Should only contain alphanumeric characters and _ or -.
errorInvalidNamespace=Failed to Add namespace
validationSuccessMessage=No errors found.
Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
requestMethod=request-method-selector
requestPath=request-path-input
requestBody=request-body
studio=pipeline-studio
namespaceAdmin=project-admin
projectId=project
datasetProjectId=datasetProject

cdap-e2e-tests/src/e2e-test/resources/pluginParameters.properties

Lines changed: 10 additions & 0 deletions
@@ -3,3 +3,13 @@ clientUrl=http://localhost:11011
serverUrl=https://placeholder.com/api
# command to generate token: gcloud auth print-access-token
serverAccessToken=placeholder

namespaceName=TestNamespace
sampleNamespaceName=sampleNamespace
secureKey=namespaces/default/securekeys/mytestkey
bodyValue={ "description": "Example Secure Key","data": "test123","properties": { "<property-key>": "<property-value>" } }
httpPutMethod=PUT
httpGetMethod=GET
httpDeleteMethod=DELETE
projectId=cdf-athena
dataset=test_automation
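Steps such as `Enter input plugin property: "requestPath" with value: "secureKey"` take property *keys*, which the framework resolves against this file at runtime. A minimal sketch of that indirection using `java.util.Properties` in place of the framework's `PluginPropertyUtils` (the helper names and the fallback-to-key behavior are assumptions of this sketch, not documented framework behavior):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

/** Minimal sketch of how quoted step arguments resolve through pluginParameters.properties. */
public class PropertyLookup {

  /** Parses properties-file content from a string (a file Reader works the same way). */
  public static Properties load(String content) throws IOException {
    Properties props = new Properties();
    props.load(new StringReader(content));
    return props;
  }

  /** Returns the value for key; falls back to the key itself when no entry exists. */
  public static String resolveProp(Properties props, String key) {
    return props.getProperty(key, key);
  }
}
```

So the PUT scenario's `"secureKey"` argument becomes the REST path `namespaces/default/securekeys/mytestkey`, and `"httpPutMethod"` becomes the literal method `PUT`.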
