Commit d0ace8f

feat(integ-runner): test update path when running tests (#19915)
This PR adds an "update path" workflow to the integration test workflow. When a change is being made to an existing integration test we want to be able to ensure that the change not only works for _new_ stacks, but also with _existing_ stacks. To accomplish this, the runner will first deploy the existing snapshot (i.e. `cdk deploy --app /path/to/snapshot`) and then perform a stack update by deploying the integration test. The update path can be disabled by passing the `--disable-update-workflow` command line option.

This PR adds a couple of pieces of functionality in order to enable testing the "update path":

1. The runner will attempt to check out the snapshot from the origin before running the test. This is to make sure that you are actually testing an update of the existing snapshot and not an intermediate snapshot (e.g. if you have been iterating locally).
2. When the runner performs the snapshot diff, it will check for any destructive changes and return that information as a warning to the user.
3. If any destructive changes are found, the runner will record that information as trace metadata in the `manifest.json` file. This will enable us to create automation that checks for this as part of the PR workflow.

Logic was also added to honor the `allowDestroy` option in items 2 & 3 above: if a resource type is included in the `allowDestroy` list, it will not be reported.

----

### All Submissions:

* [x] Have you followed the guidelines in our [Contributing guide?](https://github.com/aws/aws-cdk/blob/master/CONTRIBUTING.md)

### Adding new Unconventional Dependencies:

* [ ] This PR adds new unconventional dependencies following the process described [here](https://github.com/aws/aws-cdk/blob/master/CONTRIBUTING.md/#adding-new-unconventional-dependencies)

### New Features

* [ ] Have you added the new feature to an [integration test](https://github.com/aws/aws-cdk/blob/master/INTEGRATION_TESTS.md)?
* [ ] Did you use `yarn integ` to deploy the infrastructure and generate the snapshot (i.e. `yarn integ` without `--dry-run`)?

*By submitting this pull request, I confirm that my contribution is made under the terms of the Apache-2.0 license*
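For readers skimming the diff, a minimal sketch of the update path described above follows. The helper import path, the function name, and the snapshot/test locations are assumptions for illustration, not the runner's actual API.

```ts
// Rough sketch of the update path, assuming the exec() helper added in this PR
// (its import path is a guess) and placeholder app locations.
import { exec } from './utils';

function runUpdatePath(snapshotDir: string, testAppCommand: string): void {
  // 1. Deploy the checked-in snapshot so the "old" stacks exist first.
  exec(['cdk', 'deploy', '--app', snapshotDir, '--require-approval', 'never']);
  // 2. Deploy the modified integration test, which performs a stack update.
  exec(['cdk', 'deploy', '--app', testAppCommand, '--require-approval', 'never']);
}
```

When `--disable-update-workflow` is passed, only the second deployment would happen.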
1 parent 423d72f commit d0ace8f

18 files changed (+1019, −258 lines)


packages/@aws-cdk/integ-runner/README.md

+33
@@ -60,6 +60,8 @@ to be a self contained CDK app. The runner will execute the following for each f
   If this is set to `true` then the list of tests provided will be excluded
 - `--from-file`
   Read the list of tests from this file
+- `--disable-update-workflow` (default=`false`)
+  If this is set to `true` then the [update workflow](#update-workflow) will be disabled

 Example:

@@ -150,6 +152,37 @@ Test Results:
 Tests: 1 passed, 1 total
 ```

+#### Update Workflow
+
+By default, integration tests are run with the "update workflow" enabled. This can be disabled by using the `--disable-update-workflow` command line option.
+
+If an existing snapshot is being updated, the integration test runner will first deploy the existing snapshot and then perform a stack update
+with the new changes. This is to test for cases where an update would cause a breaking change only on a stack update.
+
+The `integ-runner` will also attempt to warn you if you are making any destructive changes with a message like:
+
+```bash
+!!! This test contains destructive changes !!!
+Stack: aws-cdk-lambda-1 - Resource: MyLambdaServiceRole4539ECB6 - Impact: WILL_DESTROY
+Stack: aws-cdk-lambda-1 - Resource: AliasAliasPermissionAF30F9E8 - Impact: WILL_REPLACE
+Stack: aws-cdk-lambda-1 - Resource: AliasFunctionUrlDC6EC566 - Impact: WILL_REPLACE
+Stack: aws-cdk-lambda-1 - Resource: Aliasinvokefunctionurl4CA9917B - Impact: WILL_REPLACE
+!!! If these destructive changes are necessary, please indicate this on the PR !!!
+```
+
+If the destructive changes are expected (and required) then please indicate this on your PR.
+
+##### New tests
+
+If you are adding a new test which creates a new snapshot then you should run that specific test with `--disable-update-workflow`.
+For example, if you are working on a new test `integ.new-test.js` then you would run:
+
+```bash
+yarn integ --update-on-failed --disable-update-workflow integ.new-test.js
+```
+
+This is because for a new test we do not need to test the update workflow (there is nothing to update).
+
 ### integ.json schema

 See [@aws-cdk/cloud-assembly-schema/lib/integ-tests/schema.ts](../cloud-assembly-schema/lib/integ-tests/schema.ts)
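As background for the warning shown in the README section above, here is a hedged sketch of how destructive changes could be derived from a template diff. It assumes the `fullDiff` and `ResourceImpact` exports of `@aws-cdk/cloudformation-diff`; the actual integ-runner implementation may differ.

```ts
import { fullDiff, ResourceImpact } from '@aws-cdk/cloudformation-diff';

// Sketch: compare the snapshot template against the newly synthesized template
// and report resources whose update would destroy or replace them.
function findDestructiveChanges(stackName: string, oldTemplate: any, newTemplate: any) {
  const destructive: Array<{ stackName: string; logicalId: string; impact: ResourceImpact }> = [];
  const diff = fullDiff(oldTemplate, newTemplate);
  diff.resources.forEachDifference((logicalId, change) => {
    if (change.changeImpact === ResourceImpact.WILL_DESTROY || change.changeImpact === ResourceImpact.WILL_REPLACE) {
      destructive.push({ stackName, logicalId, impact: change.changeImpact });
    }
  });
  return destructive;
}
```

Per the PR description, resource types included in the test's `allowDestroy` list would additionally be filtered out of this report.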

packages/@aws-cdk/integ-runner/lib/cli.ts

+44 −8
@@ -1,9 +1,10 @@
 // Exercise all integ stacks and if they deploy, update the expected synth files
 import * as path from 'path';
+import * as chalk from 'chalk';
 import * as workerpool from 'workerpool';
 import * as logger from './logger';
 import { IntegrationTests, IntegTestConfig } from './runner/integ-tests';
-import { runSnapshotTests, runIntegrationTests, IntegRunnerMetrics } from './workers';
+import { runSnapshotTests, runIntegrationTests, IntegRunnerMetrics, IntegTestWorkerConfig, DestructiveChange } from './workers';

 // https://github.com/yargs/yargs/issues/1929
 // https://github.com/evanw/esbuild/issues/1492
@@ -27,14 +28,16 @@ async function main() {
     .options('max-workers', { type: 'number', desc: 'The max number of workerpool workers to use when running integration tests in parallel', default: 16 })
     .options('exclude', { type: 'boolean', desc: 'All tests should be run, except for the list of tests provided', default: false })
     .options('from-file', { type: 'string', desc: 'Import tests to include or exclude from a file' })
+    .option('disable-update-workflow', { type: 'boolean', default: false, desc: 'If this is "true" then the stack update workflow will be disabled' })
     .argv;

   const pool = workerpool.pool(path.join(__dirname, '../lib/workers/extract/index.js'), {
     maxWorkers: argv['max-workers'],
   });

   // list of integration tests that will be executed
-  const testsToRun: IntegTestConfig[] = [];
+  const testsToRun: IntegTestWorkerConfig[] = [];
+  const destructiveChanges: DestructiveChange[] = [];
   const testsFromArgs: IntegTestConfig[] = [];
   const parallelRegions = arrayFromYargs(argv['parallel-regions']);
   const testRegions: string[] = parallelRegions ?? ['us-east-1', 'us-east-2', 'us-west-2'];
@@ -43,7 +46,7 @@ async function main() {
   const fromFile: string | undefined = argv['from-file'];
   const exclude: boolean = argv.exclude;

-  let failedSnapshots: IntegTestConfig[] = [];
+  let failedSnapshots: IntegTestWorkerConfig[] = [];
   if (argv['max-workers'] < testRegions.length * (profiles ?? [1]).length) {
     logger.warning('You are attempting to run %s tests in parallel, but only have %s workers. Not all of your profiles+regions will be utilized', argv.profiles*argv['parallel-regions'], argv['max-workers']);
   }
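The exact shapes of `IntegTestWorkerConfig` and `DestructiveChange` are not shown in this diff. The sketch below is inferred from how they are used in `cli.ts` (a `fileName`, an optional `destructiveChanges` list, and the fields printed by `printDestructiveChanges`); it is an assumption, not the real declarations from `./workers`.

```ts
// Assumed shapes, inferred from usage in cli.ts; the real types live in './workers'.
interface DestructiveChange {
  readonly stackName: string;
  readonly logicalId: string;
  readonly impact: string; // e.g. 'WILL_DESTROY' or 'WILL_REPLACE'
}

interface IntegTestWorkerConfig extends IntegTestConfig {
  readonly destructiveChanges?: DestructiveChange[];
}
```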
@@ -65,14 +68,19 @@ async function main() {
     testsFromArgs.push(...(await new IntegrationTests(argv.directory).fromCliArgs(argv._.map((x: any) => x.toString()), exclude)));
   }

-  // If `--force` is not used then first validate the snapshots and gather
-  // the failed snapshot tests. If `--force` is used then we will skip snapshot
-  // tests and run integration tests for all tests
+  // always run snapshot tests, but if '--force' is passed then
+  // run integration tests on all failed tests, not just those that
+  // failed snapshot tests
+  failedSnapshots = await runSnapshotTests(pool, testsFromArgs);
+  for (const failure of failedSnapshots) {
+    destructiveChanges.push(...failure.destructiveChanges ?? []);
+  }
   if (!argv.force) {
-    failedSnapshots = await runSnapshotTests(pool, testsFromArgs);
     testsToRun.push(...failedSnapshots);
   } else {
-    testsToRun.push(...testsFromArgs);
+    // if any of the tests failed snapshot tests, keep those results
+    // and merge with the rest of the tests from args
+    testsToRun.push(...mergeTests(testsFromArgs, failedSnapshots));
   }

   // run integration tests if `--update-on-failed` OR `--force` is used
@@ -85,6 +93,7 @@ async function main() {
       clean: argv.clean,
       dryRun: argv['dry-run'],
       verbose: argv.verbose,
+      updateWorkflow: !argv['disable-update-workflow'],
     });
     if (!success) {
       throw new Error('Some integration tests failed!');
@@ -101,13 +110,28 @@ async function main() {
     void pool.terminate();
   }

+  if (destructiveChanges.length > 0) {
+    printDestructiveChanges(destructiveChanges);
+    throw new Error('Some changes were destructive!');
+  }
   if (failedSnapshots.length > 0) {
     let message = '';
     if (!runUpdateOnFailed) {
       message = 'To re-run failed tests run: yarn integ-runner --update-on-failed';
     }
     throw new Error(`Some snapshot tests failed!\n${message}`);
   }
+
+}
+
+function printDestructiveChanges(changes: DestructiveChange[]): void {
+  if (changes.length > 0) {
+    logger.warning('!!! This test contains %s !!!', chalk.bold('destructive changes'));
+    changes.forEach(change => {
+      logger.warning('    Stack: %s - Resource: %s - Impact: %s', change.stackName, change.logicalId, change.impact);
+    });
+    logger.warning('!!! If these destructive changes are necessary, please indicate this on the PR !!!');
+  }
 }

 function printMetrics(metrics: IntegRunnerMetrics[]): void {
@@ -134,6 +158,18 @@ function arrayFromYargs(xs: string[]): string[] | undefined {
   return xs.filter(x => x !== '');
 }

+/**
+ * Merge the tests we received from command line arguments with
+ * tests that failed snapshot tests. The failed snapshot tests have additional
+ * information that we want to keep so this should override any test from args
+ */
+function mergeTests(testFromArgs: IntegTestConfig[], failedSnapshotTests: IntegTestWorkerConfig[]): IntegTestWorkerConfig[] {
+  const failedTestNames = new Set(failedSnapshotTests.map(test => test.fileName));
+  const final: IntegTestWorkerConfig[] = failedSnapshotTests;
+  final.push(...testFromArgs.filter(test => !failedTestNames.has(test.fileName)));
+  return final;
+}
+
 export function cli() {
   main().then().catch(err => {
     logger.error(err);
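To make the merge semantics of `mergeTests` above concrete, a small hypothetical usage follows; the file names and the `destructiveChanges` entry are invented, and the object shapes are simplified relative to the real `IntegTestConfig`/`IntegTestWorkerConfig` types (hence the casts).

```ts
// Hypothetical inputs: one test failed its snapshot check, one came only from the CLI args.
const fromArgs = [
  { fileName: 'integ.lambda.js' },
  { fileName: 'integ.new-test.js' },
];
const failedSnapshotTests = [
  {
    fileName: 'integ.lambda.js',
    destructiveChanges: [{ stackName: 'aws-cdk-lambda-1', logicalId: 'MyLambdaServiceRole4539ECB6', impact: 'WILL_DESTROY' }],
  },
];

// The failed-snapshot entry (with its destructiveChanges) wins for integ.lambda.js;
// integ.new-test.js is appended because it only appeared in the CLI arguments.
const merged = mergeTests(fromArgs as any, failedSnapshotTests as any);
// merged => [integ.lambda.js (with destructiveChanges), integ.new-test.js]
```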

packages/@aws-cdk/integ-runner/lib/runner/private/cloud-assembly.ts

+53 −4
@@ -2,6 +2,19 @@ import * as path from 'path';
 import { AssemblyManifest, Manifest, ArtifactType, AwsCloudFormationStackProperties, ArtifactManifest, MetadataEntry } from '@aws-cdk/cloud-assembly-schema';
 import * as fs from 'fs-extra';

+/**
+ * Trace information for a stack
+ * map of resource logicalId to trace message
+ */
+export type StackTrace = Map<string, string>;
+
+/**
+ * Trace information for an assembly
+ *
+ * map of stackId to StackTrace
+ */
+export type ManifestTrace = Map<string, StackTrace>;
+
 /**
  * Reads a Cloud Assembly manifest
  */
@@ -64,6 +77,17 @@
     return stacks;
   }

+  /**
+   * Write trace data to the assembly manifest metadata
+   */
+  public recordTrace(trace: ManifestTrace): void {
+    const newManifest = {
+      ...this.manifest,
+      artifacts: this.renderArtifacts(trace),
+    };
+    Manifest.saveAssemblyManifest(newManifest, this.manifestFileName);
+  }
+
   /**
    * Clean the manifest of any unnecessary data. Currently that includes
    * the metadata trace information since this includes trace information like
@@ -77,26 +101,51 @@
     Manifest.saveAssemblyManifest(newManifest, this.manifestFileName);
   }

-  private renderArtifactMetadata(metadata: { [id: string]: MetadataEntry[] } | undefined): { [id: string]: MetadataEntry[] } {
+  private renderArtifactMetadata(artifact: ArtifactManifest, trace?: StackTrace): { [id: string]: MetadataEntry[] } | undefined {
     const newMetadata: { [id: string]: MetadataEntry[] } = {};
-    for (const [metadataId, metadataEntry] of Object.entries(metadata ?? {})) {
+    if (!artifact.metadata) return artifact.metadata;
+    for (const [metadataId, metadataEntry] of Object.entries(artifact.metadata ?? {})) {
       newMetadata[metadataId] = metadataEntry.map((meta: MetadataEntry) => {
+        if (meta.type === 'aws:cdk:logicalId' && trace && meta.data) {
+          const traceData = trace.get(meta.data.toString());
+          if (traceData) {
+            trace.delete(meta.data.toString());
+            return {
+              type: meta.type,
+              data: meta.data,
+              trace: [traceData],
+            };
+          }
+        }
         // return metadata without the trace data
         return {
           type: meta.type,
           data: meta.data,
         };
       });
     }
+    if (trace && trace.size > 0) {
+      for (const [id, data] of trace.entries()) {
+        newMetadata[id] = [{
+          type: 'aws:cdk:logicalId',
+          data: id,
+          trace: [data],
+        }];
+      }
+    }
     return newMetadata;
   }

-  private renderArtifacts(): { [id: string]: ArtifactManifest } | undefined {
+  private renderArtifacts(trace?: ManifestTrace): { [id: string]: ArtifactManifest } | undefined {
     const newArtifacts: { [id: string]: ArtifactManifest } = {};
     for (const [artifactId, artifact] of Object.entries(this.manifest.artifacts ?? {})) {
+      let stackTrace: StackTrace | undefined = undefined;
+      if (artifact.type === ArtifactType.AWS_CLOUDFORMATION_STACK && trace) {
+        stackTrace = trace.get(artifactId);
+      }
       newArtifacts[artifactId] = {
         ...artifact,
-        metadata: this.renderArtifactMetadata(artifact.metadata),
+        metadata: this.renderArtifactMetadata(artifact, stackTrace),
       };
     }
     return newArtifacts;
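A brief, hypothetical example of driving the new `recordTrace` API above; the reader instance, stack id, logical id, and message are placeholders. The outer map is keyed by stack artifact id and the inner map by resource logicalId, matching the `ManifestTrace`/`StackTrace` types.

```ts
// Hypothetical usage of recordTrace(); how the reader is obtained is elided here.
declare const manifestReader: AssemblyManifestReader;

const trace: ManifestTrace = new Map([
  ['aws-cdk-lambda-1', new Map([
    ['MyLambdaServiceRole4539ECB6', 'This resource is being removed (WILL_DESTROY)'],
  ])],
]);

// Writes the trace entries into the manifest.json metadata for the stack artifact.
manifestReader.recordTrace(trace);
```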
@@ -0,0 +1,28 @@
+// Helper functions for CDK Exec
+import { spawnSync } from 'child_process';
+
+/**
+ * Our own execute function which doesn't use shells and strings.
+ */
+export function exec(commandLine: string[], options: { cwd?: string, verbose?: boolean, env?: any } = { }): any {
+  const proc = spawnSync(commandLine[0], commandLine.slice(1), {
+    stdio: ['ignore', 'pipe', options.verbose ? 'inherit' : 'pipe'], // inherit STDERR in verbose mode
+    env: {
+      ...process.env,
+      ...options.env,
+    },
+    cwd: options.cwd,
+  });
+
+  if (proc.error) { throw proc.error; }
+  if (proc.status !== 0) {
+    if (proc.stderr) { // will be 'null' in verbose mode
+      process.stderr.write(proc.stderr);
+    }
+    throw new Error(`Command exited with ${proc.status ? `status ${proc.status}` : `signal ${proc.signal}`}`);
+  }
+
+  const output = proc.stdout.toString('utf-8').trim();
+
+  return output;
+}
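For illustration, this is how the helper above could be used for the `cdk deploy --app /path/to/snapshot` step mentioned in the PR description; the snapshot path is the same placeholder used there and the working directory is invented.

```ts
// Example call of exec(): deploy the existing snapshot before deploying the updated test.
const output = exec(['cdk', 'deploy', '--app', '/path/to/snapshot', '--require-approval', 'never'], {
  cwd: '/path/to/test/dir', // placeholder working directory
  verbose: true,            // stream stderr to the terminal instead of capturing it
});
console.log(output);
```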
