ci: use git submodules for (securely) using third party Github Actions (#12709)

* Use git submodules for (securely) using third party Github Actions

List of repositories added as submodules:

EndBug/latest-tag@latest
morrisoncole/pr-lint-action@v1.4.1
trilom/file-changes-action@v1.2.4
styfle/cancel-workflow-action@0.6.0
apache-superset/cached-dependencies@b90713b
unsplash/comment-on-pr@v1.2.0
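For reference, the general pattern looks roughly like the sketch below: each third-party action is vendored as a git submodule pinned to a reviewed commit, and workflows reference it by a local path instead of `owner/repo@ref`. The directory layout and the `cancel-workflow-action` step shown here are illustrative assumptions, not necessarily the exact paths used in this PR.

```bash
# Illustrative sketch (assumed layout): pin a third-party action as a submodule.
git submodule add https://github.com/styfle/cancel-workflow-action \
  .github/actions/cancel-workflow-action
cd .github/actions/cancel-workflow-action
git checkout 0.6.0          # pin to the reviewed tag/commit
cd - && git add .gitmodules .github/actions/cancel-workflow-action

# A workflow step can then use the local, pinned copy
# (after actions/checkout with `submodules: true` so the submodule is present):
#   - uses: ./.github/actions/cancel-workflow-action
```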
Tobiasz Kędzierski authored 2021-01-24 08:10:16 +01:00, committed by GitHub
parent ef839f674d
commit 1f27b62d51
GPG Key ID: 4AEE18F83AFDEB23
196 changed files with 131 additions and 62488 deletions

@@ -0,0 +1 @@
Subproject commit b90713be305978a582ff222db84f03262fce5416

@@ -1 +0,0 @@
indent_size = 2

@@ -1,3 +0,0 @@
dist/
lib/
node_modules/

@@ -1,26 +0,0 @@
module.exports = {
plugins: ['jest', '@typescript-eslint'],
extends: ['plugin:jest/all'],
parser: '@typescript-eslint/parser',
parserOptions: {
ecmaVersion: 9,
sourceType: 'module',
},
rules: {
'eslint-comments/no-use': 'off',
'import/no-namespace': 'off',
'no-unused-vars': 'off',
'no-console': 'off',
'jest/prefer-expect-assertions': 'off',
'jest/no-disabled-tests': 'warn',
'jest/no-focused-tests': 'error',
'jest/no-identical-title': 'error',
'jest/prefer-to-have-length': 'warn',
'jest/valid-expect': 'error',
},
env: {
node: true,
es6: true,
'jest/globals': true,
},
};

@@ -1,34 +0,0 @@
name: Tests
on:
pull_request:
paths-ignore:
- '**.md'
push:
branches:
- master
paths-ignore:
- '**.md'
jobs:
test:
strategy:
matrix:
os: [ubuntu-latest, macOS-latest]
name: Test on ${{ matrix.os }}
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v1
- uses: actions/setup-node@v1
with:
node-version: '12.x'
- name: Install dependencies
run: npm ci
- name: Run prettier format check
run: npm run format-check
- name: Build
run: npm run build
- name: Run tests
run: npm run test
- name: Upload code coverage
run: |
bash <(curl -s https://codecov.io/bash)

@@ -1,6 +0,0 @@
lib
coverage
node_modules
!dist
!dist/cache

@@ -1,3 +0,0 @@
dist/
lib/
node_modules/

@@ -1,11 +0,0 @@
{
"printWidth": 80,
"tabWidth": 2,
"useTabs": false,
"semi": true,
"singleQuote": true,
"trailingComma": "all",
"bracketSpacing": true,
"arrowParens": "avoid",
"parser": "typescript"
}

@@ -1,22 +0,0 @@
The MIT License (MIT)
Copyright (c) 2018 GitHub, Inc. and contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

@@ -1,212 +0,0 @@
# cached-dependencies
[![](https://github.com/ktmud/cached-dependencies/workflows/Tests/badge.svg)](https://github.com/ktmud/cached-dependencies/actions?query=workflow%3ATests) [![codecov](https://codecov.io/gh/ktmud/cached-dependencies/branch/master/graph/badge.svg)](https://codecov.io/gh/ktmud/cached-dependencies)
Enable **multi-layer cache** and **shortcut commands** in any workflows.
Manage multiple cache targets in one step. Use either the built-in cache configs for npm, yarn, and pip, or write your own. Create a bash command library to easily reduce redundancy across workflows. Most useful for building webapps that require multi-stage build processes.
This is your all-in-one action for everything related to setting up dependencies with cache.
## Inputs
- **run**: bash commands to run, allows shortcut commands
- **caches**: path to a JS module that defines cache targets, defaults to `.github/workflows/caches.js`
- **bashlib**: path to a Bash script that defines shortcut commands, defaults to `.github/workflows/bashlib.sh`
- **parallel**: whether to run the commands in parallel with node subprocesses
## Examples
The following workflow sets up dependencies for a typical Python web app, with both `~/.pip` and `~/.npm` caches configured in one simple step:
```yaml
jobs:
  build_and_test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2
      - name: Install dependencies
        uses: ktmud/cached-dependencies@v1
        with:
          run: |
            npm-install
            npm run build
            pip-install
            python ./bin/manager.py fill_test_data
```
Here we use the predefined `npm-install` and `pip-install` commands to install dependencies with their corresponding caches.
You may also replace `npm-install` with `yarn-install` to install npm packages with `yarn.lock`:
```yaml
- name: Install dependencies
  uses: ktmud/cached-dependencies@v1
  with:
    run: |
      yarn-install
      yarn build
      pip-install
      python ./bin/manager.py fill_test_data
```
See below for more details.
## Usage
### Cache configs
Under the hood, we use [@actions/cache](https://github.com/marketplace/actions/cache) to manage cache storage. But instead of defining one cache at a time and specifying it in workflow YAML, you manage all caches in a separate JS file: `.github/workflows/caches.js`.
Here is [the default configuration](https://github.com/ktmud/cached-dependencies/blob/master/src/cache/caches.ts) for Linux:
```js
module.exports = {
  pip: {
    path: [`${process.env.HOME}/.cache/pip`],
    hashFiles: ['requirements*.txt'],
    keyPrefix: 'pip-',
    restoreKeys: 'pip-',
  },
  npm: {
    path: [`${process.env.HOME}/.npm`],
    hashFiles: [
      `package-lock.json`,
      `*/*/package-lock.json`,
      `!node_modules/*/package-lock.json`,
    ],
  },
  yarn: {
    path: [`${process.env.HOME}/.npm`],
    // */* is for supporting a lerna monorepo with depth=2
    hashFiles: [`yarn.lock`, `*/*/yarn.lock`, `!node_modules/*/yarn.lock`],
  },
};
```
Here, `hashFiles` and `keyPrefix` are used to compute the primary cache key passed to [@actions/cache](https://github.com/marketplace/actions/cache). `keyPrefix` defaults to `${cacheName}-`, and `restoreKeys` defaults to `keyPrefix` if not specified.
It is recommended to always use absolute paths in these configs so you can share them across different workflows more easily (in case the action is called from different working directories).
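As a small, hypothetical illustration of those defaults (the `cypress` cache path below is made up for this example), a custom entry in `.github/workflows/caches.js` only needs `path` and `hashFiles`; the primary key then becomes `cypress-<hash>` and the restore key falls back to `cypress-`:

```js
// Hypothetical extra cache target relying on the defaults described above:
// keyPrefix defaults to `cypress-` and restoreKeys falls back to keyPrefix.
module.exports = {
  cypress: {
    path: [`${process.env.HOME}/.cache/Cypress`], // assumed cache location
    hashFiles: ['package-lock.json'],
  },
};
```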
#### Specify when to restore and save
With the predefined `cache-restore` and `cache-save` bash commands, you have full flexibility over when to restore and save each cache:
```yaml
steps:
  - uses: actions/checkout@v2
  - uses: ktmud/cached-dependencies@v1
    with:
      run: |
        cache-restore npm
        npm install
        cache-save npm
        cache-restore pip
        pip install -r requirements.txt
        cache-save pip
```
### Shortcut commands
All predefined shortcut commands can be found [here](https://github.com/ktmud/cached-dependencies/blob/master/src/scripts/bashlib.sh). You can also customize them or add new ones in `.github/workflows/bashlib.sh`.
For example, if you want to install additional packages before saving the `pip` cache, simply add this to the `bashlib.sh` file:
```bash
# override the default `pip-install` command
pip-install() {
  cd $GITHUB_WORKSPACE
  cache-restore pip
  echo "::group::pip install"
  pip install -r requirements.txt # prod requirements
  pip install -r requirements-dev.txt # dev requirements
  pip install -e ".[postgres,mysql]" # current package with some extras
  echo "::endgroup::"
  cache-save pip
}
```
### Default setup command
When `run` is not provided:
```yaml
jobs:
  build:
    name: Build
    steps:
      - name: Install dependencies
        uses: ktmud/cached-dependencies@v1
```
You must provide a `default-setup-command` in the bashlib. For example,
```bash
default-setup-command() {
  pip-install & npm-install
}
```
This will start installing pip and npm dependencies at the same time.
### Customize config locations
Both config files, `.github/workflows/bashlib.sh` and `.github/workflows/caches.js`, can be placed in other locations:
```yaml
- uses: ktmud/cached-dependencies@v1
  with:
    caches: ${{ github.workspace }}/.github/configs/caches.js
    bashlib: ${{ github.workspace }}/.github/configs/bashlib.sh
```
### Run commands in parallel
When `parallel` is set to `true`, the `run` input will be split into an array of commands and passed to `Promise.all(...)` to execute in parallel. For example,
```yaml
- uses: ktmud/cached-dependencies@v1
  with:
    parallel: true
    run: |
      pip-install
      npm-install
```
is equivalent to
```yaml
- uses: ktmud/cached-dependencies@v1
  with:
    run: |
      pip-install & npm-install
```
If one or more of your commands must span multiple lines, separate the parallel command groups with a blank line. The commands within each group will still run sequentially:
```yaml
- uses: ktmud/cached-dependencies@v1
  with:
    run: |
      cache-restore pip
      pip install -r requirements*.txt
      # additional pip packages
      pip install package1 package2 package3
      cache-save pip

      npm-install

      cache-restore cypress
      cd cypress/ && npm install
      cache-save cypress
```
## License
This project is released under [the MIT License](LICENSE).

@@ -1,124 +0,0 @@
import path from 'path';
import * as fs from 'fs';
import * as os from 'os';
import * as core from '@actions/core';
import * as cache from '../src/cache';
import * as inputsUtils from '../src/utils/inputs';
import * as actionUtils from '@actions/cache/src/utils/actionUtils';
import defaultCaches from '../src/cache/caches';
import { setInputs, getInput, maybeArrayToString } from '../src/utils/inputs';
import { Inputs, InputName, GitHubEvent, EnvVariable } from '../src/constants';
import caches, { npmExpectedHash } from './fixtures/caches';
describe('patch core states', () => {
it('should log error if states file invalid', () => {
const logWarningMock = jest.spyOn(actionUtils, 'logWarning');
fs.writeFileSync(`${os.tmpdir()}/cached--states.json`, 'INVALID_JSON', {
encoding: 'utf-8',
});
core.getState('haha');
expect(logWarningMock).toHaveBeenCalledTimes(2);
});
it('should persist state', () => {
core.saveState('test', '100');
expect(core.getState('test')).toStrictEqual('100');
});
});
describe('cache runner', () => {
it('should use default cache config', async () => {
await cache.loadCustomCacheConfigs();
// but `npm` actually comes from `src/cache/caches.ts`
const inputs = await cache.getCacheInputs('npm');
expect(inputs?.[InputName.Path]).toStrictEqual(
maybeArrayToString(defaultCaches.npm.path),
);
expect(inputs?.[InputName.RestoreKeys]).toStrictEqual('npm-');
});
it('should override cache config', async () => {
setInputs({
[InputName.Caches]: path.resolve(__dirname, 'fixtures/caches'),
});
await cache.loadCustomCacheConfigs();
const inputs = await cache.getCacheInputs('npm');
expect(inputs?.[InputName.Path]).toStrictEqual(
maybeArrayToString(caches.npm.path),
);
expect(inputs?.[InputName.Key]).toStrictEqual(`npm-${npmExpectedHash}`);
expect(inputs?.[InputName.RestoreKeys]).toStrictEqual(
maybeArrayToString(caches.npm.restoreKeys),
);
});
it('should apply inputs and restore cache', async () => {
setInputs({
[InputName.Caches]: path.resolve(__dirname, 'fixtures/caches'),
[EnvVariable.GitHubEventName]: GitHubEvent.PullRequest,
});
const setInputsMock = jest.spyOn(inputsUtils, 'setInputs');
const inputs = await cache.getCacheInputs('npm');
const result = await cache.run('restore', 'npm');
expect(result).toBeUndefined();
// before run
expect(setInputsMock).toHaveBeenNthCalledWith(1, inputs);
// after run
expect(setInputsMock).toHaveBeenNthCalledWith(2, {
[InputName.Key]: '',
[InputName.Path]: '',
[InputName.RestoreKeys]: '',
});
// inputs actually restored to original value
expect(getInput(InputName.Key)).toStrictEqual('');
// pretend still in execution context
setInputs(inputs as Inputs);
// `core.getState` should return the primary key
expect(core.getState('CACHE_KEY')).toStrictEqual(inputs?.[InputName.Key]);
setInputsMock.mockRestore();
});
it('should run saveCache', async () => {
// call to save should also work
const logWarningMock = jest.spyOn(actionUtils, 'logWarning');
setInputs({
[InputName.Parallel]: 'true',
});
await cache.run('save', 'npm');
expect(logWarningMock).toHaveBeenCalledWith(
'Cache Service Url not found, unable to restore cache.',
);
});
it('should exit on invalid args', async () => {
// other calls do generate errors
const processExitMock = jest
.spyOn(process, 'exit')
// @ts-ignore
.mockImplementation(() => {});
// incomplete arguments
await cache.run();
await cache.run('save');
// bad arguments
await cache.run('save', 'unknown-cache');
await cache.run('unknown-action', 'unknown-cache');
setInputs({
[InputName.Caches]: 'non-existent',
});
await cache.run('save', 'npm');
expect(processExitMock).toHaveBeenCalledTimes(5);
});
});

@@ -1,5 +0,0 @@
#!/bin/bash
default-setup-command() {
print-cachescript-path
}

@@ -1,14 +0,0 @@
/**
* Example cache config.
*/
export const npmHashFiles = ['.*ignore'];
export const npmExpectedHash =
'13ed29a1c7ec906e7dcb20626957ebfcd3f0f2174bd2685a012105792bf1ff55';
export default {
npm: {
path: [`~/.npm`],
hashFiles: npmHashFiles,
restoreKeys: 'node-npm-',
},
};

@@ -1,101 +0,0 @@
/**
* Test default runner.
*/
import { setInputs } from '../src/utils/inputs';
import { InputName, DefaultInputs } from '../src/constants';
import * as setup from '../src/setup';
import path from 'path';
const extraBashlib = path.resolve(__dirname, './fixtures/bashlib.sh');
describe('setup runner', () => {
// don't actually run the bash script
const runCommandMock = jest.spyOn(setup, 'runCommand');
it('should allow custom bashlib', async () => {
setInputs({
[InputName.Bashlib]: extraBashlib,
});
await setup.run();
expect(runCommandMock).toHaveBeenCalledTimes(1);
expect(runCommandMock).toHaveBeenCalledWith(
DefaultInputs[InputName.Run],
extraBashlib,
);
});
it('should allow inline bash overrides', async () => {
const processExitMock = jest
.spyOn(process, 'exit')
// @ts-ignore
.mockImplementation(() => {});
setInputs({
[InputName.Bashlib]: '',
[InputName.Parallel]: 'false',
[InputName.Run]: `
${DefaultInputs[InputName.Run]}() {
echo "It works!"
exit 202
}
${DefaultInputs[InputName.Run]}
`,
});
// allow the bash script to run for one test, but override the default
await setup.run();
expect(runCommandMock).toHaveBeenCalledTimes(1);
expect(processExitMock).toHaveBeenCalledTimes(1);
expect(processExitMock).toHaveBeenCalledWith(1);
});
it('should use run commands', async () => {
// don't run the commands when there are no overrides
runCommandMock.mockImplementation(async () => {});
setInputs({
[InputName.Bashlib]: 'non-existent',
[InputName.Run]: 'print-cachescript-path',
});
await setup.run();
expect(runCommandMock).toHaveBeenCalledTimes(1);
expect(runCommandMock).toHaveBeenCalledWith('print-cachescript-path', '');
});
it('should handle single-new-line parallel commands', async () => {
setInputs({
[InputName.Run]: `
test-command-1
test-command-2
`,
[InputName.Parallel]: 'true',
});
await setup.run();
expect(runCommandMock).toHaveBeenNthCalledWith(1, 'test-command-1', '');
expect(runCommandMock).toHaveBeenNthCalledWith(2, 'test-command-2', '');
});
it('should handle multi-new-line parallel commands', async () => {
setInputs({
[InputName.Run]: `
test-1-1
test-1-2
test-2
`,
[InputName.Parallel]: 'true',
});
await setup.run();
expect(runCommandMock).toHaveBeenNthCalledWith(
1,
'test-1-1\n test-1-2',
'',
);
expect(runCommandMock).toHaveBeenNthCalledWith(2, 'test-2', '');
});
});

@@ -1,10 +0,0 @@
{
"extends": "../tsconfig.json",
"compilerOptions": {
"baseUrl": "./",
"outDir": "../build",
"noEmit": true,
"rootDir": "../"
},
"exclude": ["node_modules"]
}

@@ -1,25 +0,0 @@
name: Cached Dependencies
description: Setup multi-layered cache and dependencies in one step, share predefined commands across workflows
author: Jesse Yang <hello@yjc.me>
branding:
icon: layers
color: yellow
inputs:
caches:
required: false
description: Path to a JS file with cache configs
default: ${{ github.workspace }}/.github/workflows/caches.js
bashlib:
required: false
description: Path to a Bash script with command shortcuts
default: ${{ github.workspace }}/.github/workflows/bashlib.sh
run:
required: false
description: Setup commands to run, can use shortcuts defined in bashlib
default: default-setup-command
parallel:
required: false
description: Whether to run commands in parallel
runs:
using: node12
main: dist/index.js

File diff suppressed because it is too large

File diff suppressed because it is too large

@@ -1,57 +0,0 @@
'use strict';
const fs = require('fs');
const crypto = require('crypto');
const {parentPort} = require('worker_threads');
const handlers = {
hashFile: (algorithm, filePath) => new Promise((resolve, reject) => {
const hasher = crypto.createHash(algorithm);
fs.createReadStream(filePath)
// TODO: Use `Stream.pipeline` when targeting Node.js 12.
.on('error', reject)
.pipe(hasher)
.on('error', reject)
.on('finish', () => {
const {buffer} = hasher.read();
resolve({value: buffer, transferList: [buffer]});
});
}),
hash: async (algorithm, input) => {
const hasher = crypto.createHash(algorithm);
if (Array.isArray(input)) {
for (const part of input) {
hasher.update(part);
}
} else {
hasher.update(input);
}
const hash = hasher.digest().buffer;
return {value: hash, transferList: [hash]};
}
};
parentPort.on('message', async message => {
try {
const {method, args} = message;
const handler = handlers[method];
if (handler === undefined) {
throw new Error(`Unknown method '${method}'`);
}
const {value, transferList} = await handler(...args);
parentPort.postMessage({id: message.id, value}, transferList);
} catch (error) {
const newError = {message: error.message, stack: error.stack};
for (const [key, value] of Object.entries(error)) {
if (typeof value !== 'object') {
newError[key] = value;
}
}
parentPort.postMessage({id: message.id, error: newError});
}
});

@@ -1,21 +0,0 @@
module.exports = {
clearMocks: true,
moduleFileExtensions: ['js', 'ts'],
testEnvironment: 'node',
testMatch: ['**/*.test.ts'],
transform: {
'^.+\\.ts$': 'ts-jest',
},
transformIgnorePatterns: [
'/node_modules/(?!@actions).+\\.js$',
],
verbose: true,
};
// suppress debug messages
const processStdoutWrite = process.stdout.write.bind(process.stdout);
process.stdout.write = (str, encoding, cb) => {
processStdoutWrite(str.split('\n').filter(x => {
return !/^::debug::/.test(x);
}).join('\n'), encoding, cb);
};

File diff suppressed because it is too large

@@ -1,47 +0,0 @@
{
"name": "setup-superset-action",
"version": "1.0.0",
"private": true,
"keywords": [
"actions",
"node",
"setup",
"superset"
],
"main": "dist/run",
"scripts": {
"all": "npm run format && npm run lint && npm run test && npm run build",
"build": "npm run clean && tsc && ncc build -o dist src/run.ts && ncc build -o dist/scripts/cache src/scripts/cache.ts",
"clean": "rm -rf ./lib ./dist",
"coverage": "npm run test && open ./coverage/lcov-report/index.html",
"format": "prettier --write **/*.ts",
"format-check": "prettier --check **/*.ts",
"lint": "eslint src/**/*.ts",
"test": "jest --clearCache && jest --coverage"
},
"dependencies": {
"@actions/cache": "actions/cache#d29c1df198dd38ac88e0ae23a2881b99c2d20e68",
"@actions/core": "1.2.4",
"@actions/exec": "1.0.4",
"@actions/glob": "0.1.0",
"@types/uuid": "7.0.4",
"hasha": "5.2.0",
"tempy": "0.6.0",
"uuid": "7.0.3"
},
"devDependencies": {
"@types/jest": "26.0.7",
"@types/node": "12.12.53",
"@typescript-eslint/eslint-plugin": "3.7.1",
"@typescript-eslint/parser": "3.7.1",
"@zeit/ncc": "0.22.3",
"eslint": "7.5.0",
"eslint-plugin-jest": "23.19.0",
"jest": "26.1.0",
"js-yaml": "3.14.0",
"prettier": "2.0.5",
"prettier-plugin-packagejson": "2.2.5",
"ts-jest": "26.1.4",
"typescript": "3.9.7"
}
}

@@ -1,5 +0,0 @@
{
"extends": [
"config:base"
]
}

@@ -1,49 +0,0 @@
/**
* Default cache configs
*/
import * as os from 'os';
export interface CacheConfig {
path: string[] | string;
hashFiles: string[] | string;
keyPrefix?: string;
restoreKeys?: string[] | string;
}
export interface CacheConfigs {
[cacheName: string]: CacheConfig;
}
const { HOME = '~' } = process.env;
const platform = os.platform() as 'linux' | 'darwin' | 'win32';
const pathByPlatform = {
linux: {
pip: `${HOME}/.cache/pip`,
},
darwin: {
pip: `${HOME}/Library/Caches/pip`,
},
win32: {
pip: `${HOME}\\AppData\\Local\\pip\\Cache`,
},
};
export default {
pip: {
path: pathByPlatform[platform].pip,
hashFiles: 'requirements*.txt',
},
npm: {
path: `${HOME}/.npm`,
hashFiles: [
`package-lock.json`,
// support lerna monorepo with depth=2
`*/*/package-lock.json`,
`!node_modules/*/package-lock.json`,
],
},
yarn: {
path: `${HOME}/.npm`,
hashFiles: [`yarn.lock`, `*/*/yarn.lock`, `!node_modules/*/yarn.lock`],
},
} as CacheConfigs;

@@ -1,146 +0,0 @@
/**
* Execute @actions/cache with predefined cache configs.
*/
import { beginImport, doneImport } from './patch'; // monkey patch @actions modules
beginImport();
import saveCache from '@actions/cache/src/save';
import restoreCache from '@actions/cache/src/restore';
doneImport();
import hasha from 'hasha';
import * as fs from 'fs';
import * as core from '@actions/core';
import * as glob from '@actions/glob';
import { Inputs, InputName, DefaultInputs } from '../constants';
import { applyInputs, getInput, maybeArrayToString } from '../utils/inputs';
import caches from './caches'; // default cache configs
// GitHub uses `sha256` for the built-in `${{ hashFiles(...) }}` expression
// https://help.github.com/en/actions/reference/context-and-expression-syntax-for-github-actions#hashfiles
const HASH_OPTION = { algorithm: 'sha256' };
/**
* Load custom cache configs from the `caches` path defined in inputs.
*
* @returns Whether the loading is successful.
*/
export async function loadCustomCacheConfigs() {
const customCachePath = getInput(InputName.Caches);
try {
core.debug(`Reading cache configs from '${customCachePath}'`);
const customCache = await import(customCachePath);
Object.assign(caches, customCache.default);
} catch (error) {
if (
customCachePath !== DefaultInputs[InputName.Caches] ||
!error.message.includes('Cannot find module')
) {
core.error(error.message);
core.setFailed(
`Failed to load custom cache configs: '${customCachePath}'`,
);
return process.exit(1);
}
}
return true;
}
/**
* Generate SHA256 hash for a list of files matched by glob patterns.
*
* @param {string[]} patterns - The glob patterns.
* @param {string} extra - The extra string to append to the file hashes to
* compute the final hash.
*/
export async function hashFiles(
patterns: string[] | string,
extra: string = '',
) {
const globber = await glob.create(maybeArrayToString(patterns));
let hash = '';
let counter = 0;
for await (const file of globber.globGenerator()) {
if (!fs.statSync(file).isDirectory()) {
hash += hasha.fromFileSync(file, HASH_OPTION);
counter += 1;
}
}
core.debug(`Computed hash for ${counter} files. Pattern: ${patterns}`);
return hasha(hash + extra, HASH_OPTION);
}
/**
* Generate GitHub Action inputs based on predefined cache config. Will be used
* to override env variables.
*
* @param {string} cacheName - Name of the predefined cache config.
*/
export async function getCacheInputs(
cacheName: string,
): Promise<Inputs | null> {
if (!(cacheName in caches)) {
return null;
}
const { keyPrefix, restoreKeys, path, hashFiles: patterns } = caches[
cacheName
];
const pathString = maybeArrayToString(path);
const prefix = keyPrefix || `${cacheName}-`;
// include `path` in the hash, too, so caches are busted in case users change
// the path definition.
const hash = await hashFiles(patterns, pathString);
return {
[InputName.Key]: `${prefix}${hash}`,
[InputName.Path]: pathString,
// only use the prefix as the restore key if `restoreKeys` is not defined
[InputName.RestoreKeys]:
restoreKeys === undefined ? prefix : maybeArrayToString(restoreKeys),
};
}
export const actions = {
restore(inputs: Inputs) {
return applyInputs(inputs, restoreCache);
},
save(inputs: Inputs) {
return applyInputs(inputs, saveCache);
},
};
export type ActionChoice = keyof typeof actions;
export async function run(
action: string | undefined = undefined,
cacheName: string | undefined = undefined,
) {
if (!action || !(action in actions)) {
core.setFailed(`Choose a cache action from: [restore, save]`);
return process.exit(1);
}
if (!cacheName) {
core.setFailed(`Must provide a cache name.`);
return process.exit(1);
}
const runInParallel = getInput(InputName.Parallel);
if (await loadCustomCacheConfigs()) {
if (runInParallel) {
core.info(`${action.toUpperCase()} cache for ${cacheName}`);
} else {
core.startGroup(`${action.toUpperCase()} cache for ${cacheName}`);
}
const inputs = await getCacheInputs(cacheName);
if (inputs) {
core.info(JSON.stringify(inputs, null, 2));
await actions[action as ActionChoice](inputs);
} else {
core.setFailed(`Cache '${cacheName}' not defined, failed to ${action}.`);
return process.exit(1);
}
if (!runInParallel) {
core.endGroup();
}
}
}

@@ -1,95 +0,0 @@
/**
* Monkey patch to safely import and use @actions/cache modules
*/
import * as utils from '@actions/cache/src/utils/actionUtils';
import * as core from '@actions/core';
import * as fs from 'fs';
import * as os from 'os';
import { InputName } from '../constants';
import { getInput } from '../utils/inputs';
interface KeyValueStore {
[key: string]: any;
}
const { logWarning, isValidEvent } = utils;
const { getState, saveState } = core;
function getStateStoreFile() {
const cacheName = getInput(InputName.Key);
return `${os.tmpdir()}/cached-${cacheName}-states.json`;
}
/**
* Load states from the persistent store.
*
* The default `core.saveState` only writes states as command output, and
* `core.getState` can only read that state in a later step via ENV
* variables.
*
* So we use a temp file to save and load states, allowing persistent
* states within the same step.
*
* Since the state output is not unique to each cache, every cache should have its
* own file for persistent states.
*/
function loadStates() {
const stateStore = getStateStoreFile();
const states: KeyValueStore = {};
try {
Object.assign(
states,
JSON.parse(fs.readFileSync(stateStore, { encoding: 'utf-8' })),
);
core.debug(`Loaded states from: ${stateStore}`)
} catch (error) {
// pass
if (error.code !== 'ENOENT') {
utils.logWarning(`Could not load states: ${stateStore}`)
utils.logWarning(error.message);
}
}
return states;
}
/**
* Save states to the persistent storage.
*/
function persistState(name: string, value: any) {
const states = loadStates();
const stateStore = getStateStoreFile();
const valueString = typeof value === 'string' ? value : JSON.stringify(value);
// make sure value is always string
states[name] = valueString;
// persist state in the temp file
fs.writeFileSync(stateStore, JSON.stringify(states, null, 2), {
encoding: 'utf-8',
});
core.debug(`Persist state "${name}=${valueString}" to ${stateStore}`);
// still pass the original value to the original function, though
return saveState(name, value);
}
/**
* Get states from persistent store, fallback to "official" states.
*/
function obtainState(name: string) {
const states = loadStates();
return states[name] || getState(name);
}
export function beginImport() {
Object.defineProperty(utils, 'isValidEvent', { value: () => false });
Object.defineProperty(utils, 'logWarning', { value: () => {} });
}
export function doneImport() {
Object.defineProperty(utils, 'isValidEvent', { value: isValidEvent });
Object.defineProperty(utils, 'logWarning', { value: logWarning });
Object.defineProperty(core, 'saveState', { value: persistState });
Object.defineProperty(core, 'getState', { value: obtainState });
}

@@ -1,43 +0,0 @@
// Possible input names
export enum InputName {
// @actions/cache specific inputs
Key = 'key',
Path = 'path',
RestoreKeys = 'restore-keys',
// setup-webapp specific inputs
Run = 'run',
Caches = 'caches',
Bashlib = 'bashlib',
Parallel = 'parallel',
}
// Possible GitHub event names
export enum GitHubEvent {
Push = 'push',
PullRequest = 'pull_request',
}
// Directly available environment variables
export enum EnvVariable {
GitHubEventName = 'GITHUB_EVENT_NAME',
}
export const EnvVariableNames = new Set(Object.values(EnvVariable) as string[]);
export interface Inputs {
[EnvVariable.GitHubEventName]?: string;
[InputName.Key]?: string;
[InputName.RestoreKeys]?: string;
[InputName.Path]?: string;
[InputName.Caches]?: string;
[InputName.Bashlib]?: string;
[InputName.Run]?: string;
[InputName.Parallel]?: string;
}
export const DefaultInputs = {
[InputName.Caches]: '.github/workflows/caches.js',
[InputName.Bashlib]: '.github/workflows/bashlib.sh',
[InputName.Run]: 'default-setup-command',
} as Inputs;

@@ -1,3 +0,0 @@
import { run } from './setup';
run();

@@ -1,61 +0,0 @@
#!/bin/bash
# -----------------------------------------------
# Predefined command shortcuts
# -----------------------------------------------
# Exit if any command fails
set -e
bashSource=${BASH_SOURCE[${#BASH_SOURCE[@]} - 1]:-${(%):-%x}}
cacheScript="$(dirname $(dirname $(dirname $bashSource)))/dist/scripts/cache"
print-cachescript-path() {
echo $cacheScript
}
cache-restore() {
node $cacheScript restore $1
}
cache-save() {
node $cacheScript save $1
}
# install python packages
pip-install() {
cache-restore pip
echo "::group::Install Python pacakges"
pip install -r requirements.txt # install dependencies
pip install -e . # install current directory as editable python package
echo "::endgroup"
cache-save pip
}
# install npm packages
npm-install() {
cache-restore npm
echo "::group::Install npm pacakges"
echo "npm: $(npm --version)"
echo "node: $(node --version)"
npm ci
echo "::endgroup::"
cache-save npm
}
# install npm packages via yarn
yarn-install() {
cache-restore yarn
echo "::group::Install npm pacakges via yarn"
echo "npm: $(npm --version)"
echo "node: $(node --version)"
echo "yarn: $(yarn --version)"
yarn
echo "::endgroup::"
cache-save yarn
}
# the default setup command asks users to provide their own `run` commands
default-setup-command() {
echo 'Please provide `run` commands or configure `default-setup-command`.'
exit 1
}

@@ -1,18 +0,0 @@
/**
* Runner script to restore/save caches using predefined configs.
* Used in `scripts/bashlib.sh`.
*/
import { EnvVariable } from '../constants';
// To import `@actions/cache` modules safely, we must set GitHub event name to
// an invalid value, so the actual runner code doesn't execute.
const originalEvent = process.env[EnvVariable.GitHubEventName];
process.env[EnvVariable.GitHubEventName] = 'CACHE_HACK';
import { run } from '../cache';
// then we restore the event name before the job actually runs
process.env[EnvVariable.GitHubEventName] = originalEvent;
// @ts-ignore
run(...process.argv.slice(2));

@@ -1,66 +0,0 @@
/**
* Load inputs and execute.
*/
import * as core from '@actions/core';
import { exec } from '@actions/exec';
import path from 'path';
import fs from 'fs';
import { DefaultInputs, InputName } from './constants';
import { getInput } from './utils/inputs';
const SHARED_BASHLIB = path.resolve(__dirname, '../src/scripts/bashlib.sh');
/**
* Run bash commands with predefined lib functions.
*
* @param {string} cmd - The bash commands to execute.
*/
export async function runCommand(
cmd: string,
extraBashlib: string,
): Promise<void> {
const bashlibCommands = [`source ${SHARED_BASHLIB}`];
if (extraBashlib) {
bashlibCommands.push(`source ${extraBashlib}`);
}
try {
await exec('bash', ['-c', [...bashlibCommands, cmd].join('\n ')]);
} catch (error) {
core.setFailed(error.message);
process.exit(1);
}
}
export async function run(): Promise<void> {
let bashlib = getInput(InputName.Bashlib);
const rawCommands = getInput(InputName.Run);
const runInParallel = getInput(InputName.Parallel);
if (!fs.existsSync(bashlib)) {
if (bashlib !== DefaultInputs[InputName.Bashlib]) {
core.error(`Custom bashlib "${bashlib}" does not exist.`);
}
// don't add bashlib to runCommand
bashlib = '';
}
if (runInParallel) {
// Attempt to split by two or more new lines first; if there is still only
// one command, attempt to split by one new line. This is because users
// asked for parallelization, so we make our best efforts to get multiple
// commands.
let commands = rawCommands.split(/\n{2,}/);
if (commands.length === 1) {
commands = rawCommands.split('\n');
}
core.debug(`>> Run ${commands.length} commands in parallel...`);
await Promise.all(
commands
.map(x => x.trim())
.filter(x => !!x)
.map(cmd => exports.runCommand(cmd, bashlib)),
);
} else if (rawCommands) {
await exports.runCommand(rawCommands, bashlib);
}
}

@@ -1,2 +0,0 @@
declare module '@actions/cache/dist/restore';
declare module '@actions/cache/dist/save';

@@ -1,61 +0,0 @@
/**
* Manage inputs and env variables.
*/
import * as core from '@actions/core';
import {
Inputs,
EnvVariableNames,
InputName,
DefaultInputs,
} from '../constants';
export function getInput(name: keyof Inputs): string {
const value = core.getInput(name);
if (name === InputName.Parallel) {
return value.toUpperCase() === 'TRUE' ? value : '';
}
return value || DefaultInputs[name] || '';
}
/**
* Update env variables associated with some inputs.
* See: https://github.com/actions/toolkit/blob/5b940ebda7e7b86545fe9741903c930bc1191eb0/packages/core/src/core.ts#L69-L77 .
*
* @param {Inputs} inputs - The new inputs to apply to the env variables.
*/
export function setInputs(inputs: Inputs): void {
for (const [name, value] of Object.entries(inputs)) {
const envName = EnvVariableNames.has(name)
? name
: `INPUT_${name.replace(/ /g, '_').toUpperCase()}`;
process.env[envName] = value;
}
}
/**
* Apply new inputs and execute a runner function, restore them when done.
*
* @param {Inputs} inputs - The new inputs to apply to the env variables before
* executing the runner.
* @param {runner} runner - The runner function that returns a promise.
* @returns {Promise<any>} - The result from the runner function.
*/
export async function applyInputs(
inputs: Inputs,
runner: () => Promise<void>,
): Promise<any> {
const originalInputs: Inputs = Object.fromEntries(
Object.keys(inputs).map(name => [
name,
EnvVariableNames.has(name) ? process.env[name] : core.getInput(name),
]),
);
exports.setInputs(inputs);
const result = await runner();
exports.setInputs(originalInputs);
return result;
}
export function maybeArrayToString(input: string[] | string) {
return Array.isArray(input) ? input.join('\n') : input;
}

@@ -1,19 +0,0 @@
{
"compilerOptions": {
"target": "es6",
"module": "commonjs",
"lib": ["esnext"],
"moduleResolution": "node",
"outDir": "./lib",
"rootDir": ".",
"strict": true,
"noImplicitAny": true,
"esModuleInterop": true,
"preserveSymlinks": true
},
"include": [
"./src",
"./node_modules/@actions"
],
"exclude": ["**/*.test.ts", "__tests__"]
}

@@ -0,0 +1 @@
Subproject commit ce177499ccf9fd2aded3b0426c97e5434c2e8a73

@@ -1,3 +0,0 @@
dist/
lib/
node_modules/

@@ -1,58 +0,0 @@
{
"plugins": ["jest", "@typescript-eslint"],
"extends": ["plugin:github/es6"],
"parser": "@typescript-eslint/parser",
"parserOptions": {
"ecmaVersion": 9,
"sourceType": "module",
"project": "./tsconfig.json"
},
"rules": {
"eslint-comments/no-use": "off",
"import/no-namespace": "off",
"no-unused-vars": "off",
"@typescript-eslint/no-unused-vars": "error",
"@typescript-eslint/explicit-member-accessibility": ["error", {"accessibility": "no-public"}],
"@typescript-eslint/no-require-imports": "error",
"@typescript-eslint/array-type": "error",
"@typescript-eslint/await-thenable": "error",
"@typescript-eslint/ban-ts-ignore": "error",
"camelcase": "off",
"@typescript-eslint/camelcase": "error",
"@typescript-eslint/class-name-casing": "error",
"@typescript-eslint/explicit-function-return-type": ["error", {"allowExpressions": true}],
"@typescript-eslint/func-call-spacing": ["error", "never"],
"@typescript-eslint/generic-type-naming": ["error", "^[A-Z][A-Za-z]*$"],
"@typescript-eslint/no-array-constructor": "error",
"@typescript-eslint/no-empty-interface": "error",
"@typescript-eslint/no-explicit-any": "error",
"@typescript-eslint/no-extraneous-class": "error",
"@typescript-eslint/no-for-in-array": "error",
"@typescript-eslint/no-inferrable-types": "error",
"@typescript-eslint/no-misused-new": "error",
"@typescript-eslint/no-namespace": "error",
"@typescript-eslint/no-non-null-assertion": "warn",
"@typescript-eslint/no-object-literal-type-assertion": "error",
"@typescript-eslint/no-unnecessary-qualifier": "error",
"@typescript-eslint/no-unnecessary-type-assertion": "error",
"@typescript-eslint/no-useless-constructor": "error",
"@typescript-eslint/no-var-requires": "error",
"@typescript-eslint/prefer-for-of": "warn",
"@typescript-eslint/prefer-function-type": "warn",
"@typescript-eslint/prefer-includes": "error",
"@typescript-eslint/prefer-interface": "error",
"@typescript-eslint/prefer-string-starts-ends-with": "error",
"@typescript-eslint/promise-function-async": "error",
"@typescript-eslint/require-array-sort-compare": "error",
"@typescript-eslint/restrict-plus-operands": "error",
"semi": "off",
"@typescript-eslint/semi": ["error", "never"],
"@typescript-eslint/type-annotation-spacing": "error",
"@typescript-eslint/unbound-method": "error"
},
"env": {
"node": true,
"es6": true,
"jest/globals": true
}
}

@@ -1,36 +0,0 @@
name: "Test the build"
on: # rebuild any PRs and main branch changes
pull_request:
push:
jobs:
pre-commit: # make sure pre-commits work properly
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.6
- name: Cache npm env
uses: actions/cache@v2
env:
cache-name: cache-npm-v1
with:
path: node_modules
key: ${{ env.cache-name }}-${{ github.job }}-${{ hashFiles('package.json','package-lock.json') }}
- name: "Install dependencies for npm"
run: |
npm ci
- name: Cache pre-commit env
uses: actions/cache@v2
env:
cache-name: cache-pre-commit-v1
with:
path: ~/.cache/pre-commit
key: ${{ env.cache-name }}-${{ github.job }}-${{ hashFiles('.pre-commit-config.yaml') }}
- name: "Install pre-commit"
run: |
pip install pre-commit
- name: "Run pre-commit"
run: |
pre-commit run --all-files --show-diff-on-failure --color always

@@ -1,101 +0,0 @@
# Dependency directory
node_modules
# Rest pulled from https://github.com/github/gitignore/blob/master/Node.gitignore
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
*.lcov
# nyc test coverage
.nyc_output
# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# Dependency directories
jspm_packages/
# TypeScript v1 declaration files
typings/
# TypeScript cache
*.tsbuildinfo
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variables file
.env
.env.test
# parcel-bundler cache (https://parceljs.org/)
.cache
# next.js build output
.next
# nuxt.js build output
.nuxt
# vuepress build output
.vuepress/dist
# Serverless directories
.serverless/
# FuseBox cache
.fusebox/
# DynamoDB Local files
.dynamodb/
# OS metadata
.DS_Store
Thumbs.db
# Ignore built ts files
__tests__/runner/*
lib/**/*
.idea

@@ -1,47 +0,0 @@
---
default_stages: [commit, push]
default_language_version:
# force all unspecified python hooks to run python3
python: python3
minimum_pre_commit_version: "1.20.0"
repos:
- repo: https://github.com/Lucas-C/pre-commit-hooks
rev: v1.1.7
hooks:
- id: forbid-tabs
exclude: ^dist/index.js$
- repo: https://github.com/thlorenz/doctoc.git
rev: v1.4.0
hooks:
- id: doctoc
name: Add TOC for md files
files: ^README\.md$|^CONTRIBUTING\.md$|^UPDATING.md$|^dev/README\.md$|^dev/BACKPORT_PACKAGES.md$
- repo: meta
hooks:
- id: check-hooks-apply
- repo: https://github.com/adrienverge/yamllint
rev: v1.23.0
hooks:
- id: yamllint
name: Check yaml files with yamllint
entry: yamllint -c yamllint-config.yml
types: [yaml]
exclude: ^.*init_git_sync\.template\.yaml$|^.*airflow\.template\.yaml$|^chart/templates/.*\.yaml$
- repo: local
hooks:
- id: build
name: Build package for distribution
language: system
entry: bash -c "npm run release"
files: .*\.ts$
require_serial: true
pass_filenames: false
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.1.0
hooks:
- id: check-merge-conflict
- id: detect-private-key
- id: end-of-file-fixer
exclude: ^dist/.*
- id: trailing-whitespace
exclude: ^dist/.*

@@ -1,3 +0,0 @@
dist/
lib/
node_modules/

@@ -1,11 +0,0 @@
{
"printWidth": 80,
"tabWidth": 2,
"useTabs": false,
"semi": false,
"singleQuote": true,
"trailingComma": "none",
"bracketSpacing": false,
"arrowParens": "avoid",
"parser": "typescript"
}

@@ -1,22 +0,0 @@
The MIT License (MIT)
Copyright (c) 2018 GitHub, Inc. and contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

@@ -1,731 +0,0 @@
<p><a href="https://github.com/potiuk/cancel-workflow-runs/actions">
<img alt="cancel-workflow-runs status"
src="https://github.com/potiuk/cancel-workflow-runs/workflows/Test%20the%20build/badge.svg"></a></p>
# Cancel Workflow Runs action
<!-- START doctoc generated TOC please keep comment here to allow auto update -->
<!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->
**Table of Contents** *generated with [DocToc](https://github.com/thlorenz/doctoc)*
- [Context and motivation](#context-and-motivation)
- [Usage](#usage)
- [Inputs and outputs](#inputs-and-outputs)
- [Inputs](#inputs)
- [Outputs](#outputs)
- [Most often used canceling example](#most-often-used-canceling-example)
- [More Examples](#more-examples)
- [Repositories that use Pull Requests from forks](#repositories-that-use-pull-requests-from-forks)
- [Cancel duplicate runs for the source workflow](#cancel-duplicate-runs-for-the-source-workflow)
- [Cancel duplicate jobs for triggered workflow](#cancel-duplicate-jobs-for-triggered-workflow)
- [Cancel the "self" source workflow run](#cancel-the-self-source-workflow-run)
- [Cancel the "self" triggered workflow run](#cancel-the-self-triggered-workflow-run)
- [Fail-fast source workflow runs with failed jobs](#fail-fast-source-workflow-runs-with-failed-jobs)
- [Fail-fast source workflow runs with failed jobs and corresponding triggered runs](#fail-fast-source-workflow-runs-with-failed-jobs-and-corresponding-triggered-runs)
- [Fail-fast for triggered workflow runs with failed jobs](#fail-fast-for-triggered-workflow-runs-with-failed-jobs)
- [Cancel another workflow run](#cancel-another-workflow-run)
- [Cancel all duplicates for named jobs](#cancel-all-duplicates-for-named-jobs)
- [Repositories that do not use Pull Requests from forks](#repositories-that-do-not-use-pull-requests-from-forks)
- [Cancel duplicate runs for "self" workflow](#cancel-duplicate-runs-for-self-workflow)
- [Cancel "self" workflow run](#cancel-self-workflow-run)
- [Fail-fast workflow runs with failed jobs](#fail-fast-workflow-runs-with-failed-jobs)
- [Cancel all runs with named jobs](#cancel-all-runs-with-named-jobs)
- [Development environment](#development-environment)
- [License](#license)
<!-- END doctoc generated TOC please keep comment here to allow auto update -->
# Context and motivation
Cancel Workflow Runs is an action that utilizes `workflow_run` triggers in order to perform various
run cancel operations. The idea is to save jobs and free them up for other queued runs. It is
particularly useful when your project's development flow has contributors submitting pull requests
from forks. Using the `workflow_run` trigger enables safe canceling of runs triggered by such pull requests.
If your CI takes a lot of time and uses a lot of jobs, the action might help your project
reduce job usage and decrease waiting time, as it detects and cancels runs that are still executing
but are already known to be superseded by newer runs.
The main purpose of this action is canceling duplicated runs for the same branch as the current run,
effectively limiting the resource consumption of the workflow to one run per branch. In short, the action
is useful if you want to limit job usage on GitHub Actions in the common pattern where
fixups/rebases are pushed in quick succession to the same branch (fast iterations on a Pull Request).
This is achieved by `duplicates` cancel mode. The `duplicates` mode only cancels "past" runs - it does
not take into account runs that were started after the "current" run.
Another use case is to cancel the `pull_request` corresponding to the `workflow_run` triggered run.
This can happen when the triggered `workflow_run` finds that it makes no sense to proceed with
the source run. This is achieved by `self` cancel mode.
There are also two supplementary cancel modes for the action. Those supplementary use cases allow for further
optimisations: failing fast when we detect that an important job failed, and canceling duplicates of
`workflow_run`-triggered events when they execute heavy jobs. This is achieved by the `failedJobs` and
`namedJobs` cancel modes.
Note that the `namedJobs` cancel mode exists solely to bypass current limitations
of GitHub Actions. Currently, there is no way to retrieve the connection between the triggering and triggered
workflow in the case of `workflow_run`, nor to retrieve the repository and branch of the triggering
workflow. The action uses a workaround: it requires designing workflows so that they pass the necessary
information via carefully crafted job names. The job names are accessible via the GitHub API, and they can be
resolved during execution of the workflow using information about the linked workflow available
at workflow runtime. Hopefully this information will soon be available in GitHub Actions, allowing
removal of the `namedJobs` cancel mode and simplifying the examples and workflows using the Action.
Another feature of the Action is notifying the PRs linked to the workflows. Normally, when workflows
get cancelled there is no information about why it happened, but this action can add an explanatory comment
to the PR if its run gets cancelled. This is controlled by the `notifyPRCancel` boolean input.
Also, for `workflow_run` events, GitHub does not yet provide an easy interface linking the original
Pull Request and the `workflow_run`. You can ask the Cancel Workflow Runs action to add an extra comment to the PR
with an explanatory message followed by a link to the `workflow_run` run.
You can take a look at the description provided in the
[Apache Airflow's CI](https://github.com/apache/airflow/blob/master/CI.rst) and
[the workflows](https://github.com/apache/airflow/blob/master/.github/workflows)
This action started from the simple cancel workflow developed by [n1hility](https://github.com/n1hility),
which implemented cancelling previous runs before GitHub Actions introduced the `workflow_run` type of event:
[Cancel](https://github.com/n1hility/cancel-previous-runs).
# Usage
If you want a comprehensive solution, you should use the action as follows:
1) In case your project does not use public forks, it's enough to have one action with the `duplicates`
cancel mode in the workflow. This is a rare thing in open-source projects (usually those projects
accept pull requests from forks) and more often applicable for private repositories.
2) If you use forks, you should create a separate "Cancelling" workflow triggered by `workflow_run`.
This `workflow_run`-triggered workflow should be responsible for all canceling actions. The examples below show
the possible ways the action can be utilized.
# Inputs and outputs
## Inputs
| Input | Required | Default | Comment |
|--------------------------|----------|--------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `token` | yes | | The github token passed from `${{ secrets.GITHUB_TOKEN }}` |
| `cancelMode` | no | `duplicates` | The mode to run cancel on. The available options are `duplicates`, `self`, `failedJobs`, `namedJobs` |
| `cancelFutureDuplicates` | no | true | In case of duplicate canceling, also cancel future duplicates, leaving only the "freshest" running job and not all the future jobs. By default it is set to true. |
| `sourceRunId` | no | | Useful only in `workflow_run` triggered events. It should be set to the id of the workflow triggering the run `${{ github.event.workflow_run.id }}` in case cancel operation should cancel the source workflow. |
| `notifyPRCancel` | no | | Boolean. If set to true, it notifies the cancelled PRs with a comment containing the reason why they are being cancelled. |
| `notifyPRCancelMessage` | no | | Optional cancel message to use instead of the default one when notifyPRCancel is true. It is only used in 'self' cancelling mode. |
| `notifyPRMessageStart` | no | | Only for workflow_run events triggered by the PRs. If not empty, it notifies those PRs with the message specified at the start of the workflow - adding the link to the triggered workflow_run. |
| `jobNameRegexps` | no | | An array of job name regexps. Only runs containing any job name matching any of the regexps in this array are considered for cancelling in the `failedJobs`, `namedJobs` and `allDuplicateNamedJobs` modes. |
| `skipEventTypes` | no | | Array of event names that should be skipped when cancelling (JSON-encoded string). This might be used in order to skip direct pushes or scheduled events. |
| `selfPreservation` | no | true | Do not cancel self. |
| `workflowFileName` | no | | Name of the workflow file. It can be used if you want to cancel a different workflow than yours. |
The job cancel modes work as follows:
| Cancel Mode | No `sourceRunId` specified | The `sourceRunId` set to `${{ github.event.workflow_run.id }}` |
|--------------------------|------------------------------------------------------------------------------|-------------------------------------------------------------------------------------|
| `duplicates` | Cancels duplicate runs from the same repo/branch as current run. | Cancels duplicate runs for the same repo/branch as the source run. |
| `allDuplicates` | Cancels duplicate runs from all running workflows. | Cancels duplicate runs from all running workflows. |
| `self` | Cancels self run. | Cancel the `sourceRunId` run. |
| `failedJobs` | Cancels all runs of own workflow that have matching jobs that failed. | Cancels all runs of the `sourceRunId` workflow that have matching jobs that failed. |
| `namedJobs` | Cancels all runs of own workflow that have matching jobs. | Cancels all runs of the `sourceRunId` workflow that have matching jobs. |
| `allDuplicatedNamedJobs` | Cancels all duplicate runs of own workflow that share matching jobs pattern. | Cancels all runs of the `sourceRunId` workflow that share matching job pattern. |
## Outputs
| Output | No `sourceRunId` specified | The `sourceRunId` set to `${{ github.event.workflow_run.id }}` |
|---------------------|---------------------------------------------------------|------------------------------------------------------------------------------------------------------|
| `sourceHeadRepo` | Current repository. Format: `owner/repo` | Repository of the run that triggered this `workflow_run`. Format: `owner/repo` |
| `sourceHeadBranch` | Current branch. | Branch of the run that triggered this `workflow_run`. Might be forked repo, if it is a pull_request. |
| `sourceHeadSha` | Current commit SHA: `{{ github.sha }}` | Commit sha of the run that triggered this `workflow_run`. |
| `mergeCommitSha` | Merge commit SHA if PR-triggered event. | Merge commit SHA if PR-triggered event. |
| `targetCommitSha` | Target commit SHA (merge if present, otherwise source). | Target commit SHA (merge if present, otherwise source). |
| `pullRequestNumber` | Number of the associated Pull Request (if PR triggered) | Number of the associated Pull Request (if PR triggered) |
| `sourceEvent` | Current event: ``${{ github.event }}`` | Event of the run that triggered this `workflow_run` |
| `cancelledRuns` | JSON-stringified array of cancelled run ids. | JSON-stringified array of cancelled run ids. |
## Most often used canceling example
The most common canceling scenario is that you want to cancel all duplicates appearing in your build queue.
As of version 4.1 of the Action, this can be realised by a single workflow that cancels all duplicates
for all running workflows. It is resistant to temporary queues, as it can also cancel future, queued
workflows that are duplicated by fresher (possibly also queued) runs, and this is recommended for everyone.
The example below uses a "workflow_run" type of event. The workflow_run event always has "write" access, which allows
it to cancel other workflows, even if they are coming from a pull request.
```yaml
name: Cancelling Duplicates
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']
jobs:
  cancel-duplicate-workflow-runs:
    name: "Cancel duplicate workflow runs"
    runs-on: ubuntu-latest
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Cancel duplicate workflow runs"
        with:
          cancelMode: allDuplicates
          token: ${{ secrets.GITHUB_TOKEN }}
          sourceRunId: ${{ github.event.workflow_run.id }}
```
# More Examples
Note that you can combine several of the steps below into the same job. The examples here show
one step per case for clarity.
## Repositories that use Pull Requests from forks
Note that if you implement a separate "Cancelling" workflow following the examples below, you do not
need to add the cancel action to any other workflow. All cancel actions should be configured in this
Cancelling workflow.
These examples show how you should configure your project with a separate `Cancelling` workflow that is
triggered via the `workflow_run` trigger.
In the examples below we use the following names:
* **triggered workflow** - the "Cancelling" workflow - separate workflow triggered by the `workflow_run`
event. Its main job is to manage cancelling of other workflows.
* **triggered run** - the run of the *triggered workflow*. It is triggered by another ("source") run. In the
examples below, this run is in "Cancelling" workflow. It always runs in the context of the main repository,
even if it is triggered by a Pull Request from a fork.
* **source workflow** - the "main" workflow - main workflow that performs CI actions. In the examples below,
this is a "CI" workflow.
* **source run** - the run of the *source workflow*. It is the run that triggers the *triggered run*,
and it runs most of the CI tasks. In the examples below those are the runs of "CI" workflow.
### Cancel duplicate runs for the source workflow
Cancel past, duplicate *source runs* of the *source workflow*. This workflow cancels
duplicated, past runs (for the same branch/repo as the *source run* that triggered
the *triggered run*). You have to configure it with the `sourceRunId` input set to
`${{ github.event.workflow_run.id }}` for it to work correctly.
In the example below, the `Cancelling` run cancels past, duplicate runs of the `CI` workflow with the same
branch/repo as the *source run* which triggered it; effectively, all that is left after the action runs is
the latest *source run* of "CI" from the same branch/repo.
This works for all kinds of triggering events (`push`, `pull_request`, `schedule`, ...). It works for
events triggered in the local repository, as well as those triggered from forks, so you do not need
to set up any extra actions to cancel internal Pushes/Pull Requests.
You can also choose to skip certain types of events (for example `push` and `schedule`) if you want your
jobs to run to full completion for those kinds of events.
```yaml
name: Cancelling
on:
workflow_run:
workflows: ['CI']
types: ['requested']
jobs:
cancel-duplicate-workflow-runs:
name: "Cancel duplicate workflow runs"
runs-on: ubuntu-latest
steps:
- uses: potiuk/cancel-workflow-runs@master
name: "Cancel duplicate workflow runs"
with:
cancelMode: duplicates
cancelFutureDuplicates: true
token: ${{ secrets.GITHUB_TOKEN }}
sourceRunId: ${{ github.event.workflow_run.id }}
notifyPRCancel: true
skipEventTypes: '["push", "schedule"]'
```
Note that the `duplicates` cancel mode cannot be used for the `workflow_run` type of event without the `sourceRunId` input.
The action will throw an error in this case, because it would not do what you would expect it to do.
All `workflow_run` events have the same branch and repository (they all run in the context of the
target branch and repository) no matter what the source of the event is, therefore cancelling duplicates
would cancel the runs originating from all branches, and this is not what you expect.
If you want to cancel duplicate runs of the *triggered workflow*, you need to use the
`namedJobs` cancel mode as described in the next chapter,
[Cancel duplicate jobs for triggered workflow](#cancel-duplicate-jobs-for-triggered-workflow), using outputs
from the duplicate cancelling for the *source workflow* run above.
Hopefully we will have an easier way of doing that in the future, once the GitHub Actions API allows
searching for source runs (it is not available at this moment).
### Cancel duplicate jobs for triggered workflow
Cancels all past runs from the *triggered workflow* if any of the job names match any of the regular
expressions. Note that it does not take into account the branch of the runs. It will cancel all runs
with matching job names no matter the branch/repo.
This example is much more complex. It shows an actual case of how you can design your jobs
using outputs from the duplicate-cancelling action and running a subsequent cancel with the `namedJobs` cancel
mode. Hopefully in the future a better solution will come from GitHub Actions and such a cancel flow will
be natively supported, but as of now (August 2020) such native support is not
available. The example below uses specially named jobs that contain the Branch, Repo and Run id of
the triggering run. The cancel operation finds the runs that have jobs whose names follow a
pattern containing the same repo and branch as the source run's branch and repo, in order to cancel duplicates.
In the case below, this workflow will first cancel the "CI" duplicate runs from the same branch, and then
it will cancel the runs of the Cancelling workflow which contain the same repo and branch in their
job names, effectively implementing duplicate-run cancelling for the Cancelling workflow itself.
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']
jobs:
  cancel-duplicate-ci-runs:
    name: "Cancel duplicate CI runs"
    runs-on: ubuntu-latest
    outputs:
      sourceHeadRepo: ${{ steps.cancel.outputs.sourceHeadRepo }}
      sourceHeadBranch: ${{ steps.cancel.outputs.sourceHeadBranch }}
      sourceHeadSha: ${{ steps.cancel.outputs.sourceHeadSha }}
      sourceEvent: ${{ steps.cancel.outputs.sourceEvent }}
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        id: cancel
        name: "Cancel duplicate CI runs"
        with:
          cancelMode: duplicates
          cancelFutureDuplicates: true
          token: ${{ secrets.GITHUB_TOKEN }}
          sourceRunId: ${{ github.event.workflow_run.id }}
          notifyPRCancel: true
          notifyPRMessageStart: |
            Note! The Docker Images for the build are prepared in a separate workflow,
            that you will not see in the list of checks.
            You can check the status of those images in:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Cancel duplicate Cancelling runs"
        with:
          cancelMode: namedJobs
          token: ${{ secrets.GITHUB_TOKEN }}
          notifyPRCancel: true
          jobNameRegexps: >
            ["Build info
            repo: ${{ steps.cancel.outputs.sourceHeadRepo }}
            branch: ${{ steps.cancel.outputs.sourceHeadBranch }}.*"]
  build-info:
    name: >
      Build info
      repo: ${{ needs.cancel-duplicate-ci-runs.outputs.sourceHeadRepo }}
      branch: ${{ needs.cancel-duplicate-ci-runs.outputs.sourceHeadBranch }}
    runs-on: ubuntu-latest
    needs: [cancel-duplicate-ci-runs]
    env:
      GITHUB_CONTEXT: ${{ toJson(github) }}
    steps:
      - name: >
          [${{ needs.cancel-duplicate-ci-runs.outputs.sourceEvent }}] will checkout
          Run id: ${{ github.run_id }}
          Source Run id: ${{ github.event.workflow_run.id }}
          Sha: ${{ needs.cancel-duplicate-ci-runs.outputs.sourceHeadSha }}
          Repo: ${{ needs.cancel-duplicate-ci-runs.outputs.sourceHeadRepo }}
          Branch: ${{ needs.cancel-duplicate-ci-runs.outputs.sourceHeadBranch }}
        run: |
          printenv
```
### Cancel the "self" source workflow run
This is useful in case you decide to cancel the *source run* that triggered the *triggered run*.
In the case below, the step cancels the `CI` workflow that triggered the `Cancelling` run.
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']
jobs:
  cancel-self-source-workflow-run:
    name: "Cancel the self CI workflow run"
    runs-on: ubuntu-latest
    steps:
      - name: "Cancel the self CI workflow run"
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: self
          notifyPRCancel: true
          notifyPRCancelMessage: Cancelled because image building failed.
          token: ${{ secrets.GITHUB_TOKEN }}
          sourceRunId: ${{ github.event.workflow_run.id }}
```
### Cancel the "self" triggered workflow run
This is useful in case you decide to cancel the *triggered run*. The difference vs. the previous case is that
you do not specify the `sourceRunId` input.
In the case below, the self workflow run will be cancelled.
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']
jobs:
  cancel-self-cancelling-run:
    name: "Cancel the self Cancelling workflow run"
    runs-on: ubuntu-latest
    steps:
      - name: "Cancel the self Cancelling workflow run"
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: self
          notifyPRCancel: true
          token: ${{ secrets.GITHUB_TOKEN }}
```
Note that if you want to cancel both the source workflow and the self workflow, you need to first cancel
the source workflow, and then cancel the self one, not the other way round :).
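A minimal sketch of doing both in that order within a single job (the job name here is only illustrative; the inputs are the same as in the two examples above):
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']
jobs:
  cancel-source-then-self:
    name: "Cancel the source run, then the self run"
    runs-on: ubuntu-latest
    steps:
      # First cancel the source "CI" run that triggered this run ...
      - name: "Cancel the self CI workflow run"
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: self
          notifyPRCancel: true
          token: ${{ secrets.GITHUB_TOKEN }}
          sourceRunId: ${{ github.event.workflow_run.id }}
      # ... and only then cancel this "Cancelling" run itself.
      - name: "Cancel the self Cancelling workflow run"
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: self
          notifyPRCancel: true
          token: ${{ secrets.GITHUB_TOKEN }}
```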
### Fail-fast source workflow runs with failed jobs
Cancels all runs from the *source workflow* if there are failed jobs matching any of the regular expressions.
Note that the action does not take into account the branch/repos of the runs. It will cancel all runs
with failed jobs no matter the branch/repo.
In the case below, if any of the `CI` workflow runs (even with different branch heads) have failed jobs
with names matching the `^Static checks$`, `^Build docs$` or `^Build prod image.*` regexps, they
will be cancelled.
```yaml
name: Cancelling
on:
workflow_run:
workflows: ['CI']
types: ['requested']
jobs:
fail-fast-triggered-workflow-named-jobs-runs:
name: "Fail fast CI runs"
runs-on: ubuntu-latest
steps:
- uses: potiuk/cancel-workflow-runs@master
name: "Fail fast CI runs"
with:
cancelMode: failedJobs
token: ${{ secrets.GITHUB_TOKEN }}
sourceRunId: ${{ github.event.workflow_run.id }}
notifyPRCancel: true
jobNameRegexps: '["^Static checks$", "^Build docs$", "^Build prod image.*"]'
```
Note that if you want to not only cancel the failed triggering workflow runs but also
cancel the corresponding "Cancelling" runs, you need to implement the solution
described in the next chapter.
### Fail-fast source workflow runs with failed jobs and corresponding triggered runs
Cancels all runs from the *source workflow* if there are failed jobs matching any of the regular expressions,
and also cancels the corresponding *triggered runs*.
Note that the action does not take into account the branch/repos of the runs. It will cancel all runs
with failed jobs no matter the branch/repo.
In the case below, if any of the `CI` workflow runs (even with different branch heads) have failed jobs
with names matching the `^Static checks$`, `^Build docs$` or `^Build prod image.*` regexps, they
will be cancelled, as well as the corresponding "Cancelling" workflow runs.
There is no native support yet in GitHub Actions to do this easily, so the example below shows how it can be
achieved using the `namedJobs` cancel mode and the outputs returned from the previous `Cancel Workflow Runs` step. Hopefully
this will be simplified once GitHub Actions introduces native support for it.
```yaml
name: Cancelling
on:
workflow_run:
workflows: ['CI']
types: ['requested']
jobs:
fail-fast-triggered-workflow-named-jobs-runs:
name: "Fail fast CI runs"
runs-on: ubuntu-latest
steps:
- uses: potiuk/cancel-workflow-runs@master
name: "Fail fast CI. Source run: ${{ github.event.workflow_run.id }}"
id: cancel-failed
with:
cancelMode: failedJobs
token: ${{ secrets.GITHUB_TOKEN }}
sourceRunId: ${{ github.event.workflow_run.id }}
notifyPRCancel: true
jobNameRegexps: '["^Static checks$", "^Build docs$", "^Build prod image.*"]'
- name: "Extract canceled failed runs"
id: extract-cancelled-failed-runs
if: steps.cancel-failed.outputs.cancelledRuns != '[]'
run: |
REGEXP="Fail fast CI. Source run: "
SEPARATOR=""
for run_id in $(echo "${{ steps.cancel-failed.outputs.cancelledRuns }}" | jq '.[]')
do
REGEXP="${REGEXP}${SEPARATOR}(${run_id})"
SEPARATOR="|"
done
echo "::set-output name=matching-regexp::${REGEXP}"
- name: "Cancel triggered 'Cancelling' runs for the cancelled failed runs"
if: steps.cancel-failed.outputs.cancelledRuns != '[]'
uses: potiuk/cancel-workflow-runs@master
with:
cancelMode: namedJobs
token: ${{ secrets.GITHUB_TOKEN }}
notifyPRCancel: true
          jobNameRegexps: ${{ steps.extract-cancelled-failed-runs.outputs.matching-regexp }}
```
### Fail-fast for triggered workflow runs with failed jobs
Cancels all runs from the *triggered workflow* if there are failed jobs matching any of the regular
expressions. Note that it does not take into account the branch/repos of the runs. It will cancel all runs
with failed jobs no matter the branch/repo.
In the case below, if any of the `Cancelling` workflow runs (even with different branch heads) have failed jobs
with names matching the `^Static checks$`, `^Build docs$` or `^Build prod image.*` regexps, they
will be cancelled.
```yaml
name: Cancelling
on:
workflow_run:
workflows: ['CI']
types: ['requested']
jobs:
fail-fast-triggered-workflow-named-jobs-runs:
name: "Fail fast Canceling runs"
runs-on: ubuntu-latest
steps:
- uses: potiuk/cancel-workflow-runs@master
name: "Fail fast Canceling runs"
with:
cancelMode: failedJobs
token: ${{ secrets.GITHUB_TOKEN }}
jobNameRegexps: '["^Static checks$", "^Build docs$", "^Build prod image.*"]'
```
### Cancel another workflow run
This is useful in case you want to cancel runs of a different workflow than the one the step runs in.
In the case below, the step cancels duplicate runs of the workflow defined in `other_workflow.yml`.
```yaml
name: Cancelling
on:
  workflow_run:
    workflows: ['CI']
    types: ['requested']
jobs:
  cancel-other-workflow-run:
    name: "Cancel another workflow's duplicate runs"
    runs-on: ubuntu-latest
    steps:
      - name: "Cancel another workflow's duplicate runs"
        uses: potiuk/cancel-workflow-runs@master
        with:
          cancelMode: duplicates
          cancelFutureDuplicates: true
          token: ${{ secrets.GITHUB_TOKEN }}
          workflowFileName: other_workflow.yml
```
### Cancel all duplicates for named jobs
Cancels all duplicated runs for all jobs that match the specified regular expressions.
Note that it does not take into account the branch of the runs. It will cancel all duplicates with
the same match for the jobs, no matter what branch originated them.
This is useful in case of job names generated dynamically.
In the case below, for all the runs that have generated job names containing the same Branch/Repo/Event
combination, the duplicates will get cancelled, leaving only the most recent run for each exact
match.
Note that the match must be identical. If two jobs have a different Branch,
they will both match the same pattern, but they are not considered duplicates of each other.
Also, this job has self-preservation turned off.
This means that in case the job determines that it is itself a duplicate, it will cancel itself. That's
why checking for duplicates of the self workflow should be the last step in the cancelling process.
```yaml
on:
push:
workflow_run:
workflows: ['CI']
types: ['requested']
jobs:
cancel-self-failed-runs:
name: "Cancel the self workflow run"
runs-on: ubuntu-latest
steps:
- uses: potiuk/cancel-workflow-runs@master
name: "Cancel past CI runs"
with:
cancelMode: allDuplicatedNamedJobs
token: ${{ secrets.GITHUB_TOKEN }}
jobNameRegexps: '["Branch: .* Repo: .* Event: .* "]'
selfPreservation: false
notifyPRCancel: true
```
## Repositories that do not use Pull Requests from forks
Note that the examples in this chapter only work if you do not have Pull Requests coming from forks (so for
example if you only work in a private repository). When the action runs within the usual `pull_request`
triggered runs coming from a fork, it does not have enough permissions to cancel running workflows.
If you want to cancel `pull_request` runs from forks, you need to use `workflow_run` triggered runs - see the
[Repositories that use Pull Requests from forks](#repositories-that-use-pull-requests-from-forks) chapter.
Note that in case you configure the separate `workflow_run` Cancelling workflow, there is no need to add
the action to the "source" workflows. The "Cancelling workflow" pattern handles well not only Pull Requests
from forks, but also all other cases - including cancelling Pull Requests for the same repository
and cancelling scheduled runs.
### Cancel duplicate runs for "self" workflow
Cancels past runs for the same workflow (with the same branch).
In the case below, any of the direct "push" events will cancel all past runs for the same branch as the
one being pushed. However, it can be configured for "pull_request" (in the same repository) or "schedule"
types of events as well. It will also notify the PR with a comment containing the reason why it has been
cancelled.
```yaml
name: CI
on: push
jobs:
  cancel-duplicate-workflow-runs:
    name: "Cancel duplicate workflow runs"
    runs-on: ubuntu-latest
    steps:
      - uses: potiuk/cancel-workflow-runs@master
        name: "Cancel duplicate workflow runs"
        with:
          cancelMode: duplicates
          cancelFutureDuplicates: true
          token: ${{ secrets.GITHUB_TOKEN }}
          notifyPRCancel: true
```
### Cancel "self" workflow run
This is useful in case you decide to cancel the "self" run.
In the case below, the workflow's own run will be cancelled immediately. It can be configured for "push",
"pull_request" (from the same repository) or "schedule" types of events.
```yaml
name: CI
on: push
jobs:
cancel-self-run:
name: "Cancel the self workflow run"
runs-on: ubuntu-latest
steps:
- name: "Cancel the self workflow run"
uses: potiuk/cancel-workflow-runs@master
with:
cancelMode: self
token: ${{ secrets.GITHUB_TOKEN }}
notifyPRCancel: true
```
### Fail-fast workflow runs with failed jobs
Cancels all runs (including the self run!) if they have failed jobs matching any of the regular expressions.
Note that it does not take into account the branch of the running jobs. It will cancel all runs with failed
jobs, no matter what branch originated them.
In the case below, if any of the own workflow runs have failed jobs matching any of the
`^Static checks$`, `^Build docs$` or `^Build prod image.*` regexps, this workflow will cancel those runs.
```yaml
name: CI
on:
push:
jobs:
cancel-self-failed-runs:
name: "Cancel failed runs"
runs-on: ubuntu-latest
steps:
- uses: potiuk/cancel-workflow-runs@master
name: "Cancel failed runs"
with:
cancelMode: failedJobs
token: ${{ secrets.GITHUB_TOKEN }}
jobNameRegexps: '["^Static checks$", "^Build docs$", "^Build prod image.*"]'
notifyPRCancel: true
```
### Cancel all runs with named jobs
Cancels all runs (including the self run!) if any of the job names match any of the regular
expressions. Note that it does not take into account the branch of the runs. It will cancel all runs with
matching jobs, no matter what branch originated them.
This is useful in case of job names generated dynamically.
In the case below, if any of the "self" workflow runs have job names matching any of the
`^Static checks$`, `^Build docs$` or `^Build prod image.*` regexps, this workflow will cancel those runs.
```yaml
on:
push:
workflow_run:
workflows: ['CI']
types: ['requested']
jobs:
cancel-self-failed-runs:
name: "Cancel the self workflow run"
runs-on: ubuntu-latest
steps:
- uses: potiuk/cancel-workflow-runs@master
name: "Cancel past CI runs"
with:
cancelMode: namedJobs
token: ${{ secrets.GITHUB_TOKEN }}
jobNameRegexps: '["^Static checks$", "^Build docs$", "^Build prod image.*"]'
notifyPRCancel: true
```
## Development environment
It is highly recommended to use [pre-commit](https://pre-commit.com). The pre-commit hooks
installed via the pre-commit tool automatically handle linting (including automated fixes) as well
as building and packaging the JavaScript index.js from the main.ts TypeScript code, so you do not have
to run those steps yourself.
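A minimal sketch of setting the hooks up locally (assuming Python and `pip` are available; the actual hooks are defined by the repository's pre-commit configuration):
```bash
# Install the pre-commit tool and register the repository's hooks
pip install pre-commit
pre-commit install

# Optionally, run all hooks against all files once
pre-commit run --all-files
```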
## License
[MIT License](LICENSE) covers the scripts and documentation in this project.

View File

@ -1,5 +0,0 @@
import * as process from 'process'
import * as cp from 'child_process'
import * as path from 'path'
test('no op', () => {})

View File

@ -1,74 +0,0 @@
name: 'Cancel Workflow Runs'
description: 'Cancel Workflow Runs - duplicates, failed, named in order to limit job usage,'
author: 'potiuk'
inputs:
token:
description: The GITHUB_TOKEN secret of the repository
required: true
sourceRunId:
description: |
      The run that triggered the action. It should be set to the
      `$\{\{ github.event.workflow_run.id \}\}` variable if used in a `workflow_run` triggered run and
      you want to act on the source workflow rather than the triggered run.
required: false
notifyPRCancel:
description: |
Boolean. If set to true, it notifies the cancelled PRs with a comment containing reason why
they are being cancelled.
required: false
notifyPRCancelMessage:
description: |
Optional cancel message to use instead of the default one when notifyPRCancel is true. Only
used in 'self' cancel mode.
required: false
notifyPRMessageStart:
description: |
Only for workflow_run events triggered by the PRs. If not empty, it notifies those PRs with the
message specified at the start of the workflow - adding the link to the triggered workflow_run.
required: false
cancelMode:
description: |
The mode of cancel. One of:
* `duplicates` - cancels duplicate runs from the same repo/branch as local run or
sourceId workflow. This is the default mode when cancelMode is not specified.
* `allDuplicates` - cancels duplicate runs from all workflows. It is more aggressive version of
duplicate canceling - as it cancels all duplicates. It is helpful in case
of long queues of builds - as it is enough that one of the workflows that
cancel duplicates is executed, it can effectively clean-up the queue in this
case for all the future, queued runs.
* `self` - cancels self run - either own run if sourceRunId is not set, or
the source run that triggered the `workflow_run'
* `failedJobs` - cancels all runs that failed in jobs matching one of the regexps
* `namedJobs` - cancels runs where names of some jobs match some of regexps
required: false
cancelFutureDuplicates:
description: |
In case of duplicate canceling, cancel also future duplicates leaving only the "freshest" running
job and not all the future jobs. By default it is set to true.
required: false
selfPreservation:
description: |
Do not cancel your own run. There are cases where selfPreservation should be disabled but it is
enabled by default. You can disable it by setting 'false' as value.
required: false
jobNameRegexps:
description: |
Array of job name regexps (JSON-encoded string). Used by `failedJobs` and `namedJobs` cancel modes
to match job names of workflow runs.
required: false
skipEventTypes:
description: |
Array of event names that should be skipped when cancelling (JSON-encoded string). This might be used
in order to skip direct pushes or scheduled events.
required: false
workflowFileName:
description: |
Name of the workflow file. It can be used if you want to cancel a different workflow than yours.
required: false
runs:
using: 'node12'
main: 'dist/index.js'
branding:
icon: 'play'
color: 'blue'

View File

@ -1,11 +0,0 @@
module.exports = {
clearMocks: true,
moduleFileExtensions: ['js', 'ts'],
testEnvironment: 'node',
testMatch: ['**/*.test.ts'],
testRunner: 'jest-circus/runner',
transform: {
'^.+\\.ts$': 'ts-jest'
},
verbose: true
}

File diff suppressed because it is too large Load Diff

View File

@ -1,48 +0,0 @@
{
"name": "typescript-action",
"version": "0.0.0",
"private": true,
"description": "TypeScript template action",
"main": "lib/main.js",
"scripts": {
"build": "tsc",
"format": "prettier --write **/*.ts",
"format-check": "prettier --check **/*.ts",
"lint": "eslint src/**/*.ts",
"pack": "ncc build",
"test": "jest",
"all": "npm run build && npm run format && npm run lint && npm run pack && npm test",
"release": "ncc build -o dist src/main.ts"
},
"repository": {
"type": "git",
"url": "git+https://github.com/actions/typescript-action.git"
},
"keywords": [
"actions",
"node",
"setup"
],
"author": "YourNameOrOrganization",
"license": "MIT",
"dependencies": {
"@actions/core": "^1.2.2",
"@actions/github": "^2.1.0",
"jstreemap": "^1.28.2"
},
"devDependencies": {
"@types/jest": "^24.0.23",
"@types/node": "^12.7.12",
"@typescript-eslint/parser": "^2.8.0",
"@zeit/ncc": "^0.20.5",
"eslint": "^5.16.0",
"eslint-plugin-github": "^2.0.0",
"eslint-plugin-jest": "^22.21.0",
"jest": "^26.2.2",
"jest-circus": "^26.2.2",
"js-yaml": "^3.13.1",
"prettier": "^1.19.1",
"ts-jest": "^26.1.4",
"typescript": "^3.6.4"
}
}

File diff suppressed because it is too large Load Diff

View File

@ -1,12 +0,0 @@
{
"compilerOptions": {
"target": "es6", /* Specify ECMAScript target version: 'ES3' (default), 'ES5', 'ES2015', 'ES2016', 'ES2017', 'ES2018', 'ES2019' or 'ESNEXT'. */
"module": "commonjs", /* Specify module code generation: 'none', 'commonjs', 'amd', 'system', 'umd', 'es2015', or 'ESNext'. */
"outDir": "./lib", /* Redirect output structure to the directory. */
"rootDir": "./src", /* Specify the root directory of input files. Use to control the output directory structure with --outDir. */
"strict": true, /* Enable all strict type-checking options. */
"noImplicitAny": true, /* Raise error on expressions and declarations with an implied 'any' type. */
"esModuleInterop": true /* Enables emit interoperability between CommonJS and ES Modules via creation of namespace objects for all imports. Implies 'allowSyntheticDefaultImports'. */
},
"exclude": ["node_modules", "**/*.test.ts"]
}

View File

@ -1,22 +0,0 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
---
extends: default
rules:
line-length:
max: 110

1
.github/actions/comment-on-pr vendored Submodule

@ -0,0 +1 @@
Subproject commit d1a1d5dd1eb1bb657a01f4d92dd5e4d5bb7857d3

View File

@ -1,13 +0,0 @@
FROM ruby:2.6.0
LABEL "com.github.actions.name"="Comment on PR"
LABEL "com.github.actions.description"="Leaves a comment on an open PR matching a push event."
LABEL "com.github.actions.repository"="https://github.com/unsplash/comment-on-pr"
LABEL "com.github.actions.maintainer"="Aaron Klaassen <aaron@unsplash.com>"
LABEL "com.github.actions.icon"="message-square"
LABEL "com.github.actions.color"="blue"
RUN gem install octokit
ADD entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]

View File

@ -1,7 +0,0 @@
Copyright 2019 Unsplash Inc.
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

View File

@ -1,27 +0,0 @@
# Comment on PR via GitHub Action
A GitHub action to comment on the relevant open PR when a commit is pushed.
## Usage
- Requires the `GITHUB_TOKEN` secret.
- Requires the comment's message in the `msg` parameter.
- Supports `push` and `pull_request` event types.
### Sample workflow
```
name: comment-on-pr example
on: pull_request
jobs:
example:
name: sample comment
runs-on: ubuntu-latest
steps:
- name: comment PR
uses: unsplash/comment-on-pr@master
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
msg: "Check out this message!"
```

View File

@ -1,15 +0,0 @@
name: Comment on PR
author: Aaron Klaassen <aaron@unsplash.com>
description: Leaves a comment on an open PR matching a push event.
branding:
icon: 'message-square'
color: 'blue'
inputs:
msg:
description: Comment's message
required: true
runs:
using: 'docker'
image: 'Dockerfile'
args:
- ${{ inputs.msg }}

View File

@ -1,47 +0,0 @@
#!/usr/bin/env ruby
require "json"
require "octokit"
json = File.read(ENV.fetch("GITHUB_EVENT_PATH"))
event = JSON.parse(json)
github = Octokit::Client.new(access_token: ENV["GITHUB_TOKEN"])
if !ENV["GITHUB_TOKEN"]
puts "Missing GITHUB_TOKEN"
exit(1)
end
if ARGV.empty?
puts "Missing message argument."
exit(1)
end
repo = event["repository"]["full_name"]
if ENV.fetch("GITHUB_EVENT_NAME") == "pull_request"
pr_number = event["number"]
else
pulls = github.pull_requests(repo, state: "open")
push_head = event["after"]
pr = pulls.find { |pr| pr["head"]["sha"] == push_head }
if !pr
puts "Couldn't find an open pull request for branch with head at #{push_head}."
exit(1)
end
pr_number = pr["number"]
end
message = ARGV.join(' ')
coms = github.issue_comments(repo, pr_number)
duplicate = coms.find { |c| c["user"]["login"] == "github-actions[bot]" && c["body"] == message }
if duplicate
puts "The PR already contains a database change notification"
exit(0)
end
github.add_comment(repo, pr_number, message)

@ -0,0 +1 @@
Subproject commit a6ca26c14274c33b15e6499323aac178af06ad4b

View File

@ -1,55 +0,0 @@
codecov:
notify:
require_ci_to_pass: yes
coverage:
notify:
slack:
default:
threshold: 1%
message: "Coverage {{changed}} for {{owner}}/{{repo}}" # customize the message
attachments: "sunburst, diff"
only_pulls: false
status:
src:
target: auto
threshold: 7%
base: auto
if_ci_failed: success
paths:
- src/
- '!src/tests/'
flags:
- src
test:
target: 60%
threshold: 10%
if_ci_failed: error
base: auto
paths:
- src/tests/
flags:
- test
precision: 2
round: down
range: "70...100"
flags:
src:
paths:
- src
- '!src/tests/'
test:
paths:
- src/tests/
parsers:
gcov:
branch_detection:
conditional: yes
loop: yes
method: no
macro: no
comment:
layout: "reach,diff,flags,tree"
behavior: default
require_changes: no

View File

@ -1,73 +0,0 @@
plugins:
- '@typescript-eslint'
- eslint-comments
- promise
- unicorn
extends:
- airbnb-typescript
- plugin:@typescript-eslint/recommended
- plugin:eslint-comments/recommended
- plugin:promise/recommended
- plugin:unicorn/recommended
- prettier
- prettier/@typescript-eslint
settings:
import/parsers:
'@typescript-eslint/parser':
- .ts
- .tsx
- .js
import/resolver:
typescript: {}
rules:
unicorn/filename-case: off
react/static-property-placement: 0
no-prototype-builtins: 0
import/prefer-default-export: 0
'@typescript-eslint/no-explicit-any': 0
import/no-default-export: error
no-use-before-define:
- error
-
functions: false
classes: true
variables: true
'@typescript-eslint/explicit-function-return-type':
- error
-
allowExpressions: true
allowTypedFunctionExpressions: true
'@typescript-eslint/no-use-before-define':
- error
-
functions: false
classes: true
variables: true
typedefs: true
'@typescript-eslint/indent':
- 2
- 2
unicorn/prevent-abbreviations: 0
import/no-extraneous-dependencies: [error, {devDependencies: ['**/*.ts']}]
parser: "@typescript-eslint/parser"
parserOptions:
project: ./tsconfig.json
ecmaVersion: 2019
sourceType: module
env:
node: true
browser: true
ignorePatterns:
- '*.js'
overrides:
- files: ['src/tests/**/*']
plugins:
- jest
extends:
- plugin:jest/recommended
rules:
global-require: 0
'@typescript-eslint/no-var-requires': 0
no-console: 0
'@typescript-eslint/no-unused-vars': 0
'@typescript-eslint/no-throw-literal': 0

View File

@ -1,3 +0,0 @@
# Contributing
The repository is released under the MIT license and follows a standard GitHub development process, using the GitHub issue tracker for issues and merging pull requests into master.

View File

@ -1,17 +0,0 @@
---
name: Bug report
about: Create a report to help us improve
---
**Describe the bug**
A clear and concise description of what the bug is.
**Workflow**
If applicable, provide a workflow file to help explain your problem.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Additional context**
Add any other context about the problem here.

View File

@ -1,17 +0,0 @@
---
name: Feature request
about: Suggest an idea for this project
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

View File

@ -1,14 +0,0 @@
### Type of Change
<!-- What type of change does your code introduce? -->
- [ ] New feature
- [ ] Bug fix
- [ ] Documentation
- [ ] Refactor
- [ ] Chore
### Resolves
- Fixes #[Add issue number here.]
### Describe Changes
<!-- Describe your changes in detail, if applicable. -->
_Describe what this Pull Request does_

View File

@ -1 +0,0 @@
.codecov.yml,.eslintignore,.eslintrc.json,.eslintrc.yml,.github/workflows/integration.yml,.github/workflows/pr.yml,.github/workflows/push.yml,.github/workflows/readme.md,.gitignore,.prettierignore,.prettierrc.json,.prettierrc.yml,.releaserc.yml,Makefile,README.md,__tests__/main.test.ts,action.yml,dist/index.js,jest.config.js,package.json,src/ChangedFiles.ts,src/File.ts,src/FilesHelper.ts,src/GithubHelper.ts,src/InputHelper.ts,src/UtilsHelper.ts,src/main.ts,src/tests/FilesHelper.test.ts,src/tests/GithubHelper.test.ts,src/tests/InputHelper.test.ts,src/tests/UtilsHelper.test.ts,src/tests/main.test.ts,src/tests/mocks/core/index.test.ts,src/tests/mocks/core/index.ts,src/tests/mocks/env/events/issue_comment_created.json,src/tests/mocks/env/events/issue_comment_edited.json,src/tests/mocks/env/events/pull_request_opened.json,src/tests/mocks/env/events/pull_request_reopened.json,src/tests/mocks/env/events/pull_request_synchronize.json,src/tests/mocks/env/events/push.json,src/tests/mocks/env/events/push_merge.json,src/tests/mocks/env/events/schedule.json,src/tests/mocks/env/index.test.ts,src/tests/mocks/env/index.ts,src/tests/mocks/fs/index.test.ts,src/tests/mocks/fs/index.ts,src/tests/mocks/github/index.test.ts,src/tests/mocks/github/index.ts,src/tests/mocks/octokit/endpoint/merge.test.ts,src/tests/mocks/octokit/endpoint/merge.ts,src/tests/mocks/octokit/index.test.ts,src/tests/mocks/octokit/index.ts,src/tests/mocks/octokit/paginate.test.ts,src/tests/mocks/octokit/paginate.ts,src/tests/mocks/octokit/payloads.ts,src/tests/mocks/octokit/pulls/listFiles.test.ts,src/tests/mocks/octokit/pulls/listFiles.ts,src/tests/mocks/octokit/repos/compareCommits.test.ts,src/tests/mocks/octokit/repos/compareCommits.ts,src/tests/payloads.ts,src/typings/ActionError/index.d.ts,src/typings/ChangedFiles/index.d.ts,src/typings/CoreMock/index.d.ts,src/typings/FsMock/index.d.ts,src/typings/GitHubFile/index.d.ts,src/typings/GitHubMock/index.d.ts,src/typings/Inferred/index.d.ts,src/typings/Inputs/index.d.ts,src/typings/OctokitMock/index.d.ts,src/typings/TestInput/index.d.ts,tsconfig.build.json,tsconfig.json,yarn.lock
1 .codecov.yml .eslintignore .eslintrc.json .eslintrc.yml .github/workflows/integration.yml .github/workflows/pr.yml .github/workflows/push.yml .github/workflows/readme.md .gitignore .prettierignore .prettierrc.json .prettierrc.yml .releaserc.yml Makefile README.md __tests__/main.test.ts action.yml dist/index.js jest.config.js package.json src/ChangedFiles.ts src/File.ts src/FilesHelper.ts src/GithubHelper.ts src/InputHelper.ts src/UtilsHelper.ts src/main.ts src/tests/FilesHelper.test.ts src/tests/GithubHelper.test.ts src/tests/InputHelper.test.ts src/tests/UtilsHelper.test.ts src/tests/main.test.ts src/tests/mocks/core/index.test.ts src/tests/mocks/core/index.ts src/tests/mocks/env/events/issue_comment_created.json src/tests/mocks/env/events/issue_comment_edited.json src/tests/mocks/env/events/pull_request_opened.json src/tests/mocks/env/events/pull_request_reopened.json src/tests/mocks/env/events/pull_request_synchronize.json src/tests/mocks/env/events/push.json src/tests/mocks/env/events/push_merge.json src/tests/mocks/env/events/schedule.json src/tests/mocks/env/index.test.ts src/tests/mocks/env/index.ts src/tests/mocks/fs/index.test.ts src/tests/mocks/fs/index.ts src/tests/mocks/github/index.test.ts src/tests/mocks/github/index.ts src/tests/mocks/octokit/endpoint/merge.test.ts src/tests/mocks/octokit/endpoint/merge.ts src/tests/mocks/octokit/index.test.ts src/tests/mocks/octokit/index.ts src/tests/mocks/octokit/paginate.test.ts src/tests/mocks/octokit/paginate.ts src/tests/mocks/octokit/payloads.ts src/tests/mocks/octokit/pulls/listFiles.test.ts src/tests/mocks/octokit/pulls/listFiles.ts src/tests/mocks/octokit/repos/compareCommits.test.ts src/tests/mocks/octokit/repos/compareCommits.ts src/tests/payloads.ts src/typings/ActionError/index.d.ts src/typings/ChangedFiles/index.d.ts src/typings/CoreMock/index.d.ts src/typings/FsMock/index.d.ts src/typings/GitHubFile/index.d.ts src/typings/GitHubMock/index.d.ts src/typings/Inferred/index.d.ts src/typings/Inputs/index.d.ts src/typings/OctokitMock/index.d.ts src/typings/TestInput/index.d.ts tsconfig.build.json tsconfig.json yarn.lock

View File

@ -1,75 +0,0 @@
[
".codecov.yml",
".eslintignore",
".eslintrc.json",
".eslintrc.yml",
".github/workflows/integration.yml",
".github/workflows/pr.yml",
".github/workflows/push.yml",
".github/workflows/readme.md",
".gitignore",
".prettierignore",
".prettierrc.json",
".prettierrc.yml",
".releaserc.yml",
"Makefile",
"README.md",
"__tests__/main.test.ts",
"action.yml",
"dist/index.js",
"jest.config.js",
"package.json",
"src/ChangedFiles.ts",
"src/File.ts",
"src/FilesHelper.ts",
"src/GithubHelper.ts",
"src/InputHelper.ts",
"src/UtilsHelper.ts",
"src/main.ts",
"src/tests/FilesHelper.test.ts",
"src/tests/GithubHelper.test.ts",
"src/tests/InputHelper.test.ts",
"src/tests/UtilsHelper.test.ts",
"src/tests/main.test.ts",
"src/tests/mocks/core/index.test.ts",
"src/tests/mocks/core/index.ts",
"src/tests/mocks/env/events/issue_comment_created.json",
"src/tests/mocks/env/events/issue_comment_edited.json",
"src/tests/mocks/env/events/pull_request_opened.json",
"src/tests/mocks/env/events/pull_request_reopened.json",
"src/tests/mocks/env/events/pull_request_synchronize.json",
"src/tests/mocks/env/events/push.json",
"src/tests/mocks/env/events/push_merge.json",
"src/tests/mocks/env/events/schedule.json",
"src/tests/mocks/env/index.test.ts",
"src/tests/mocks/env/index.ts",
"src/tests/mocks/fs/index.test.ts",
"src/tests/mocks/fs/index.ts",
"src/tests/mocks/github/index.test.ts",
"src/tests/mocks/github/index.ts",
"src/tests/mocks/octokit/endpoint/merge.test.ts",
"src/tests/mocks/octokit/endpoint/merge.ts",
"src/tests/mocks/octokit/index.test.ts",
"src/tests/mocks/octokit/index.ts",
"src/tests/mocks/octokit/paginate.test.ts",
"src/tests/mocks/octokit/paginate.ts",
"src/tests/mocks/octokit/payloads.ts",
"src/tests/mocks/octokit/pulls/listFiles.test.ts",
"src/tests/mocks/octokit/pulls/listFiles.ts",
"src/tests/mocks/octokit/repos/compareCommits.test.ts",
"src/tests/mocks/octokit/repos/compareCommits.ts",
"src/tests/payloads.ts",
"src/typings/ActionError/index.d.ts",
"src/typings/ChangedFiles/index.d.ts",
"src/typings/CoreMock/index.d.ts",
"src/typings/FsMock/index.d.ts",
"src/typings/GitHubFile/index.d.ts",
"src/typings/GitHubMock/index.d.ts",
"src/typings/Inferred/index.d.ts",
"src/typings/Inputs/index.d.ts",
"src/typings/OctokitMock/index.d.ts",
"src/typings/TestInput/index.d.ts",
"tsconfig.build.json",
"tsconfig.json",
"yarn.lock"
]

View File

@ -1 +0,0 @@
.codecov.yml .eslintignore .eslintrc.json .eslintrc.yml .github/workflows/integration.yml .github/workflows/pr.yml .github/workflows/push.yml .github/workflows/readme.md .gitignore .prettierignore .prettierrc.json .prettierrc.yml .releaserc.yml Makefile README.md __tests__/main.test.ts action.yml dist/index.js jest.config.js package.json src/ChangedFiles.ts src/File.ts src/FilesHelper.ts src/GithubHelper.ts src/InputHelper.ts src/UtilsHelper.ts src/main.ts src/tests/FilesHelper.test.ts src/tests/GithubHelper.test.ts src/tests/InputHelper.test.ts src/tests/UtilsHelper.test.ts src/tests/main.test.ts src/tests/mocks/core/index.test.ts src/tests/mocks/core/index.ts src/tests/mocks/env/events/issue_comment_created.json src/tests/mocks/env/events/issue_comment_edited.json src/tests/mocks/env/events/pull_request_opened.json src/tests/mocks/env/events/pull_request_reopened.json src/tests/mocks/env/events/pull_request_synchronize.json src/tests/mocks/env/events/push.json src/tests/mocks/env/events/push_merge.json src/tests/mocks/env/events/schedule.json src/tests/mocks/env/index.test.ts src/tests/mocks/env/index.ts src/tests/mocks/fs/index.test.ts src/tests/mocks/fs/index.ts src/tests/mocks/github/index.test.ts src/tests/mocks/github/index.ts src/tests/mocks/octokit/endpoint/merge.test.ts src/tests/mocks/octokit/endpoint/merge.ts src/tests/mocks/octokit/index.test.ts src/tests/mocks/octokit/index.ts src/tests/mocks/octokit/paginate.test.ts src/tests/mocks/octokit/paginate.ts src/tests/mocks/octokit/payloads.ts src/tests/mocks/octokit/pulls/listFiles.test.ts src/tests/mocks/octokit/pulls/listFiles.ts src/tests/mocks/octokit/repos/compareCommits.test.ts src/tests/mocks/octokit/repos/compareCommits.ts src/tests/payloads.ts src/typings/ActionError/index.d.ts src/typings/ChangedFiles/index.d.ts src/typings/CoreMock/index.d.ts src/typings/FsMock/index.d.ts src/typings/GitHubFile/index.d.ts src/typings/GitHubMock/index.d.ts src/typings/Inferred/index.d.ts src/typings/Inputs/index.d.ts src/typings/OctokitMock/index.d.ts src/typings/TestInput/index.d.ts tsconfig.build.json tsconfig.json yarn.lock

View File

@ -1 +0,0 @@
.codecov.yml,.eslintrc.yml,.prettierrc.yml,.releaserc.yml,src/FilesHelper.ts,src/GithubHelper.ts,src/InputHelper.ts,src/UtilsHelper.ts,src/tests/FilesHelper.test.ts,src/tests/GithubHelper.test.ts,src/tests/InputHelper.test.ts,src/tests/UtilsHelper.test.ts,src/tests/main.test.ts,src/tests/mocks/core/index.test.ts,src/tests/mocks/core/index.ts,src/tests/mocks/env/events/issue_comment_created.json,src/tests/mocks/env/events/issue_comment_edited.json,src/tests/mocks/env/events/pull_request_opened.json,src/tests/mocks/env/events/pull_request_reopened.json,src/tests/mocks/env/events/pull_request_synchronize.json,src/tests/mocks/env/events/push.json,src/tests/mocks/env/events/push_merge.json,src/tests/mocks/env/events/schedule.json,src/tests/mocks/env/index.test.ts,src/tests/mocks/env/index.ts,src/tests/mocks/fs/index.test.ts,src/tests/mocks/fs/index.ts,src/tests/mocks/github/index.test.ts,src/tests/mocks/github/index.ts,src/tests/mocks/octokit/endpoint/merge.test.ts,src/tests/mocks/octokit/endpoint/merge.ts,src/tests/mocks/octokit/index.test.ts,src/tests/mocks/octokit/index.ts,src/tests/mocks/octokit/paginate.test.ts,src/tests/mocks/octokit/paginate.ts,src/tests/mocks/octokit/payloads.ts,src/tests/mocks/octokit/pulls/listFiles.test.ts,src/tests/mocks/octokit/pulls/listFiles.ts,src/tests/mocks/octokit/repos/compareCommits.test.ts,src/tests/mocks/octokit/repos/compareCommits.ts,src/tests/payloads.ts,src/typings/ActionError/index.d.ts,src/typings/ChangedFiles/index.d.ts,src/typings/CoreMock/index.d.ts,src/typings/FsMock/index.d.ts,src/typings/GitHubFile/index.d.ts,src/typings/GitHubMock/index.d.ts,src/typings/Inferred/index.d.ts,src/typings/Inputs/index.d.ts,src/typings/OctokitMock/index.d.ts,src/typings/TestInput/index.d.ts,tsconfig.build.json
1 .codecov.yml .eslintrc.yml .prettierrc.yml .releaserc.yml src/FilesHelper.ts src/GithubHelper.ts src/InputHelper.ts src/UtilsHelper.ts src/tests/FilesHelper.test.ts src/tests/GithubHelper.test.ts src/tests/InputHelper.test.ts src/tests/UtilsHelper.test.ts src/tests/main.test.ts src/tests/mocks/core/index.test.ts src/tests/mocks/core/index.ts src/tests/mocks/env/events/issue_comment_created.json src/tests/mocks/env/events/issue_comment_edited.json src/tests/mocks/env/events/pull_request_opened.json src/tests/mocks/env/events/pull_request_reopened.json src/tests/mocks/env/events/pull_request_synchronize.json src/tests/mocks/env/events/push.json src/tests/mocks/env/events/push_merge.json src/tests/mocks/env/events/schedule.json src/tests/mocks/env/index.test.ts src/tests/mocks/env/index.ts src/tests/mocks/fs/index.test.ts src/tests/mocks/fs/index.ts src/tests/mocks/github/index.test.ts src/tests/mocks/github/index.ts src/tests/mocks/octokit/endpoint/merge.test.ts src/tests/mocks/octokit/endpoint/merge.ts src/tests/mocks/octokit/index.test.ts src/tests/mocks/octokit/index.ts src/tests/mocks/octokit/paginate.test.ts src/tests/mocks/octokit/paginate.ts src/tests/mocks/octokit/payloads.ts src/tests/mocks/octokit/pulls/listFiles.test.ts src/tests/mocks/octokit/pulls/listFiles.ts src/tests/mocks/octokit/repos/compareCommits.test.ts src/tests/mocks/octokit/repos/compareCommits.ts src/tests/payloads.ts src/typings/ActionError/index.d.ts src/typings/ChangedFiles/index.d.ts src/typings/CoreMock/index.d.ts src/typings/FsMock/index.d.ts src/typings/GitHubFile/index.d.ts src/typings/GitHubMock/index.d.ts src/typings/Inferred/index.d.ts src/typings/Inputs/index.d.ts src/typings/OctokitMock/index.d.ts src/typings/TestInput/index.d.ts tsconfig.build.json

View File

@ -1,54 +0,0 @@
[
".codecov.yml",
".eslintrc.yml",
".prettierrc.yml",
".releaserc.yml",
"src/FilesHelper.ts",
"src/GithubHelper.ts",
"src/InputHelper.ts",
"src/UtilsHelper.ts",
"src/tests/FilesHelper.test.ts",
"src/tests/GithubHelper.test.ts",
"src/tests/InputHelper.test.ts",
"src/tests/UtilsHelper.test.ts",
"src/tests/main.test.ts",
"src/tests/mocks/core/index.test.ts",
"src/tests/mocks/core/index.ts",
"src/tests/mocks/env/events/issue_comment_created.json",
"src/tests/mocks/env/events/issue_comment_edited.json",
"src/tests/mocks/env/events/pull_request_opened.json",
"src/tests/mocks/env/events/pull_request_reopened.json",
"src/tests/mocks/env/events/pull_request_synchronize.json",
"src/tests/mocks/env/events/push.json",
"src/tests/mocks/env/events/push_merge.json",
"src/tests/mocks/env/events/schedule.json",
"src/tests/mocks/env/index.test.ts",
"src/tests/mocks/env/index.ts",
"src/tests/mocks/fs/index.test.ts",
"src/tests/mocks/fs/index.ts",
"src/tests/mocks/github/index.test.ts",
"src/tests/mocks/github/index.ts",
"src/tests/mocks/octokit/endpoint/merge.test.ts",
"src/tests/mocks/octokit/endpoint/merge.ts",
"src/tests/mocks/octokit/index.test.ts",
"src/tests/mocks/octokit/index.ts",
"src/tests/mocks/octokit/paginate.test.ts",
"src/tests/mocks/octokit/paginate.ts",
"src/tests/mocks/octokit/payloads.ts",
"src/tests/mocks/octokit/pulls/listFiles.test.ts",
"src/tests/mocks/octokit/pulls/listFiles.ts",
"src/tests/mocks/octokit/repos/compareCommits.test.ts",
"src/tests/mocks/octokit/repos/compareCommits.ts",
"src/tests/payloads.ts",
"src/typings/ActionError/index.d.ts",
"src/typings/ChangedFiles/index.d.ts",
"src/typings/CoreMock/index.d.ts",
"src/typings/FsMock/index.d.ts",
"src/typings/GitHubFile/index.d.ts",
"src/typings/GitHubMock/index.d.ts",
"src/typings/Inferred/index.d.ts",
"src/typings/Inputs/index.d.ts",
"src/typings/OctokitMock/index.d.ts",
"src/typings/TestInput/index.d.ts",
"tsconfig.build.json"
]

View File

@ -1 +0,0 @@
.codecov.yml .eslintrc.yml .prettierrc.yml .releaserc.yml src/FilesHelper.ts src/GithubHelper.ts src/InputHelper.ts src/UtilsHelper.ts src/tests/FilesHelper.test.ts src/tests/GithubHelper.test.ts src/tests/InputHelper.test.ts src/tests/UtilsHelper.test.ts src/tests/main.test.ts src/tests/mocks/core/index.test.ts src/tests/mocks/core/index.ts src/tests/mocks/env/events/issue_comment_created.json src/tests/mocks/env/events/issue_comment_edited.json src/tests/mocks/env/events/pull_request_opened.json src/tests/mocks/env/events/pull_request_reopened.json src/tests/mocks/env/events/pull_request_synchronize.json src/tests/mocks/env/events/push.json src/tests/mocks/env/events/push_merge.json src/tests/mocks/env/events/schedule.json src/tests/mocks/env/index.test.ts src/tests/mocks/env/index.ts src/tests/mocks/fs/index.test.ts src/tests/mocks/fs/index.ts src/tests/mocks/github/index.test.ts src/tests/mocks/github/index.ts src/tests/mocks/octokit/endpoint/merge.test.ts src/tests/mocks/octokit/endpoint/merge.ts src/tests/mocks/octokit/index.test.ts src/tests/mocks/octokit/index.ts src/tests/mocks/octokit/paginate.test.ts src/tests/mocks/octokit/paginate.ts src/tests/mocks/octokit/payloads.ts src/tests/mocks/octokit/pulls/listFiles.test.ts src/tests/mocks/octokit/pulls/listFiles.ts src/tests/mocks/octokit/repos/compareCommits.test.ts src/tests/mocks/octokit/repos/compareCommits.ts src/tests/payloads.ts src/typings/ActionError/index.d.ts src/typings/ChangedFiles/index.d.ts src/typings/CoreMock/index.d.ts src/typings/FsMock/index.d.ts src/typings/GitHubFile/index.d.ts src/typings/GitHubMock/index.d.ts src/typings/Inferred/index.d.ts src/typings/Inputs/index.d.ts src/typings/OctokitMock/index.d.ts src/typings/TestInput/index.d.ts tsconfig.build.json

View File

@ -1 +0,0 @@
.github/workflows/integration.yml,.github/workflows/pr.yml,.github/workflows/push.yml,.github/workflows/readme.md,.gitignore,.prettierignore,README.md,action.yml,jest.config.js,package.json,src/main.ts,tsconfig.json,yarn.lock
1 .github/workflows/integration.yml .github/workflows/pr.yml .github/workflows/push.yml .github/workflows/readme.md .gitignore .prettierignore README.md action.yml jest.config.js package.json src/main.ts tsconfig.json yarn.lock

View File

@ -1,15 +0,0 @@
[
".github/workflows/integration.yml",
".github/workflows/pr.yml",
".github/workflows/push.yml",
".github/workflows/readme.md",
".gitignore",
".prettierignore",
"README.md",
"action.yml",
"jest.config.js",
"package.json",
"src/main.ts",
"tsconfig.json",
"yarn.lock"
]

View File

@ -1 +0,0 @@
.github/workflows/integration.yml .github/workflows/pr.yml .github/workflows/push.yml .github/workflows/readme.md .gitignore .prettierignore README.md action.yml jest.config.js package.json src/main.ts tsconfig.json yarn.lock

View File

@ -1 +0,0 @@
.eslintignore,.eslintrc.json,.prettierrc.json,Makefile,__tests__/main.test.ts,dist/index.js,src/ChangedFiles.ts,src/File.ts
1 .eslintignore .eslintrc.json .prettierrc.json Makefile __tests__/main.test.ts dist/index.js src/ChangedFiles.ts src/File.ts

View File

@ -1,10 +0,0 @@
[
".eslintignore",
".eslintrc.json",
".prettierrc.json",
"Makefile",
"__tests__/main.test.ts",
"dist/index.js",
"src/ChangedFiles.ts",
"src/File.ts"
]

View File

@ -1 +0,0 @@
.eslintignore .eslintrc.json .prettierrc.json Makefile __tests__/main.test.ts dist/index.js src/ChangedFiles.ts src/File.ts

View File

@ -1,157 +0,0 @@
json_output='["functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda.json", "functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json", "functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json", "functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json"]'
csv_output="functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda.json,functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json,functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json,functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json"
txt_hard_output='functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda.json_<br />&nbsp;&nbsp;_functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json_<br />&nbsp;&nbsp;_functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json_<br />&nbsp;&nbsp;_functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json'
txt_output='functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda.json functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json functions/twitch-sadako/webhookSubscribeLambda/test/webhookSubscribeLambda_post.json'
testOutput () {
# read from var
if [ "${2}" == "json" ]; then
local output_length=$(echo "${1}" | jq '. | length')
elif [ "${2}" == "," ]; then
local output_length=$(awk -F"${2}" '{print NF-1}' <<< $(echo "${1}"))
else
local output_length=$(awk -F"${2}" '{print NF-1}' <<< $(echo "${1}"))
fi
echo "$output_length"
}
testFile () {
# read from file
if [ "${2}" == "json" ]; then
local file_length=$(jq -r '. | length' ${file}.json)
elif [ "${2}" == "," ]; then
local file_length=$(cat ${file}.csv | awk -F"${2}" '{print NF-1}')
else
local file_length=$(cat ${file}.txt | awk -F"${2}" '{print NF-1}')
fi
echo "$file_length"
}
cleanTest () {
rm -rf $1.json $1.csv $1.txt
}
prepareTest () {
# if prefix is simple setup test var and file
if [ "$1" == "simple_" ]; then
# declare a var named simple_FILE
if [ "$dev" == "dev" ]; then
local file_prefix="events/"
else
local file_prefix=""
fi
declare -n file=${1}${2}
if [ "$3" == "json" ]; then
echo ${json_output} > "${file_prefix}${!file}.json"
elif [ "$3" == "," ]; then
echo ${csv_output} > "${file_prefix}${!file}.csv"
elif [ "$3" == "_<br />&nbsp;&nbsp;_" ]; then
echo ${txt_hard_output} > "${file_prefix}${!file}.txt"
else
echo ${txt_output} > "${file_prefix}${!file}.txt"
fi
if [ "$4" == "json" ]; then
file=$json_output
elif [ "$4" == "," ]; then
file=$csv_output
elif [ "$4" == "_<br />&nbsp;&nbsp;_" ]; then
file=$txt_hard_output
else
file=$txt_output
fi
else
declare -n file=${2}
if [ "$dev" == "dev" ]; then
if [ "$4" == "json" ]; then
file="$(cat events/${!file}.json)"
elif [ "$4" == "," ]; then
file="$(cat events/${!file}.csv)"
else
file="$(cat events/${!file}.txt)"
fi
fi
fi
echo "${file}"
}
testResults () {
if [ "$1" == 'simple_' ]; then
expected=3
if [ "$2" == 'json' ]; then
expected=$(($expected+1))
fi
# echo $result
if [ "$3" != "$expected" ]; then
echo -e "\t\033[1;91mTest failure $5/($1)$4:'$2' { EXPECTED:$expected RECEIVED:$3 } \033[0m"
exit 1;
fi
else
if [ "$4" == 'files' ]; then
expected=72
elif [ "$4" == 'files_added' ]; then
expected=51
elif [ "$4" == 'files_modified' ]; then
expected=12
elif [ "$4" == 'files_removed' ]; then
expected=7
fi
if [ "$2" == 'json' ]; then
expected=$(($expected+1))
fi
if [ "$3" != "$expected" ]; then
echo -e "\t\033[1;91mTest failure $5/($1)$4:'$2' { EXPECTED:$expected RECEIVED:$3 } \033[0m"
exit 1;
fi
fi
echo -e "\t\033[1;92mTest success $5/($1)$4:'$2' { $expected == $3 } \033[0m"
}
runTest () {
for prefix in "simple_" "real"; do \
file=${1}
if [ "$prefix" == 'simple_' ]; then
if [ "$dev" == "dev" ]; then
file=events/${prefix}${1}
else
file=${prefix}${1}
fi
elif [ "$prefix" != 'simple_' ] && [ "$dev" == "dev" ]; then
file=events/${1}
fi
input="$(prepareTest $prefix $1 "$2" "$3")"
local file_length=$(testFile $file "${2}")
local output_length=$(testOutput "${input}" "${3}")
testResults $prefix "${2}" "$file_length" "$1" "fileOutput"
testResults $prefix "${3}" "$output_length" "$1" "output"
if [ "$prefix" == 'simple_' ]; then
cleanTest $file
fi
done
}
test () {
if [ "$dev" == "dev" ]; then
echo -e "\t\033[1;91mDEV MODE\033[0m"
fi
if [ "$output" == "" ] || [ "$fileOutput" == "" ]; then
for fileOutput in "json" "," " "; do \
echo -e "\033[1;92mFILEOUTPUT:'$fileOutput'\033[0m"
for output in "json" "," " "; do \
echo -e "\033[1;92mOUTPUT:'$output'\033[0m"
for file in "files" "files_modified" "files_added" "files_removed"; do \
echo -e "\033[1;92mFILE:'$file'\033[0m"
runTest $file "$fileOutput" "$output"
done
done
done
else
for file in "files" "files_modified" "files_added" "files_removed"; do \
echo -e "\033[1;92mFILE:'$file' with FILEOUTPUT:'$fileOutput' OUTPUT:'$output'\033[0m"
runTest $file "$fileOutput" "$output"
done
fi
}
dev=$1
test

View File

@ -1,26 +0,0 @@
# Set to true to add reviewers to pull requests
addReviewers: true
# Set to true to add assignees to pull requests
addAssignees: author
# A list of reviewers to be added to pull requests (GitHub user name)
reviewers:
- trilom
# A number of reviewers added to the pull request
# Set 0 to add all the reviewers (default: 0)
numberOfReviewers: 0
# A list of assignees, overrides reviewers if set
# assignees:
# - assigneeA
# A number of assignees to add to the pull request
# Set to 0 to add all of the assignees.
# Uses numberOfReviewers if unset.
# numberOfAssignees: 2
# A list of keywords to be skipped the process that add reviewers if pull requests include it
# skipKeywords:
# - wip

View File

@ -1,42 +0,0 @@
- name: pretty
description: Code that has been linted with eslint and prettier
color: 76edd1
- name: builds
description: Code that builds with yarn and tsc
color: 39bc44
- name: tested-unit
description: Code that has passed unit tests with jest
color: 9520bc
- name: tested-integration
description: Code that has passed integration tests with jest
color: fc5aee
- name: "doesnt read directions"
description: "Doesn't know how to read directions, please PR to develop"
color: d876e3
- name: automated pr
description: This was created by create-pull-request action
color: b9ff9b
- name: released
description: This has been released to NPM, Github Packages, and Actions Marketplace
color: ededed
- name: bug
description: Something isn't working
color: d73a4a
- name: duplicate
description: This issue or pull request already exists
color: cfd3d7
- name: enhancement
description: New feature or request
color: a2eeef
- name: "automated merge"
description: This was merged automatically
color: c2e0c6
- name: "hold merge"
description: This merge will be blocked from automerging until this label is removed
color: b60205
- name: lintdogged
description: Code that has been looked at by reviewdog
color: 5F422D
- name: failure
description: Something bad happened...
color: d93f0b

View File

@ -1,97 +0,0 @@
# this will tag PRs that are ready for release and automerge them
name: Automerge Pull Requests
on:
# issue_comment:
# types: [created]
pull_request:
branches: [master, next, alpha, beta]
types: [labeled, closed]
jobs:
automerge:
name: automerge pr
runs-on: ubuntu-latest
env:
GITHUB_TOKEN: ${{ secrets.TRILOM_BOT_TOKEN }}
pr_number: ${{ format('{0}{1}', github.event.pull_request.number, github.event.issue.number) }}
# if event type is non fork PR or comment on PR from trilom with '/release'
if: >-
(
github.event_name == 'pull_request'
&& github.event.pull_request.head.repo.full_name == github.repository
&& contains(github.event.pull_request.labels.*.name, 'pretty')
&& contains(github.event.pull_request.labels.*.name, 'builds')
&& contains(github.event.pull_request.labels.*.name, 'tested-unit')
&& contains(github.event.pull_request.labels.*.name, 'tested-integration')
&& contains(github.event.pull_request.labels.*.name, 'lintdogged')
&& ! contains(github.event.pull_request.labels.*.name, 'automated merge')
&& ! contains(github.event.pull_request.labels.*.name, 'hold merge')
) || (
github.event_name == 'issue_comment'
&& github.event.issue.pull_request != ''
&& contains(github.event.comment.body, '/release')
&& github.actor == 'trilom'
&& contains(github.event.issue.labels.*.name, 'pretty')
&& contains(github.event.issue.labels.*.name, 'builds')
&& contains(github.event.issue.labels.*.name, 'tested-unit')
&& contains(github.event.issue.labels.*.name, 'tested-integration')
&& contains(github.event.issue.labels.*.name, 'lintdogged')
&& ! contains(github.event.issue.labels.*.name, 'automated merge')
&& ! contains(github.event.issue.labels.*.name, 'hold merge'))
steps:
- name: if pretty, builds, tested merge automerge pr
# if pretty, builds, and tested labels then merge
uses: pascalgn/automerge-action@v0.7.5
env:
GITHUB_TOKEN: ${{ env.GITHUB_TOKEN }}
MERGE_METHOD: merge
# this breaks the /release on issue_comment portion unless I get the head.ref from github-script
MERGE_COMMIT_MESSAGE: 'Auto merge from ${{ github.event.pull_request.head.ref }} PR#{pullRequest.number}: {pullRequest.title}'
UPDATE_METHOD: merge
MERGE_LABELS: 'pretty,builds,tested-unit,tested-integration,lintdogged'
UPDATE_LABELS: ''
# if failure, get payload of PR and notify
- name: if failure, get pr payload
uses: actions/github-script@0.8.0
id: pr_json
if: failure()
with:
github-token: ${{env.GITHUB_TOKEN}}
script: |
const result = await github.pulls.get({
owner: '${{ github.repository }}'.split('/')[0],
repo: '${{ github.repository }}'.split('/')[1],
pull_number: ${{ env.pr_number }}
})
return result.data;
- name: if failure, set pr payload outputs
if: failure()
id: pr
run: |
echo '${{ steps.pr_json.outputs.result }}' > pr.json
echo "::set-output name=user::$( jq -r '.user.login' pr.json )"
echo "::set-output name=head::$( jq -r '.head.repo.full_name' pr.json )"
echo "::set-output name=head_url::$( jq -r '.head.repo.html_url' pr.json )"
echo "::set-output name=base::$( jq -r '.base.repo.full_name' pr.json )"
echo "::set-output name=base_url::$( jq -r '.base.repo.html_url' pr.json )"
- name: if failure, notify
uses: peter-evans/create-or-update-comment@v1
if: failure()
with:
token: ${{ env.GITHUB_TOKEN }}
issue-number: ${{ env.pr_number }}
body: |
@${{ steps.pr.outputs.user }}, @trilom - it appears that there was an issue with the merge.
Head Repo/Branch: **[${{ steps.pr.outputs.head }}]**(${{ steps.pr.outputs.head_url }}) merge into **[${{ steps.pr.outputs.base }}]**(${{ steps.pr.outputs.base_url }})
## Event JSON
```json
${{ toJSON(steps.pr_json.outputs.result)}}
```
- uses: actions/github-script@0.6.0
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
github.issues.addLabels({owner: context.repo.owner, repo: context.repo.repo, issue_number: context.issue.number,
labels: ['automated merge']
})

View File

@ -1,19 +0,0 @@
name: Close Pull Request
on:
pull_request:
branches-ignore: [master]
types: [opened, reopened]
jobs:
# close any fork PRs not opened by trilom to anything but master
close_pr:
name: close non master PRs from fork
runs-on: ubuntu-latest
if: (github.actor != 'trilom' || github.actor != 'trilom-bot') && github.event.pull_request.head.repo.full_name != github.repository
steps:
- uses: superbrothers/close-pull-request@v2
with:
comment: Please merge your code into master, this will trigger the desired merge workflow.
- uses: actions/github@v1.0.0
if: success()
with:
args: label "doesnt read directions"

View File

@ -1,106 +0,0 @@
name: Integration Tests
on:
issue_comment:
types:
- created
schedule:
- cron: '0 0 * * *'
pull_request:
branches: [master]
push:
branches: [master]
jobs:
# always_job:
# name: Always run job
# runs-on: ubuntu-latest
# steps:
# - name: dump env
# env:
# GITHUB_CONTEXT: ${{ toJson(github) }}
# JOB_CONTEXT: ${{ toJson(job) }}
# STEPS_CONTEXT: ${{ toJson(steps) }}
# RUNNER_CONTEXT: ${{ toJson(runner) }}
# STRATEGY_CONTEXT: ${{ toJson(strategy) }}
# MATRIX_CONTEXT: ${{ toJson(matrix) }}
# run: |
# echo "GITHUB_EVENT_PATH\n$GITHUB_EVENT_PATH"
# echo "GITHUB_CONTEXT\n$GITHUB_CONTEXT"
# echo "JOB_CONTEXT\n$JOB_CONTEXT"
# echo "STEPS_CONTEXT\n$STEPS_CONTEXT"
# echo "RUNNER_CONTEXT\n$RUNNER_CONTEXT"
# echo "STRATEGY_CONTEXT\n$STRATEGY_CONTEXT"
# echo "MATRIX_CONTEXT\n$MATRIX_CONTEXT"
integration:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
event_type: ['push', 'pull_request']
output: ['json', ',', ' ', '_<br />&nbsp;&nbsp;_']
fileOutput: ['json', ',', ' ', '_<br />&nbsp;&nbsp;_']
if: >-
( startsWith(github.head_ref, '1.')
|| startsWith(github.head_ref, '2.'))
||
contains(github.event.head_commit.message, 'Release merge from')
||
github.event_name == 'schedule'
|| (
github.event_name == 'issue_comment'
&& github.event.issue.number != ''
&& contains(github.event.comment.body, '/integration')
&& github.actor == 'trilom')
steps:
# get pr number if exists
- id: pr
if: github.event_name == 'issue_comment'
run: |
pr=$(echo "${{github.event.comment.body}}" | sed 's|.*/integration||') &&
echo "::set-output name=pr::${pr}"
env:
comment: ${{ toJson(github) }}
# use pr number from integration command
- uses: actions/checkout@v2
if: github.event_name == 'issue_comment' && steps.pr.outputs.pr != ''
with:
ref: ${{format('refs/pull/{0}/head', steps.pr.outputs.pr )}}
# use the issue number if pr is blank
- uses: actions/checkout@v2
if: github.event_name == 'issue_comment' && steps.pr.outputs.pr == '' && github.event.issue.pull_request != ''
with:
ref: ${{format('refs/pull/{0}/head', github.event.issue.number )}}
- name: fail if no PR number and issue comment
if: github.event_name == 'issue_comment' && steps.pr.outputs.pr == '' && github.event.issue.pull_request == ''
run: |
echo "Please provide a PR number to use like /integration13 for PR# 13."
exit 1
- uses: actions/checkout@v2
if: github.event_name != 'issue_comment'
- run: yarn build-package
- uses: ./
id: file_changes_build_pr
if: matrix.event_type == 'pull_request'
with:
prNumber: 83
output: ${{ matrix.output }}
fileOutput: ${{ matrix.fileOutput }}
- uses: ./
id: file_changes_build_push
if: matrix.event_type == 'push'
with:
pushBefore: 6ac7697cd1c4f23a08d4d4edbe7dab06b34c58a2
pushAfter: 4ee1a1a2515f4ac1b90a56aaeb060b97f20c8968
output: ${{ matrix.output }}
fileOutput: ${{ matrix.fileOutput }}
- run: |
mv $HOME/files* .
chmod +x test.sh && ./test.sh
working-directory: .github/actions/integration
if: success()
env:
fileOutput: ${{ matrix.fileOutput }}
output: ${{ matrix.output }}
files: ${{ format('{0}{1}', steps.file_changes_build_pr.outputs.files, steps.file_changes_build_push.outputs.files ) }}
files_modified: ${{ format('{0}{1}', steps.file_changes_build_pr.outputs.files_modified, steps.file_changes_build_push.outputs.files_modified ) }}
files_added: ${{ format('{0}{1}', steps.file_changes_build_pr.outputs.files_added, steps.file_changes_build_push.outputs.files_added ) }}
files_removed: ${{ format('{0}{1}', steps.file_changes_build_pr.outputs.files_removed, steps.file_changes_build_push.outputs.files_removed ) }}

View File

@ -1,13 +0,0 @@
name: Sync labels
on:
push:
branches: [master]
paths: [.github/labels.yml]
jobs:
make-labels:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: micnncim/action-label-syncer@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

View File

@ -1,192 +0,0 @@
name: Contribution Workflow
env:
isFork: ${{ github.event.pull_request.head.repo.full_name != github.repository }}
on: [pull_request]
jobs:
add-reviews:
runs-on: ubuntu-latest
steps:
- uses: kentaro-m/auto-assign-action@v1.1.0
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
# make sure we can build
build:
name: yarn install && tsc
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- run: yarn build
- uses: actions/github-script@0.6.0
if: failure() && contains(env.isFork, 'false')
with:
github-token: ${{ secrets.TRILOM_BOT_TOKEN }}
script: |
if ('${{ contains(github.event.pull_request.labels.*.name, 'builds') }}' == 'true') {
github.issues.removeLabel({owner: context.repo.owner, repo: context.repo.repo, issue_number: context.issue.number,
name: 'builds'
})
}
- uses: actions/github-script@0.6.0
if: contains(env.isFork, 'false')
with:
github-token: ${{ secrets.TRILOM_BOT_TOKEN }}
script: |
github.issues.addLabels({owner: context.repo.owner, repo: context.repo.repo, issue_number: context.issue.number,
labels: ['builds']
})
# unit test with jest
test-unit:
name: jest unit tests
runs-on: ubuntu-latest
needs: build
steps:
- uses: actions/checkout@v2
- run: yarn build
- run: yarn test-coverage
- run: bash <(curl -s https://codecov.io/bash)
if: contains(env.isFork, 'false')
- uses: actions/github-script@0.6.0
if: failure() && contains(env.isFork, 'false')
with:
github-token: ${{ secrets.TRILOM_BOT_TOKEN }}
script: |
if ('${{ contains(github.event.pull_request.labels.*.name, 'tested-unit') }}' == 'true') {
github.issues.removeLabel({owner: context.repo.owner, repo: context.repo.repo, issue_number: context.issue.number,
name: 'tested-unit'
})
}
- uses: actions/github-script@0.6.0
if: contains(env.isFork, 'false')
with:
github-token: ${{ secrets.TRILOM_BOT_TOKEN }}
script: |
github.issues.addLabels({owner: context.repo.owner, repo: context.repo.repo, issue_number: context.issue.number,
labels: ['tested-unit']
})
# integration test with jest
test-integration:
name: jest integration tests
runs-on: ubuntu-latest
needs: test-unit
steps:
- uses: actions/checkout@v2
- run: yarn build
- run: yarn test-integration
env:
GITHUB_TOKEN: ${{ secrets.TRILOM_BOT_TOKEN }}
- uses: actions/github-script@0.6.0
if: failure() && contains(env.isFork, 'false')
with:
github-token: ${{ secrets.TRILOM_BOT_TOKEN }}
script: |
if ('${{ contains(github.event.pull_request.labels.*.name, 'tested-integration') }}' == 'true') {
github.issues.removeLabel({owner: context.repo.owner, repo: context.repo.repo, issue_number: context.issue.number,
name: 'tested-integration'
})
}
- uses: actions/github-script@0.6.0
if: contains(env.isFork, 'false')
with:
github-token: ${{ secrets.TRILOM_BOT_TOKEN }}
script: |
github.issues.addLabels({owner: context.repo.owner, repo: context.repo.repo, issue_number: context.issue.number,
labels: ['tested-integration']
})
# lint code in github check
lintdog-fork:
name: eslintdog (reviewdog)
runs-on: ubuntu-latest
needs: build
if: github.event.pull_request.head.repo.full_name != github.repository
steps:
- uses: actions/checkout@v2
- run: yarn build
- name: Lint and report
uses: reviewdog/action-eslint@v1
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
reporter: github-check
eslint_flags: '--ext .ts ./'
# lint code and comment back if possible
lintdog:
name: eslintdog (reviewdog)
runs-on: ubuntu-latest
needs: build
if: github.event.pull_request.head.repo.full_name == github.repository
steps:
- uses: actions/checkout@v2
- run: yarn build
- name: Lint and report
uses: reviewdog/action-eslint@v1
with:
github_token: ${{ secrets.TRILOM_BOT_TOKEN }}
reporter: github-pr-review
eslint_flags: '--ext .ts ./'
- uses: actions/github-script@0.6.0
if: failure()
with:
github-token: ${{ secrets.TRILOM_BOT_TOKEN }}
script: |
if ('${{ contains(github.event.pull_request.labels.*.name, 'lintdogged') }}' == 'true') {
github.issues.removeLabel({owner: context.repo.owner, repo: context.repo.repo, issue_number: context.issue.number,
name: 'lintdogged'
})
}
- uses: actions/github-script@0.6.0
with:
github-token: ${{ secrets.TRILOM_BOT_TOKEN }}
script: |
github.issues.addLabels({owner: context.repo.owner, repo: context.repo.repo, issue_number: context.issue.number,
labels: ['lintdogged']
})
# format and push code back if not forked branch
format_check_push:
name: prettier
runs-on: ubuntu-latest
needs: [lintdog, lintdog-fork]
if: always()
env:
GITHUB_TOKEN: ${{ secrets.TRILOM_BOT_TOKEN }}
steps:
- uses: actions/checkout@v2 # checkout for forks
if: contains(env.isFork, 'true')
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- uses: actions/checkout@v2 # checkout for PR
if: contains(env.isFork, 'false')
with:
token: ${{ secrets.TRILOM_BOT_TOKEN }}
repository: ${{ github.event.pull_request.head.repo.full_name }}
ref: ${{ github.event.pull_request.head.ref }}
- run: yarn build
- run: yarn format-check
- name: yarn format and push code if check failed
if: failure() && github.actor != 'trilom-bot' && contains(env.isFork, 'false')
env:
GITHUB_TOKEN: ${{ secrets.TRILOM_BOT_TOKEN }}
run: |
yarn format
sudo yarn clean
git config --local user.email "trilom-bot@trailmix.me"
git config --local user.name "trilom-bot"
git add -A
git diff-index --quiet HEAD || git commit -m "Adding format changes 🤖" -a
git push https://x-access-token:${GITHUB_TOKEN}@github.com/${{ github.repository }}.git HEAD:refs/heads/${{ github.head_ref }} && exit 0
- uses: actions/github-script@0.6.0
if: failure() && contains(env.isFork, 'false')
with:
github-token: ${{ secrets.TRILOM_BOT_TOKEN }}
script: |
if ('${{ contains(github.event.pull_request.labels.*.name, 'pretty') }}' == 'true') {
github.issues.removeLabel({owner: context.repo.owner, repo: context.repo.repo, issue_number: context.issue.number,
name: 'pretty'
})
}
- uses: actions/github-script@0.6.0
if: contains(env.isFork, 'false')
with:
github-token: ${{ secrets.TRILOM_BOT_TOKEN }}
script: |
github.issues.addLabels({owner: context.repo.owner, repo: context.repo.repo, issue_number: context.issue.number,
labels: ['pretty']
})

View File

@ -1,259 +0,0 @@
# if a push comes in then this will test it for a release and create a release PR if needed
name: Push to release branches
on:
push:
branches: [master, next, alpha, beta]
tags-ignore: ['**']
jobs:
# semantic release an auto-merged branch to github package repo, npm, github actions
release:
name: Release to NPM, Github, Github Actions Marketplace
runs-on: ubuntu-latest
needs: [build, test-unit, test-integration, lintdog]
if: >
github.actor != 'semantic-release-bot'
&& ( (contains(github.event.head_commit.message, 'trilom/1.')
|| contains(github.event.head_commit.message, 'trilom/2.'))
&& ! contains(github.event.head_commit.message, 'chore(release):'))
env:
GITHUB_TOKEN: ${{ secrets.TRILOM_BOT_TOKEN }}
SEMANTIC_RELEASE_PACKAGE: '@${{ github.repository }}'
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
steps:
- uses: actions/checkout@v2
with:
fetch-depth: 0
- name: semantic-release
uses: cycjimmy/semantic-release-action@v2
id: semantic
with:
semantic_version: 15.14.0
extra_plugins: |
@semantic-release/git@7.0.18
@semantic-release/changelog
semantic-release-slack-bot
dry_run: false
- name: echo release outputs
if: steps.semantic.outputs.new_release_published == 'true'
run: |
echo ${{ steps.semantic.outputs.new_release_version }}
echo ${{ steps.semantic.outputs.new_release_major_version }}
echo ${{ steps.semantic.outputs.new_release_minor_version }}
echo ${{ steps.semantic.outputs.new_release_patch_version }}
- name: Setup Node.js with GitHub Package Registry
if: steps.semantic.outputs.new_release_published == 'true'
uses: actions/setup-node@v1
with:
node-version: 12
registry-url: 'https://npm.pkg.github.com'
scope: trilom
- name: Publish To GitHub Package Registry
if: steps.semantic.outputs.new_release_published == 'true'
run: npm publish
env:
NODE_AUTH_TOKEN: ${{ env.GITHUB_TOKEN }}
# create PR from release branch to master to prepare for release
check-release:
name: Check if we need to release
runs-on: ubuntu-latest
needs: [build, test-unit, test-integration, lintdog]
if: >
github.actor != 'semantic-release-bot'
&& ! contains(github.event.head_commit.message, 'trilom/1.')
&& ! contains(github.event.head_commit.message, 'trilom/2.')
&& ! contains(github.event.head_commit.message, 'chore(release):')
env:
GITHUB_TOKEN: ${{ secrets.TRILOM_BOT_TOKEN }}
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
steps:
- uses: actions/checkout@v2
with:
fetch-depth: 0
- name: commit format changes and create authors file
run: |
git config --local user.email "trilom-bot@trailmix.me"
git config --local user.name "trilom-bot"
yarn build
yarn format
git add -A
git diff-index --quiet HEAD || git commit -m "Adding format changes 🤖" -a
yarn build-release
git add -A
git diff-index --quiet HEAD || git commit -m "Adding release changes ⚙️" -a
git log --format='%aN <%aE>%n%cN <%cE>' | sort -u > AUTHORS
sed -i '/trilom-bot/d' AUTHORS
sed -i '/semantic-release-bot/d' AUTHORS
sed -i '/carnoco@gmail.com/d' AUTHORS
sed -i '/GitHub <noreply@github.com>/d' AUTHORS
sed -i '/dependabot/d' AUTHORS
echo -e "\r\n$(date)" >> AUTHORS
git add -A
git diff-index --quiet HEAD || git commit -m "Updating AUTHORS 📓" -a
# see if we need to release, if so create a automerge release PR and notify the original creator
- name: semantic-release
uses: cycjimmy/semantic-release-action@v2
id: semantic
env:
SEMANTIC_RELEASE_PACKAGE: '@${{ github.repository }}'
SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
with:
semantic_version: 15.14.0
extra_plugins: |
@semantic-release/git@7.0.18
@semantic-release/changelog
semantic-release-slack-bot
dry_run: true
- name: echo release outputs
if: steps.semantic.outputs.new_release_published == 'true'
run: |
echo ${{ steps.semantic.outputs.new_release_version }}
echo ${{ steps.semantic.outputs.new_release_major_version }}
echo ${{ steps.semantic.outputs.new_release_minor_version }}
echo ${{ steps.semantic.outputs.new_release_patch_version }}
- name: push potential formatting changes since there is no release
if: steps.semantic.outputs.new_release_published == 'false'
run: |
git config --local user.email "trilom-bot@trailmix.me"
git config --local user.name "trilom-bot"
git push -f https://x-access-token:${GITHUB_TOKEN}@github.com/${GITHUB_REPOSITORY}.git HEAD:${{ github.ref }}
- name: get changed files and format for automerge PR body
id: file_changes
uses: trilom/file-changes-action@master
if: steps.semantic.outputs.new_release_published == 'true'
with:
githubToken: ${{ env.GITHUB_TOKEN }}
output: '_<br />&nbsp;&nbsp;_'
- name: get original PR number
uses: actions/github-script@0.6.0
id: pr
if: steps.semantic.outputs.new_release_published == 'true'
with:
github-token: ${{env.GITHUB_TOKEN}}
result-encoding: string
script: |
const result = await github.repos.listPullRequestsAssociatedWithCommit({
owner: context.payload.repository.owner.name,
repo: context.payload.repository.name,
commit_sha: context.payload.head_commit.id
})
if (result.data.length >= 1) {
return result.data[0].number
} else return 87
- name: get original PR user
uses: actions/github-script@0.6.0
id: login
if: steps.pr.outputs.result != 0 && steps.semantic.outputs.new_release_published == 'true'
with:
github-token: ${{env.GITHUB_TOKEN}}
result-encoding: string
script: |
const result = await github.pulls.get({
owner: context.payload.repository.owner.name,
repo: context.payload.repository.name,
pull_number: ${{ steps.pr.outputs.result }}
})
if (result.data.user === true && result.data.user.login === true) {
return result.data.user.login
} else return 'trilom';
- name: create release PR
id: create-pr
uses: peter-evans/create-pull-request@v2
if: steps.semantic.outputs.new_release_published == 'true'
with:
token: ${{ env.GITHUB_TOKEN }}
commit-message: '${{ github.event.head_commit.message }}'
committer: trilom-bot <trilom-bot@trailmix.me>
author: ${{ steps.login.outputs.result }} <${{ steps.login.outputs.result }}@users.noreply.github.com>
title: 'releases/v${{ steps.semantic.outputs.new_release_version }} [@${{ steps.login.outputs.result }}] - ${{ github.event.head_commit.message }}'
body: |
# @${{ steps.login.outputs.result }} would like to merge into file-changes-action
[**compare link**](${{ github.event.compare }})
## Commits
```json
${{ toJSON(github.event.commits)}}
```
## Files
&nbsp;&nbsp;_${{ steps.file_changes.outputs.files}}_
## Files modified
&nbsp;&nbsp;_${{ steps.file_changes.outputs.files_modified}}_
## Files added
&nbsp;&nbsp;_${{ steps.file_changes.outputs.files_added}}_
## Files removed
&nbsp;&nbsp;_${{ steps.file_changes.outputs.files_removed}}_
labels: 'automated pr'
assignees: '${{ steps.login.outputs.result }},trilom'
reviewers: trilom
branch: '${{ steps.semantic.outputs.new_release_version }}'
- name: notify initial commiter of change
uses: peter-evans/create-or-update-comment@v1
if: steps.login.outputs.result != '' && steps.semantic.outputs.new_release_published == 'true'
with:
token: ${{ env.GITHUB_TOKEN }}
issue-number: ${{ steps.pr.outputs.result }}
body: |
Hey @${{ steps.login.outputs.result }},
This merge has triggered a release, hurray!
[Here you can follow the release.](https://github.com/trilom/file-changes-action/pull/${{ steps.create-pr.outputs.pr_number }})
Please use this new **Pull Request** if there are any issues to communicate further.
Thanks!
# - uses: actions/github-script@0.6.0
# if: steps.create-pr.outputs.pr_number != '' && steps.semantic.outputs.new_release_published == 'true'
# with:
# github-token: ${{ secrets.TRILOM_BOT_TOKEN }}
# script: |
# github.issues.addLabels({owner: context.repo.owner, repo: context.repo.repo, issue_number: ${{ steps.create-pr.outputs.pr_number }},
# labels: ['${{ steps.semantic.outputs.new_release_version }}']
# })
# make sure we can build
build:
name: yarn install && tsc
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- run: yarn build
# unit test with jest
test-unit:
name: jest unit tests
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- run: yarn build
- run: yarn test-coverage
- run: bash <(curl -s https://codecov.io/bash)
# integration test with jest
test-integration:
name: jest integration tests
needs: test-unit
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- run: yarn build
- run: yarn test-integration
env:
GITHUB_TOKEN: ${{ secrets.TRILOM_BOT_TOKEN }}
# lint code and comment back if possible
lintdog:
name: eslintdog (reviewdog)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Lint and report push
uses: reviewdog/action-eslint@v1
with:
github_token: ${{ secrets.TRILOM_BOT_TOKEN }}
reporter: github-check
eslint_flags: 'src/**/*.ts'

View File

@ -1,86 +0,0 @@
# Workflow Information
- [Workflow Information](#workflow-information)
- [Overview](#overview)
- [Schedule](#schedule)
- [Issue Comment](#issue-comment)
- [Pull Request](#pull-request)
- [Push](#push)
# Overview
1. Make a **Pull Request** from your forked branch (forked from _master_) with changes to _trilom/file-changes-action/master_ branch.
2. Once merged into master, this will lint the code and provide output in the checks, update the AUTHORS file, and package _dist/_. If there is a release, this will create a **Pull Request** from the _v\*\*_ branch to _master_ and a comment will be made on the original **Pull Request** notifying contributors. If there is not a release, the changes will be **push**ed to _master_.
3. In the **Pull Request**, linting and testing will be performed again. If the _linted_, _tested-unit_, _tested-integration_, _builds_, and _lintdogged_ labels exist and _hold merge_ does not, the release will be merged into _master_.
4. Once that release **Pull Request** is merged, [semantic-release](https://github.com/semantic-release/semantic-release) will run to create the Github Release, release notes, and changelog, notify Slack, package and deploy to NPM and the Github Package Repo, label the release, and notify any issues of its deployment.
5. After user semantic-release-bot commits the release commit, this code will be pushed to the release branch.
## Schedule
- Everyday at 5:00 AM GMT:
- Run integration tests via Github Actions.
## Issue Comment
- When any `created` **Issue Comment** type runs on a **Pull Request** from trilom with the body of `/integrationNUMBER` (**integration.yml**):
- Run integration tests via Github Actions against that PR.
- **NOT IMPLEMENTED** When any `created` **Issue Comment** type runs on a **Pull Request** from trilom with the body of `/release` (**automerge.yml**):
- If the _linted_, _tested-unit_, _tested-integration_, _builds_, and _lintdogged_ labels exist and neither _hold merge_ nor _automated merge_ does:
- Merge the PR and add the _automated merge_ label.
- If the merge fails, comment the failure details on the original PR.
## Pull Request
- When any `opened`, `reopened`, or `synchronize` **Pull Request** type runs to the _master_ branch from a _v\*\*_ branch:
- Run integration tests via Github Actions.
- When any `opened` or `reopened` **Pull Request** type runs on any branch other than _master_, from anyone other than trilom or trilom-bot, from a forked branch (**close_pr.yml**):
- Close the **Pull Request** and put the dunce cap on.
- When any `labeled` or `closed` **Pull Request** type runs on _master_, _next_, _alpha_, or _beta_ (**automerge.yml**):
- If the _linted_, _tested-unit_, _tested-integration_, _builds_, and _lintdogged_ labels exist and neither _hold merge_ nor _automated merge_ does:
- Merge the PR and add the _automated merge_ label.
- If the merge fails, comment the failure details on the original PR.
- When any `opened`, `reopened`, or `synchronize` **Pull Request** type runs (**pr.yml**):
- Assign it to trilom (**add-reviews**)
- Build code with `yarn build` which runs `yarn` and `tsc` (**build**)
- Label with builds if passing and on inner workspace
- Test code with `yarn test-coverage` which runs `jest --coverage` (**test-unit**)
- Label with tested-unit if passing and on inner workspace
- Test code with `yarn test-integration` which runs `jest -c jest.config.integration.js` (**test-integration**)
- Label with tested-integration if passing and on inner workspace
- Test code with eslint reviewdog and report back if inner workspace (**lintdog**)
- Label with lintdogged if passing and on inner workspace
- Check format of code with `yarn format-check` which runs `prettier --check` (**format_check_push**)
- If:
- Fork then pull **Pull Request** github.ref with GITHUB_TOKEN
- Inner **Pull Request** then pull HEAD repo ref
- Build code with `yarn build` which runs `yarn` and `tsc`
- If format-check succeeds and on inner workspace
- Label with pretty
- If format-check fails and on inner workspace and actor is not trilom-bot
- Run `yarn format` which runs `prettier --write`
- Clean build files with `yarn clean`
- Commit the format changes as trilom-bot to **Pull Request** head
## Push
- When any **Push** type runs to _master_:
- Run integration tests via Github Actions.
- When any **Push** type runs to _master_, _next_, _alpha_, or _beta_(**push.yml**):
- Build code with `yarn build` which runs `yarn` and `tsc` (**build**)
- Test code with `yarn test-coverage` which runs `jest` (**test**)
- Test code with eslint reviewdog and report back with github checks(**lintdog**)
- When any **Push** type runs to _master_, _next_, _alpha_, or _beta_ with a head_commit message **NOT** containing 'trilom/v1.' or 'trilom/v2.':
- Build with `yarn build-release` which runs `yarn && tsc --build tsconfig.build.json && ncc build --minify` to build the **dist/\*\*.js** files, update **AUTHORS**, format **src/\*\*.ts** files and commit.
- Test [semantic-release](https://github.com/semantic-release/semantic-release) if a release is ready then create a **Pull Request**
- Echo release outputs
- Get changed files with [file-changes-action](https://github.com/trilom/file-changes-action) and build a message to post to new **Pull Request**
- Comment on the original **Pull Request** with the new details of the release.
- If no release, then **Push** changes directly back to master.
- When any **Push** type runs to _master_, _next_, _alpha_, or _beta_ with a head_commit message containing 'trilom/v1.' or 'trilom/v2.':
- Run [semantic-release](https://github.com/semantic-release/semantic-release) to prepare the Github Release, release notes, and changelog, notify Slack, package and deploy to NPM and the Github Package Repo, label the release, and notify any issues of its deployment.
- When any **Push** type runs to _master_, _next_, _alpha_, or _beta_ from semantic-release-bot with a head_commit message containing 'chore(release):':
- Get the **Pull Request** number from the **Push** and push the semantic-release changes to the tagged release branch.

View File

@ -1,100 +0,0 @@
lib
**/outputs/**
# Dependency directory
node_modules
# Rest pulled from https://github.com/github/gitignore/blob/master/Node.gitignore
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
*.lcov
# nyc test coverage
.nyc_output
# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# Dependency directories
jspm_packages/
# # TypeScript v1 declaration files
# typings/
# TypeScript cache
*.tsbuildinfo
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variables file
.env
.env.test
# parcel-bundler cache (https://parceljs.org/)
.cache
# next.js build output
.next
# nuxt.js build output
.nuxt
# vuepress build output
.vuepress/dist
# Serverless directories
.serverless/
# FuseBox cache
.fusebox/
# DynamoDB Local files
.dynamodb/
# OS metadata
.DS_Store
Thumbs.db
# Ignore built ts files
__tests__/runner/*

View File

@ -1,2 +0,0 @@
/dist
/node_modules

View File

@ -1,9 +0,0 @@
printWidth: 80
tabWidth: 2
useTabs: false
semi: false
singleQuote: true
trailingComma: none
bracketSpacing: false
arrowParens: avoid
parser: typescript

View File

@ -1,62 +0,0 @@
branches:
- "+([1-9])?(.{+([1-9]),x}).x"
- master
- next
- name: alpha
prerelease: true
- name: beta
prerelease: true
dryRun: false
plugins:
- "@semantic-release/commit-analyzer"
- "@semantic-release/release-notes-generator"
-
- semantic-release-slack-bot
- notifyOnSuccess: true
notifyOnFail: true
markdownReleaseNotes: true
onSuccessTemplate:
text: "$package_name version v$npm_package_version!\n\n$release_notes"
-
- "@semantic-release/changelog"
- changelogFile: CHANGELOG.md
- "@semantic-release/npm"
-
- "@semantic-release/github"
- assets:
- path: "dist/**/*.js"
label: Packaged JS Code
successComment: >
# 🎉🦍🎉 This <% issue.pull_request ? 'pull request' : 'issue' %>
has been resolved in version *<%= nextRelease.version %>* at
*trilom/file-changes-action@<%= nextRelease.gitTag %>*
` - name: File Changes Action
uses: trilom/file-changes-action@<%= nextRelease.gitTag %>`
## Release<%= _.size(releases) > 1 ? 's' : '' %>
<% _.forEach(releases, function(release) { %>
\n\t\t**Release Name:** [<%= release.name %>](<%= release.url %>)<% }); %>
\n\n ## Commits<% _.forEach(commits, function(commit) { %>
\n\t\t@<%= commit.author %> - [_<%= commit.message %>_](https://github.com/trilom/file-changes-action/commit/<%= commit.hash %>)<% }); %>"
**Release Name:** [<%= release.name %>](<%= release.url %>)<% }); %>
## Commits
<% _.forEach(commits, function(commit) { %>
@<%= commit.author.name %> - [_<%= commit.message.toString().replace(/[()\\\/_\*]/g, '') %>_](https://github.com/trilom/file-changes-action/commit/<%= commit.hash %>)<% }); %>
labels: [failure]
releasedLabels: ["releases/${nextRelease.gitTag}"]
assignees: trilom
-
- "@semantic-release/git"
- assets: [CHANGELOG.md, package.json, yarn.lock]
message: >
chore(release): 🎉🦍🎉 Release <%= nextRelease.version %> -
<%= new Date().toLocaleDateString('en-US', {year: 'numeric', month: 'short', day: 'numeric', hour: 'numeric', minute: 'numeric' }) %> [skip ci]
`- name: File Changes Action
uses: trilom/file-changes-action@<%= nextRelease.gitTag %>`
<%= nextRelease.notes %>

View File

@ -1,5 +0,0 @@
Bryan Killian <bryan.v.killian@gmail.com>
Daniel Orner <daniel.orner@wishabi.com>
Sergey Kluchkovsky <kaineer@gmail.com>
Thu May 21 14:42:36 UTC 2020

View File

@ -1,45 +0,0 @@
## [1.2.4](https://github.com/trilom/file-changes-action/compare/v1.2.3...v1.2.4) (2020-05-21)
### Bug Fixes
* **change in api:** github api had a change, this should trigger release 1.2.4. this change here quiets a quacker during the intergration test ([99f8f91](https://github.com/trilom/file-changes-action/commit/99f8f91f3ed1430713973d8f1e2848b5acc58163))
## [1.2.3](https://github.com/trilom/file-changes-action/compare/v1.2.2...v1.2.3) (2020-03-25)
### Bug Fixes
* **test release:** testing a release ([dfca448](https://github.com/trilom/file-changes-action/commit/dfca448d9d1f04825a549ba0bc7d6b097df295a2))
## [1.2.2](https://github.com/trilom/file-changes-action/compare/v1.2.1...v1.2.2) (2020-03-25)
### Bug Fixes
* **issue_comment:** this needs to return PR info not commit info if before and after explicitly set, else PR ([eee976b](https://github.com/trilom/file-changes-action/commit/eee976b2219f243f83583baab84fa89376006acc))
* **naming:** renamed "deleted" to "removed". sorry if this is breaking for you. ([800537f](https://github.com/trilom/file-changes-action/commit/800537f435a66454c64fc2b42cfd82ca33cc093d))
* **pull_request_synchronize events:** issue with PR Synchronize events, it would return commit files instead of PR files, this is adjusted to return ALL PR files with PR synchronize event ([fb7bcc7](https://github.com/trilom/file-changes-action/commit/fb7bcc76581402f20aa64da82cd1174e313ec02c))
* **space issue:** this should resolve the issue with using a blank space. the assumption here is that 'json' is default, if you use ' ' it will be '' which is the app default, not the action default of 'json' ([0e4184f](https://github.com/trilom/file-changes-action/commit/0e4184fe04f87323c60b71c1ccf2af95f9f35b8c)), closes [#81](https://github.com/trilom/file-changes-action/issues/81)
## [1.2.1](https://github.com/trilom/file-changes-action/compare/v1.2.0...v1.2.1) (2020-03-19)
### Bug Fixes
* **everything:** very proud to say this is 100% coverage according to default jest of all src code (including test) ([dd31d02](https://github.com/trilom/file-changes-action/commit/dd31d0220fdc9e6eb3469b3443239359d7da33d4))
* **redesign:** a lot of things changed here in the project ([32903fd](https://github.com/trilom/file-changes-action/commit/32903fd341ce6a5471e3df73393784cb43adb397))
# [1.2.0](https://github.com/trilom/file-changes-action/compare/v1.1.0...v1.2.0) (2020-03-02)
### Features
* **action:** githubToken is optional (uses action token), added githubRepo, prNumber, and pushBefore & After ([b24e2c3](https://github.com/trilom/file-changes-action/commit/b24e2c30c72710da8704a02f9d05141a19f27f83))
# [1.2.0](https://github.com/trilom/file-changes-action/compare/v1.1.0...v1.2.0) (2020-03-02)
### Features
* **action:** githubToken is optional (uses action token), added githubRepo, prNumber, and pushBefore & After ([b24e2c3](https://github.com/trilom/file-changes-action/commit/b24e2c30c72710da8704a02f9d05141a19f27f83))

View File

@ -1,22 +0,0 @@
The MIT License (MIT)
Copyright (c) 2018 GitHub, Inc. and contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

View File

@ -1,201 +0,0 @@
# file-changes-action
[![codecov](https://codecov.io/gh/trilom/file-changes-action/branch/master/graph/badge.svg)](https://codecov.io/gh/trilom/file-changes-action)
[![code style: prettier](https://img.shields.io/badge/code_style-prettier-ff69b4.svg?style=flat-square)](https://github.com/prettier/prettier)
![Integration Tests](https://github.com/trilom/file-changes-action/workflows/Integration%20Tests/badge.svg)
# Like my work? Hire me!
> Please reach out if you need something built!
This action will take the information from the Push/Pull Request and output some variables and write files that will let you know what was changed, removed, or added.
## Inputs
### githubRepo
_Optional_ - `string` - the github repository you want to compare changes from, defaults to the github.repository.
### githubToken
_Optional_ - `string` - specific github token, github.token is used by default (Github Action Runner)
### output
_Optional_ - `string` - type of output for output variables, default is json. Use ',' for comma separated values, or ' ' for space delimited values. You can also create your own delimiter for example ' |FILE:' will output 'file1.yml |FILE:file2.yml |FILE:file3.yml'.
### fileOutput
_Optional_ - `string` - type of output for file output, default is json. Use ',' for comma separated values, or ' ' for space delimited values. You can also create your own delimiter for example `\ |FILE:` will output:
> file1.yml |FILE:file2.yml |FILE:file3.yml
If you select json then the file format will be .json, if you select ',' then the file format will be .csv, anything else will output the files as .txt
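For illustration, a hypothetical step combining the two delimiter behaviours described above (the delimiter values are just the examples from this section):

```yaml
- id: file_changes
  uses: trilom/file-changes-action@v1.2.3
  with:
    # output variables use a custom delimiter:
    # 'file1.yml |FILE:file2.yml |FILE:file3.yml'
    output: ' |FILE:'
    # files written to $HOME use ',' and are therefore saved as .csv
    fileOutput: ','
```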
### pushBefore
_Optional_ - `string` - pass in a specific sha to compare to as a before, required if using pushAfter. (push payload after github.payload.before)
### pushAfter
_Optional_ - `string` - pass in a specific sha to compare to as an after, required if using pushBefore. (push payload after github.payload.after)
### prNumber
_Optional_ - `string` - pass in a specific PR number to get file changes from.
## Outputs
### files
steps.file_changes.outputs.files - `string` - The names of all new, updated, and removed files. The output is dependent on the output input; the default is a json string.
### files_added
steps.file_changes.outputs.files_added - `string` - The names of the newly created files. The output is dependent on the output input; the default is a json string.
### files_modified
steps.file_changes.outputs.files_modified - `string` - The names of the updated files. The output is dependent on the output input; the default is a json string.
### files_removed
steps.file_changes.outputs.files_removed - `string` - The names of the removed files. The output is dependent on the output input; the default is a json string.
## Example usage
```yaml
# bare minimal
name: changes
on: push
jobs:
changes:
runs-on: ubuntu-latest
steps:
- id: file_changes
uses: trilom/file-changes-action@v1.2.3
```
```yaml
# full
name: changes
on: [push, pull_request] # push or pull, or any event with custom pr number or before/after commit sha
jobs:
changes:
runs-on: ubuntu-latest
steps:
- id: file_changes
uses: trilom/file-changes-action@v1.2.3
with:
# optional target repo
githubRepo: trilom/file-changes-action
# optional token
githubToken: ${{ secrets.BOT_TOKEN }}
# optional output format
output: 'json'
# optional fileOutput format (',' writes .csv files)
fileOutput: ','
# optional push before SHA (need both before and after)
pushBefore: 79eeec74aebc3deb0a2f6234c5ac13142e9224e5
# optional push after SHA (need both before and after)
pushAfter: 1c5a2bfde79e2c9cffb75b9a455391350fe69a40
# optional PR number to compare
prNumber: 36
```
## How to Use
In order to make those decisions we need to know what files have changed, and that is where this action comes in. In the example below we check out our repository code and then run the `trilom/file-changes-action@v1` action. The only thing you need to provide is a GITHUB_TOKEN so that Octokit can make its API calls.
If a PR is made then it will look at all of the files included in the PR.
If a push is made then it will compare commits from the SHA `github.payload.before` to the SHA `github.payload.after` of the push.
After gathering this information it will output the files in 2 ways.
- As output variables, which you can reference with `steps.file_changes.outputs.files`, `steps.file_changes.outputs.files_modified`, `steps.file_changes.outputs.files_added`, and `steps.file_changes.outputs.files_removed`.
- As a file on the container stored at `$HOME/files.json`, `$HOME/files_modified.json`, `$HOME/files_added.json`, `$HOME/files_removed.json`.
- _NOTE:_ If you set a custom delimiter in the output or fileOutput inputs then you will receive different files. For example, a delimiter of ',' will output `$HOME/files.csv` instead of `$HOME/files.json`. Likewise, any delimiter other than 'json' or ',' will output `$HOME/files.txt` files instead of `$HOME/files.json`. A short example of consuming both forms is sketched below.
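A minimal sketch of consuming both forms in a later step, assuming the default 'json' setting so the files end in .json (jq, used here to read them, is the same tool the workflows above rely on):

```yaml
- id: file_changes
  uses: trilom/file-changes-action@v1.2.3
- name: use the results
  run: |
    # output variable form
    echo '${{ steps.file_changes.outputs.files }}'
    # file form: each .json file holds an array of file names
    jq -r '.[]' $HOME/files_modified.json
```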
## Use Cases
I have a process where I have AWS Cloudformation templates stored in one directory that might be named PRODUCT-ROLE, and mappings for these templates that span the PRODUCT. For example **mappings/wordpress.mappings.yml, templates/wordpress-database.yml, templates/wordpress-webserver.yml**, and some of the templates might use different Lambda functions defined in for example **functions/wordpress-webserver/**.
In the example below we have a workflow where, on *push* to the develop branch, we can perform some actions based on the changed files. In my use case I look for changes on the develop branch of this repository for every push that happens. When a push happens and a change is made to any of the paths below, the workflow will trigger. With this action you are able to know exactly which files changed so that you can make decisions later in your CI/CD.
In this case, if a **templates/*.yml** file is changed, then we want to update the Cloudformation stack. We can also write specifics for related templates. For example, if **templates/wordpress-database.yml** changes then we want to deploy **templates/wordpress-webserver.yml** as well after.
Another case is if the **mappings/wordpress.mappings.yml** changes, we want to deploy all **template/wordpress-*.yml** files.
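For instance, a conditional deploy along those lines could look roughly like this (the job, step names, and echo commands are hypothetical, and `contains()` does a plain substring match on the JSON output rather than a real path filter):

```yaml
name: push-develop
on: [push]
jobs:
  changes:
    runs-on: ubuntu-latest
    steps:
      - id: file_changes
        uses: trilom/file-changes-action@v1.2.3
      - name: redeploy all wordpress templates
        if: contains(steps.file_changes.outputs.files, 'mappings/wordpress.mappings.yml')
        run: echo "mapping changed - deploy every templates/wordpress-*.yml stack"
      - name: deploy the database template
        if: contains(steps.file_changes.outputs.files, 'templates/wordpress-database.yml')
        run: echo "deploy templates/wordpress-database.yml, then templates/wordpress-webserver.yml"
```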
## More examples
```yaml
name: push-develop
on: [push]
jobs:
changes:
runs-on: ubuntu-latest
steps:
- id: file_changes
uses: trilom/file-changes-action@v1.2.3
- name: test
run: |
cat $HOME/files.json
cat $HOME/files_modified.json
cat $HOME/files_added.json
cat $HOME/files_removed.json
echo '${{ steps.file_changes.outputs.files}}'
echo '${{ steps.file_changes.outputs.files_modified}}'
echo '${{ steps.file_changes.outputs.files_added}}'
echo '${{ steps.file_changes.outputs.files_removed}}'
```
You can set the output and fileOutput to ',' for csv output.
```yaml
name: push-develop
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- id: file_changes
uses: trilom/file-changes-action@v1.2.3
with:
output: ','
fileOutput: ','
- name: test
run: |
cat $HOME/files.csv
```
You can set the output and fileOutput to ' ' for txt output. We also used a specific token, and got info for the PR that this push came from.
```yaml
name: push-develop
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/github-script@0.6.0
id: pr
with:
github-token: ${{env.BOT_USER_TOKEN}}
result-encoding: string
script: |
const result = await github.repos.listPullRequestsAssociatedWithCommit({
owner: context.payload.repository.owner.name,
repo: context.payload.repository.name,
commit_sha: context.payload.head_commit.id
})
return result.data[0].number;
- id: file_changes
uses: trilom/file-changes-action@v1.2.3
with:
githubToken: ${{ env.BOT_USER_TOKEN }}
prNumber: ${{ steps.pr.outputs.result }}
output: ' '
fileOutput: ' '
- name: test
run: |
cat $HOME/files.txt
```

View File

@ -1,43 +0,0 @@
name: 'File Changes Action'
description: 'Creates output variables of files modified, added, or removed by a PR or Push.'
author: 'Bryan Killian <me@trilom.org>'
inputs:
githubRepo:
description: 'The github repository you want to compare changes from, defaults to the github.repository.'
required: false
githubToken:
description: 'The github action token will be used by default; if you want to use something different then you can pass it in here.'
default: ${{ github.token }}
required: true
pushBefore:
description: 'Pass in a specific sha to compare to as a before, required if using pushAfter. (push BASE payload after github.payload.before)'
required: false
pushAfter:
description: 'Pass in a specific sha to compare to as an after, required if using pushBefore. (push HEAD payload after github.payload.after)'
required: false
prNumber:
description: 'Pass in a specific PR number to get file changes from.'
required: false
output:
description: 'Choose between json (default), or custom delimiter by passing a string, for example '','' for csv variable output'
required: true
default: json
fileOutput:
description: 'Choose between json (default), or custom delimiter by passing a string, for example '','' for csv file output. If you set as json the file output will be suffixed with .json, if you select '','' then the output will be .csv, else .txt will be the output.'
required: true
default: json
outputs:
files:
description: 'The names of all new, updated, and removed files'
files_added:
description: 'The names of the newly created files'
files_modified:
description: 'The names of the updated files'
files_removed:
description: 'The names of the removed files'
runs:
using: 'node12'
main: 'dist/index.js'
branding:
icon: 'file-text'
color: 'red'

File diff suppressed because one or more lines are too long

View File

@ -1,5 +0,0 @@
var config = require('./jest.config')
config.testPathIgnorePatterns = ['!/src/tests/integration.test.ts']
config.testMatch = ['**/integration.test.ts']
console.log('RUNNING INTEGRATION TESTS')
module.exports = config

View File

@ -1,29 +0,0 @@
module.exports = {
preset: 'ts-jest',
testEnvironment: "node",
testRunner: 'jest-circus/runner',
testMatch: ['**/*.test.ts'],
testPathIgnorePatterns: ['/src/tests/integration.test.ts'],
clearMocks: true,
collectCoverage: false,
coverageThreshold: {
global: {
branches: 50,
functions: 70,
lines: 75,
statements: 75
},
'./src/*.ts': {
branches: 70,
functions: 85,
lines: 85,
statements: 85
},
'./src/tests/**/*.ts': {
branches: 50,
functions: 60,
lines: 65,
statements: 65
}
}
}

File diff suppressed because it is too large

Some files were not shown because too many files have changed in this diff