Add save and restore actions #41

Merged · 5 commits · Feb 9, 2024
58 changes: 58 additions & 0 deletions .github/workflows/test.yml
@@ -140,3 +140,61 @@ jobs:
- name: Verify cache files outside working directory
shell: bash
run: src/verify-cache-files.sh ${{ runner.os }} ~/test-cache

test-save-only:
needs: create-minio-bucket
strategy:
matrix:
os: [ ubuntu-latest, windows-latest, macOS-latest ]
fail-fast: false
runs-on: ${{ matrix.os }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Generate files in working directory
shell: bash
run: src/create-cache-files.sh ${{ runner.os }} test-cache
- name: Generate files outside working directory
shell: bash
run: src/create-cache-files.sh ${{ runner.os }} ~/test-cache
- name: Save cache
uses: ./save/
with:
endpoint: play.min.io
accessKey: "Q3AM3UQ867SPQQA43P2F"
secretKey: "zuf+tfteSlswRu7BJ86wekitnifILbZam1KYY3TG"
bucket: actions-cache
use-fallback: false
key: test-save-only-${{ runner.os }}-${{ github.run_id }}
path: |
test-cache
~/test-cache

test-restore-only:
Contributor:

I'm not sure this tests the new functionality. Can you elaborate on how this tests the restore-only action?

In my previous comment, I meant that you should make a new file and use your pattern of "save" and "restore" actions.

Contributor Author:

Note how it uses the new sub-action that restores only, via `uses: ./restore/`, here:

https://github.com/mattlewis92/actions-cache/blob/18d4ac58ee9683f4c7ea366fee85bcc0fa578240/.github/workflows/test.yml#L184

These jobs could be put into a different workflow, but that seemed redundant to me since they also work fine in this one. I can move them into a separate workflow if you'd prefer, though.

Contributor:

The test is set up so that it logs the machine ID it runs on and restores on the same machine. I know it could be much better, but it has worked so far. A separate file would probably put the job on a different machine and hence test the real behaviour; if it's kept in the same file, it will just pick up from the previous step.

Having written all this, I realised you do distinguish the runs using the key, so this is all good for me.

needs: test-save-only
strategy:
matrix:
os: [ ubuntu-latest, windows-latest, macOS-latest ]
fail-fast: false
runs-on: ${{ matrix.os }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Restore cache
uses: ./restore/
with:
endpoint: play.min.io
accessKey: "Q3AM3UQ867SPQQA43P2F"
secretKey: "zuf+tfteSlswRu7BJ86wekitnifILbZam1KYY3TG"
bucket: actions-cache
use-fallback: false
key: test-save-only-${{ runner.os }}-${{ github.run_id }}
path: |
test-cache
~/test-cache
- name: Verify cache files in working directory
shell: bash
run: src/verify-cache-files.sh ${{ runner.os }} test-cache
- name: Verify cache files outside working directory
shell: bash
run: src/verify-cache-files.sh ${{ runner.os }} ~/test-cache
1 change: 1 addition & 0 deletions .gitignore
@@ -94,4 +94,5 @@ typings/

# Text editor files
.vscode/
.idea/
src/test.ts
28 changes: 28 additions & 0 deletions README.md
@@ -58,6 +58,34 @@ You can also set env instead of using `with`:
~/test-cache
```

To write to the cache only:

```yaml
- uses: tespkg/actions-cache/save@v1
with:
accessKey: "Q3AM3UQ867SPQQA43P2F" # required
secretKey: "zuf+tfteSlswRu7BJ86wekitnifILbZam1KYY3TG" # required
bucket: actions-cache # required
# actions/cache compatible properties: https://github.com/actions/cache
key: ${{ runner.os }}-yarn-${{ hashFiles('**/yarn.lock') }}
path: |
node_modules
```

To restore from the cache only:

```yaml
- uses: tespkg/actions-cache/restore@v1
with:
accessKey: "Q3AM3UQ867SPQQA43P2F" # required
secretKey: "zuf+tfteSlswRu7BJ86wekitnifILbZam1KYY3TG" # required
bucket: actions-cache # required
# actions/cache compatible properties: https://github.com/actions/cache
key: ${{ runner.os }}-yarn-${{ hashFiles('**/yarn.lock') }}
path: |
node_modules
```
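
Assuming the restore sub-action exposes a `cache-hit` output the way `actions/cache` does (the shared utilities export a `setCacheHitOutput` helper, though the output name here is an assumption), the two sub-actions can be combined into a restore-then-conditional-save sequence. A minimal sketch; the install command is a placeholder:

```yaml
- uses: tespkg/actions-cache/restore@v1
  id: yarn-cache
  with:
    accessKey: "Q3AM3UQ867SPQQA43P2F" # required
    secretKey: "zuf+tfteSlswRu7BJ86wekitnifILbZam1KYY3TG" # required
    bucket: actions-cache # required
    key: ${{ runner.os }}-yarn-${{ hashFiles('**/yarn.lock') }}
    path: |
      node_modules

# Only rebuild and re-save when the restore step missed.
- run: yarn install --frozen-lockfile
  if: steps.yarn-cache.outputs.cache-hit != 'true'

- uses: tespkg/actions-cache/save@v1
  if: steps.yarn-cache.outputs.cache-hit != 'true'
  with:
    accessKey: "Q3AM3UQ867SPQQA43P2F" # required
    secretKey: "zuf+tfteSlswRu7BJ86wekitnifILbZam1KYY3TG" # required
    bucket: actions-cache # required
    key: ${{ runner.os }}-yarn-${{ hashFiles('**/yarn.lock') }}
    path: |
      node_modules
```

This mirrors what the combined action does in its post step, where the save is skipped when the restore was an exact key match.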

## Restore keys

`restore-keys` works similarly to how GitHub's `@actions/cache@v2` works: it searches each item in `restore-keys` ...
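
As a sketch of that fallback lookup with the new restore sub-action (assuming it accepts the same `restore-keys` input as the combined action; the prefix below is illustrative):

```yaml
- uses: tespkg/actions-cache/restore@v1
  with:
    accessKey: "Q3AM3UQ867SPQQA43P2F" # required
    secretKey: "zuf+tfteSlswRu7BJ86wekitnifILbZam1KYY3TG" # required
    bucket: actions-cache # required
    key: ${{ runner.os }}-yarn-${{ hashFiles('**/yarn.lock') }}
    # Fall back to the most recent cache for this OS if the exact key is missing.
    restore-keys: |
      ${{ runner.os }}-yarn-
    path: |
      node_modules
```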
68 changes: 67 additions & 1 deletion dist/restore/index.js
@@ -102749,12 +102749,18 @@ var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
step((generator = generator.apply(thisArg, _arguments || [])).next());
});
};
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.isExactKeyMatch = exports.saveMatchedKey = exports.listObjects = exports.findObject = exports.setCacheSizeOutput = exports.setCacheHitOutput = exports.formatSize = exports.getInputAsInt = exports.getInputAsArray = exports.getInputAsBoolean = exports.newMinio = exports.getInput = exports.isGhes = void 0;
exports.saveCache = exports.isExactKeyMatch = exports.saveMatchedKey = exports.listObjects = exports.findObject = exports.setCacheSizeOutput = exports.setCacheHitOutput = exports.formatSize = exports.getInputAsInt = exports.getInputAsArray = exports.getInputAsBoolean = exports.newMinio = exports.getInput = exports.isGhes = void 0;
const utils = __importStar(__nccwpck_require__(1518));
const core = __importStar(__nccwpck_require__(2186));
const minio = __importStar(__nccwpck_require__(8308));
const state_1 = __nccwpck_require__(9738);
const path_1 = __importDefault(__nccwpck_require__(1017));
const tar_1 = __nccwpck_require__(6490);
const cache = __importStar(__nccwpck_require__(7799));
function isGhes() {
const ghUrl = new URL(process.env["GITHUB_SERVER_URL"] || "https://github.com");
return ghUrl.hostname.toUpperCase() !== "GITHUB.COM";
@@ -102889,6 +102895,66 @@ function isExactKeyMatch() {
return result;
}
exports.isExactKeyMatch = isExactKeyMatch;
function saveCache(standalone) {
return __awaiter(this, void 0, void 0, function* () {
try {
if (!standalone && isExactKeyMatch()) {
core.info("Cache was exact key match, not saving");
return;
}
const bucket = core.getInput("bucket", { required: true });
// Inputs are re-evaluated before the post action, so we want the original key
const key = standalone ? core.getInput("key", { required: true }) : core.getState(state_1.State.PrimaryKey);
const useFallback = getInputAsBoolean("use-fallback");
const paths = getInputAsArray("path");
try {
const mc = newMinio({
// Inputs are re-evaluated before the post action, so we want the original keys & tokens
accessKey: standalone ? getInput("accessKey", "AWS_ACCESS_KEY_ID") : core.getState(state_1.State.AccessKey),
secretKey: standalone ? getInput("secretKey", "AWS_SECRET_ACCESS_KEY") : core.getState(state_1.State.SecretKey),
sessionToken: standalone ? getInput("sessionToken", "AWS_SESSION_TOKEN") : core.getState(state_1.State.SessionToken),
region: standalone ? getInput("region", "AWS_REGION") : core.getState(state_1.State.Region),
});
const compressionMethod = yield utils.getCompressionMethod();
const cachePaths = yield utils.resolvePaths(paths);
core.debug("Cache Paths:");
core.debug(`${JSON.stringify(cachePaths)}`);
const archiveFolder = yield utils.createTempDirectory();
const cacheFileName = utils.getCacheFileName(compressionMethod);
const archivePath = path_1.default.join(archiveFolder, cacheFileName);
core.debug(`Archive Path: ${archivePath}`);
yield (0, tar_1.createTar)(archiveFolder, cachePaths, compressionMethod);
if (core.isDebug()) {
yield (0, tar_1.listTar)(archivePath, compressionMethod);
}
const object = path_1.default.join(key, cacheFileName);
core.info(`Uploading tar to s3. Bucket: ${bucket}, Object: ${object}`);
yield mc.fPutObject(bucket, object, archivePath, {});
core.info("Cache saved to s3 successfully");
}
catch (e) {
core.info("Save s3 cache failed: " + e.message);
if (useFallback) {
if (isGhes()) {
core.warning("Cache fallback is not supported on Github Enterpise.");
}
else {
core.info("Saving cache using fallback");
yield cache.saveCache(paths, key);
core.info("Save cache using fallback successfully");
}
}
else {
core.debug("skipped fallback cache");
}
}
}
catch (e) {
core.info("warning: " + e.message);
}
});
}
exports.saveCache = saveCache;


/***/ }),
143 changes: 68 additions & 75 deletions dist/save/index.js
@@ -102610,84 +102610,11 @@ var __importStar = (this && this.__importStar) || function (mod) {
__setModuleDefault(result, mod);
return result;
};
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
return new (P || (P = Promise))(function (resolve, reject) {
function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }
step((generator = generator.apply(thisArg, _arguments || [])).next());
});
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
const cache = __importStar(__nccwpck_require__(7799));
const utils = __importStar(__nccwpck_require__(1518));
const tar_1 = __nccwpck_require__(6490);
const core = __importStar(__nccwpck_require__(2186));
const path = __importStar(__nccwpck_require__(1017));
const state_1 = __nccwpck_require__(9738);
const utils_1 = __nccwpck_require__(1314);
process.on("uncaughtException", (e) => core.info("warning: " + e.message));
function saveCache() {
return __awaiter(this, void 0, void 0, function* () {
try {
if ((0, utils_1.isExactKeyMatch)()) {
core.info("Cache was exact key match, not saving");
return;
}
const bucket = core.getInput("bucket", { required: true });
// Inputs are re-evaluated before the post action, so we want the original key
const key = core.getState(state_1.State.PrimaryKey);
const useFallback = (0, utils_1.getInputAsBoolean)("use-fallback");
const paths = (0, utils_1.getInputAsArray)("path");
try {
const mc = (0, utils_1.newMinio)({
// Inputs are re-evaluated before the post action, so we want the original keys & tokens
accessKey: core.getState(state_1.State.AccessKey),
secretKey: core.getState(state_1.State.SecretKey),
sessionToken: core.getState(state_1.State.SessionToken),
region: core.getState(state_1.State.Region),
});
const compressionMethod = yield utils.getCompressionMethod();
const cachePaths = yield utils.resolvePaths(paths);
core.debug("Cache Paths:");
core.debug(`${JSON.stringify(cachePaths)}`);
const archiveFolder = yield utils.createTempDirectory();
const cacheFileName = utils.getCacheFileName(compressionMethod);
const archivePath = path.join(archiveFolder, cacheFileName);
core.debug(`Archive Path: ${archivePath}`);
yield (0, tar_1.createTar)(archiveFolder, cachePaths, compressionMethod);
if (core.isDebug()) {
yield (0, tar_1.listTar)(archivePath, compressionMethod);
}
const object = path.join(key, cacheFileName);
core.info(`Uploading tar to s3. Bucket: ${bucket}, Object: ${object}`);
yield mc.fPutObject(bucket, object, archivePath, {});
core.info("Cache saved to s3 successfully");
}
catch (e) {
core.info("Save s3 cache failed: " + e.message);
if (useFallback) {
if ((0, utils_1.isGhes)()) {
core.warning("Cache fallback is not supported on Github Enterpise.");
}
else {
core.info("Saving cache using fallback");
yield cache.saveCache(paths, key);
core.info("Save cache using fallback successfully");
}
}
else {
core.debug("skipped fallback cache");
}
}
}
catch (e) {
core.info("warning: " + e.message);
}
});
}
saveCache();
(0, utils_1.saveCache)(false);


/***/ }),
@@ -102749,12 +102676,18 @@ var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
step((generator = generator.apply(thisArg, _arguments || [])).next());
});
};
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", ({ value: true }));
exports.isExactKeyMatch = exports.saveMatchedKey = exports.listObjects = exports.findObject = exports.setCacheSizeOutput = exports.setCacheHitOutput = exports.formatSize = exports.getInputAsInt = exports.getInputAsArray = exports.getInputAsBoolean = exports.newMinio = exports.getInput = exports.isGhes = void 0;
exports.saveCache = exports.isExactKeyMatch = exports.saveMatchedKey = exports.listObjects = exports.findObject = exports.setCacheSizeOutput = exports.setCacheHitOutput = exports.formatSize = exports.getInputAsInt = exports.getInputAsArray = exports.getInputAsBoolean = exports.newMinio = exports.getInput = exports.isGhes = void 0;
const utils = __importStar(__nccwpck_require__(1518));
const core = __importStar(__nccwpck_require__(2186));
const minio = __importStar(__nccwpck_require__(8308));
const state_1 = __nccwpck_require__(9738);
const path_1 = __importDefault(__nccwpck_require__(1017));
const tar_1 = __nccwpck_require__(6490);
const cache = __importStar(__nccwpck_require__(7799));
function isGhes() {
const ghUrl = new URL(process.env["GITHUB_SERVER_URL"] || "https://github.com");
return ghUrl.hostname.toUpperCase() !== "GITHUB.COM";
@@ -102889,6 +102822,66 @@ function isExactKeyMatch() {
return result;
}
exports.isExactKeyMatch = isExactKeyMatch;
function saveCache(standalone) {
return __awaiter(this, void 0, void 0, function* () {
try {
if (!standalone && isExactKeyMatch()) {
core.info("Cache was exact key match, not saving");
return;
}
const bucket = core.getInput("bucket", { required: true });
// Inputs are re-evaluated before the post action, so we want the original key
const key = standalone ? core.getInput("key", { required: true }) : core.getState(state_1.State.PrimaryKey);
const useFallback = getInputAsBoolean("use-fallback");
const paths = getInputAsArray("path");
try {
const mc = newMinio({
// Inputs are re-evaluated before the post action, so we want the original keys & tokens
accessKey: standalone ? getInput("accessKey", "AWS_ACCESS_KEY_ID") : core.getState(state_1.State.AccessKey),
secretKey: standalone ? getInput("secretKey", "AWS_SECRET_ACCESS_KEY") : core.getState(state_1.State.SecretKey),
sessionToken: standalone ? getInput("sessionToken", "AWS_SESSION_TOKEN") : core.getState(state_1.State.SessionToken),
region: standalone ? getInput("region", "AWS_REGION") : core.getState(state_1.State.Region),
});
const compressionMethod = yield utils.getCompressionMethod();
const cachePaths = yield utils.resolvePaths(paths);
core.debug("Cache Paths:");
core.debug(`${JSON.stringify(cachePaths)}`);
const archiveFolder = yield utils.createTempDirectory();
const cacheFileName = utils.getCacheFileName(compressionMethod);
const archivePath = path_1.default.join(archiveFolder, cacheFileName);
core.debug(`Archive Path: ${archivePath}`);
yield (0, tar_1.createTar)(archiveFolder, cachePaths, compressionMethod);
if (core.isDebug()) {
yield (0, tar_1.listTar)(archivePath, compressionMethod);
}
const object = path_1.default.join(key, cacheFileName);
core.info(`Uploading tar to s3. Bucket: ${bucket}, Object: ${object}`);
yield mc.fPutObject(bucket, object, archivePath, {});
core.info("Cache saved to s3 successfully");
}
catch (e) {
core.info("Save s3 cache failed: " + e.message);
if (useFallback) {
if (isGhes()) {
core.warning("Cache fallback is not supported on Github Enterpise.");
}
else {
core.info("Saving cache using fallback");
yield cache.saveCache(paths, key);
core.info("Save cache using fallback successfully");
}
}
else {
core.debug("skipped fallback cache");
}
}
}
catch (e) {
core.info("warning: " + e.message);
}
});
}
exports.saveCache = saveCache;


/***/ }),