Compare commits

4 Commits

phantsure/ ... Phantsure-

| Author | SHA1 | Date |
|---|---|---|
|  | cfa7ba9007 |  |
|  | 7d403ca5c3 |  |
|  | 8a9ab1ae8c |  |
|  | f10295073f |  |

2  .licenses/npm/@actions/cache.dep.yml  generated

							| @@ -1,6 +1,6 @@ | ||||
| --- | ||||
| name: "@actions/cache" | ||||
| version: 3.1.2 | ||||
| version: 3.1.0 | ||||
| type: npm | ||||
| summary: | ||||
| homepage: | ||||
|   | ||||
| @@ -28,6 +28,7 @@ See ["Caching dependencies to speed up workflows"](https://docs.github.com/en/ac | ||||
| * Fix zstd not working for windows on gnu tar in issues. | ||||
| * Allowing users to provide a custom timeout as input for aborting download of a cache segment using an environment variable `SEGMENT_DOWNLOAD_TIMEOUT_MINS`. Default is 60 minutes. | ||||
| * Two new actions available for granular control over caches - [restore](restore/action.yml) and [save](save/action.yml) | ||||
| * Add support for cross os caching. For example, a cache saved on windows can be restored on ubuntu and vice versa. | ||||
|  | ||||
| Refer [here](https://github.com/actions/cache/blob/v2/README.md) for previous versions | ||||
|  | ||||
| @@ -45,7 +46,7 @@ If you are using this inside a container, a POSIX-compliant `tar` needs to be in | ||||
| * `restore-keys` - An ordered list of prefix-matched keys to use for restoring stale cache if no cache hit occurred for key. | ||||
|  | ||||
| #### Environment Variables | ||||
| * `SEGMENT_DOWNLOAD_TIMEOUT_MINS` - Segment download timeout (in minutes, default `60`) to abort download of the segment if not completed in the defined number of minutes. [Read more](https://github.com/actions/cache/blob/main/tips-and-workarounds.md#cache-segment-restore-timeout) | ||||
| * `SEGMENT_DOWNLOAD_TIMEOUT_MINS` - Segment download timeout (in minutes, default `60`) to abort download of the segment if not completed in the defined number of minutes. [Read more](https://github.com/actions/cache/blob/main/workarounds.md#cache-segment-restore-timeout) | ||||
|  | ||||
| ### Outputs | ||||
|  | ||||
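`SEGMENT_DOWNLOAD_TIMEOUT_MINS` is a plain environment variable read at runtime. A minimal sketch of how such a timeout could be read and converted to milliseconds, with the documented 60-minute default (the helper name is hypothetical; this is not the action's actual implementation):

```ts
// Hypothetical helper: read SEGMENT_DOWNLOAD_TIMEOUT_MINS and convert it to
// milliseconds, falling back to the documented 60-minute default.
function getSegmentTimeoutMs(defaultMinutes = 60): number {
    const raw = process.env["SEGMENT_DOWNLOAD_TIMEOUT_MINS"];
    const minutes = raw ? parseInt(raw, 10) : NaN;
    return (Number.isNaN(minutes) || minutes <= 0 ? defaultMinutes : minutes) * 60 * 1000;
}

console.log(`Segment download timeout: ${getSegmentTimeoutMs()} ms`);
```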
| @@ -121,7 +122,6 @@ See [Examples](examples.md) for a list of `actions/cache` implementations for us | ||||
| - [Swift, Objective-C - Carthage](./examples.md#swift-objective-c---carthage) | ||||
| - [Swift, Objective-C - CocoaPods](./examples.md#swift-objective-c---cocoapods) | ||||
| - [Swift - Swift Package Manager](./examples.md#swift---swift-package-manager) | ||||
| - [Swift - Mint](./examples.md#swift---mint) | ||||
|  | ||||
| ## Creating a cache key | ||||
|  | ||||
|   | ||||
| @@ -60,6 +60,3 @@ | ||||
| - Update `@actions/cache` on windows to use gnu tar and zstd by default and fallback to bsdtar and zstd if gnu tar is not available. ([issue](https://github.com/actions/cache/issues/984)) | ||||
| - Added support for fallback to gzip to restore old caches on windows. | ||||
| - Added logs for cache version in case of a cache miss. | ||||
|  | ||||
| ### 3.2.2 | ||||
| - Reverted the changes made in 3.2.1 to use gnu tar and zstd by default on windows. | ||||
| @@ -174,26 +174,6 @@ test("getInputAsInt throws if required and value missing", () => { | ||||
|     ).toThrowError(); | ||||
| }); | ||||
|  | ||||
| test("getInputAsBool returns false if input not set", () => { | ||||
|     expect(actionUtils.getInputAsBool("undefined")).toBe(false); | ||||
| }); | ||||
|  | ||||
| test("getInputAsBool returns value if input is valid", () => { | ||||
|     testUtils.setInput("foo", "true"); | ||||
|     expect(actionUtils.getInputAsBool("foo")).toBe(true); | ||||
| }); | ||||
|  | ||||
| test("getInputAsBool returns false if input is invalid or NaN", () => { | ||||
|     testUtils.setInput("foo", "bar"); | ||||
|     expect(actionUtils.getInputAsBool("foo")).toBe(false); | ||||
| }); | ||||
|  | ||||
| test("getInputAsBool throws if required and value missing", () => { | ||||
|     expect(() => | ||||
|         actionUtils.getInputAsBool("undefined2", { required: true }) | ||||
|     ).toThrowError(); | ||||
| }); | ||||
|  | ||||
| test("isCacheFeatureAvailable for ac enabled", () => { | ||||
|     jest.spyOn(cache, "isFeatureAvailable").mockImplementation(() => true); | ||||
|  | ||||
|   | ||||
| @@ -27,17 +27,9 @@ beforeAll(() => { | ||||
|             return actualUtils.getInputAsArray(name, options); | ||||
|         } | ||||
|     ); | ||||
|  | ||||
|     jest.spyOn(actionUtils, "getInputAsBool").mockImplementation( | ||||
|         (name, options) => { | ||||
|             const actualUtils = jest.requireActual("../src/utils/actionUtils"); | ||||
|             return actualUtils.getInputAsBool(name, options); | ||||
|         } | ||||
|     ); | ||||
| }); | ||||
|  | ||||
| beforeEach(() => { | ||||
|     jest.restoreAllMocks(); | ||||
|     process.env[Events.Key] = Events.Push; | ||||
|     process.env[RefKey] = "refs/heads/feature-branch"; | ||||
|  | ||||
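The `beforeAll` blocks in these test files spy on `actionUtils` helpers while delegating to the real implementations. A condensed sketch of that forwarding pattern, using the same module path as the tests above (shown only to make the setup explicit):

```ts
import * as actionUtils from "../src/utils/actionUtils";

// Spy so calls can be asserted, but forward to the real helper via requireActual.
jest.spyOn(actionUtils, "getInputAsArray").mockImplementation((name, options) => {
    const actual = jest.requireActual("../src/utils/actionUtils");
    return actual.getInputAsArray(name, options);
});
```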
| @@ -58,8 +50,7 @@ test("restore with no cache found", async () => { | ||||
|     const key = "node-test"; | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         enableCrossOsArchive: false | ||||
|         key | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -74,7 +65,7 @@ test("restore with no cache found", async () => { | ||||
|     await run(); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []); | ||||
|  | ||||
|     expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); | ||||
|     expect(stateMock).toHaveBeenCalledTimes(1); | ||||
| @@ -93,8 +84,7 @@ test("restore with restore keys and no cache found", async () => { | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         restoreKeys: [restoreKey], | ||||
|         enableCrossOsArchive: false | ||||
|         restoreKeys: [restoreKey] | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -109,13 +99,7 @@ test("restore with restore keys and no cache found", async () => { | ||||
|     await run(); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith( | ||||
|         [path], | ||||
|         key, | ||||
|         [restoreKey], | ||||
|         {}, | ||||
|         false | ||||
|     ); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]); | ||||
|  | ||||
|     expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); | ||||
|     expect(stateMock).toHaveBeenCalledTimes(1); | ||||
| @@ -132,8 +116,7 @@ test("restore with cache found for key", async () => { | ||||
|     const key = "node-test"; | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         enableCrossOsArchive: false | ||||
|         key | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -149,7 +132,7 @@ test("restore with cache found for key", async () => { | ||||
|     await run(); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []); | ||||
|  | ||||
|     expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); | ||||
|     expect(stateMock).toHaveBeenCalledWith("CACHE_RESULT", key); | ||||
| @@ -169,8 +152,7 @@ test("restore with cache found for restore key", async () => { | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         restoreKeys: [restoreKey], | ||||
|         enableCrossOsArchive: false | ||||
|         restoreKeys: [restoreKey] | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -186,13 +168,7 @@ test("restore with cache found for restore key", async () => { | ||||
|     await run(); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith( | ||||
|         [path], | ||||
|         key, | ||||
|         [restoreKey], | ||||
|         {}, | ||||
|         false | ||||
|     ); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]); | ||||
|  | ||||
|     expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); | ||||
|     expect(stateMock).toHaveBeenCalledWith("CACHE_RESULT", restoreKey); | ||||
|   | ||||
| @@ -28,17 +28,9 @@ beforeAll(() => { | ||||
|             return actualUtils.getInputAsArray(name, options); | ||||
|         } | ||||
|     ); | ||||
|  | ||||
|     jest.spyOn(actionUtils, "getInputAsBool").mockImplementation( | ||||
|         (name, options) => { | ||||
|             const actualUtils = jest.requireActual("../src/utils/actionUtils"); | ||||
|             return actualUtils.getInputAsBool(name, options); | ||||
|         } | ||||
|     ); | ||||
| }); | ||||
|  | ||||
| beforeEach(() => { | ||||
|     jest.restoreAllMocks(); | ||||
|     process.env[Events.Key] = Events.Push; | ||||
|     process.env[RefKey] = "refs/heads/feature-branch"; | ||||
|  | ||||
| @@ -105,8 +97,7 @@ test("restore on GHES with AC available ", async () => { | ||||
|     const key = "node-test"; | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         enableCrossOsArchive: false | ||||
|         key | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -122,7 +113,7 @@ test("restore on GHES with AC available ", async () => { | ||||
|     await run(new StateProvider()); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []); | ||||
|  | ||||
|     expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); | ||||
|     expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1); | ||||
| @@ -161,20 +152,13 @@ test("restore with too many keys should fail", async () => { | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         restoreKeys, | ||||
|         enableCrossOsArchive: false | ||||
|         restoreKeys | ||||
|     }); | ||||
|     const failedMock = jest.spyOn(core, "setFailed"); | ||||
|     const restoreCacheMock = jest.spyOn(cache, "restoreCache"); | ||||
|     await run(new StateProvider()); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith( | ||||
|         [path], | ||||
|         key, | ||||
|         restoreKeys, | ||||
|         {}, | ||||
|         false | ||||
|     ); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, restoreKeys); | ||||
|     expect(failedMock).toHaveBeenCalledWith( | ||||
|         `Key Validation Error: Keys are limited to a maximum of 10.` | ||||
|     ); | ||||
| @@ -185,14 +169,13 @@ test("restore with large key should fail", async () => { | ||||
|     const key = "foo".repeat(512); // Over the 512 character limit | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         enableCrossOsArchive: false | ||||
|         key | ||||
|     }); | ||||
|     const failedMock = jest.spyOn(core, "setFailed"); | ||||
|     const restoreCacheMock = jest.spyOn(cache, "restoreCache"); | ||||
|     await run(new StateProvider()); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []); | ||||
|     expect(failedMock).toHaveBeenCalledWith( | ||||
|         `Key Validation Error: ${key} cannot be larger than 512 characters.` | ||||
|     ); | ||||
| @@ -203,14 +186,13 @@ test("restore with invalid key should fail", async () => { | ||||
|     const key = "comma,comma"; | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         enableCrossOsArchive: false | ||||
|         key | ||||
|     }); | ||||
|     const failedMock = jest.spyOn(core, "setFailed"); | ||||
|     const restoreCacheMock = jest.spyOn(cache, "restoreCache"); | ||||
|     await run(new StateProvider()); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []); | ||||
|     expect(failedMock).toHaveBeenCalledWith( | ||||
|         `Key Validation Error: ${key} cannot contain commas.` | ||||
|     ); | ||||
| @@ -221,8 +203,7 @@ test("restore with no cache found", async () => { | ||||
|     const key = "node-test"; | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         enableCrossOsArchive: false | ||||
|         key | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -237,7 +218,7 @@ test("restore with no cache found", async () => { | ||||
|     await run(new StateProvider()); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []); | ||||
|  | ||||
|     expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); | ||||
|     expect(failedMock).toHaveBeenCalledTimes(0); | ||||
| @@ -254,8 +235,7 @@ test("restore with restore keys and no cache found", async () => { | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         restoreKeys: [restoreKey], | ||||
|         enableCrossOsArchive: false | ||||
|         restoreKeys: [restoreKey] | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -270,13 +250,7 @@ test("restore with restore keys and no cache found", async () => { | ||||
|     await run(new StateProvider()); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith( | ||||
|         [path], | ||||
|         key, | ||||
|         [restoreKey], | ||||
|         {}, | ||||
|         false | ||||
|     ); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]); | ||||
|  | ||||
|     expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); | ||||
|     expect(failedMock).toHaveBeenCalledTimes(0); | ||||
| @@ -291,8 +265,7 @@ test("restore with cache found for key", async () => { | ||||
|     const key = "node-test"; | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         enableCrossOsArchive: false | ||||
|         key | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -308,7 +281,7 @@ test("restore with cache found for key", async () => { | ||||
|     await run(new StateProvider()); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []); | ||||
|  | ||||
|     expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); | ||||
|     expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1); | ||||
| @@ -325,8 +298,7 @@ test("restore with cache found for restore key", async () => { | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         restoreKeys: [restoreKey], | ||||
|         enableCrossOsArchive: false | ||||
|         restoreKeys: [restoreKey] | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -342,13 +314,7 @@ test("restore with cache found for restore key", async () => { | ||||
|     await run(new StateProvider()); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith( | ||||
|         [path], | ||||
|         key, | ||||
|         [restoreKey], | ||||
|         {}, | ||||
|         false | ||||
|     ); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]); | ||||
|  | ||||
|     expect(stateMock).toHaveBeenCalledWith("CACHE_KEY", key); | ||||
|     expect(setCacheHitOutputMock).toHaveBeenCalledTimes(1); | ||||
|   | ||||
| @@ -27,18 +27,9 @@ beforeAll(() => { | ||||
|             return actualUtils.getInputAsArray(name, options); | ||||
|         } | ||||
|     ); | ||||
|  | ||||
|     jest.spyOn(actionUtils, "getInputAsBool").mockImplementation( | ||||
|         (name, options) => { | ||||
|             return jest | ||||
|                 .requireActual("../src/utils/actionUtils") | ||||
|                 .getInputAsBool(name, options); | ||||
|         } | ||||
|     ); | ||||
| }); | ||||
|  | ||||
| beforeEach(() => { | ||||
|     jest.restoreAllMocks(); | ||||
|     process.env[Events.Key] = Events.Push; | ||||
|     process.env[RefKey] = "refs/heads/feature-branch"; | ||||
|  | ||||
| @@ -59,8 +50,7 @@ test("restore with no cache found", async () => { | ||||
|     const key = "node-test"; | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         enableCrossOsArchive: false | ||||
|         key | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -75,7 +65,7 @@ test("restore with no cache found", async () => { | ||||
|     await run(); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []); | ||||
|  | ||||
|     expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key); | ||||
|     expect(outputMock).toHaveBeenCalledTimes(1); | ||||
| @@ -93,8 +83,7 @@ test("restore with restore keys and no cache found", async () => { | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         restoreKeys: [restoreKey], | ||||
|         enableCrossOsArchive: false | ||||
|         restoreKeys: [restoreKey] | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -109,13 +98,7 @@ test("restore with restore keys and no cache found", async () => { | ||||
|     await run(); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith( | ||||
|         [path], | ||||
|         key, | ||||
|         [restoreKey], | ||||
|         {}, | ||||
|         false | ||||
|     ); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]); | ||||
|  | ||||
|     expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key); | ||||
|     expect(failedMock).toHaveBeenCalledTimes(0); | ||||
| @@ -130,8 +113,7 @@ test("restore with cache found for key", async () => { | ||||
|     const key = "node-test"; | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         enableCrossOsArchive: false | ||||
|         key | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -146,7 +128,7 @@ test("restore with cache found for key", async () => { | ||||
|     await run(); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [], {}, false); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, []); | ||||
|  | ||||
|     expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key); | ||||
|     expect(outputMock).toHaveBeenCalledWith("cache-hit", "true"); | ||||
| @@ -165,8 +147,7 @@ test("restore with cache found for restore key", async () => { | ||||
|     testUtils.setInputs({ | ||||
|         path: path, | ||||
|         key, | ||||
|         restoreKeys: [restoreKey], | ||||
|         enableCrossOsArchive: false | ||||
|         restoreKeys: [restoreKey] | ||||
|     }); | ||||
|  | ||||
|     const infoMock = jest.spyOn(core, "info"); | ||||
| @@ -181,13 +162,7 @@ test("restore with cache found for restore key", async () => { | ||||
|     await run(); | ||||
|  | ||||
|     expect(restoreCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith( | ||||
|         [path], | ||||
|         key, | ||||
|         [restoreKey], | ||||
|         {}, | ||||
|         false | ||||
|     ); | ||||
|     expect(restoreCacheMock).toHaveBeenCalledWith([path], key, [restoreKey]); | ||||
|  | ||||
|     expect(outputMock).toHaveBeenCalledWith("cache-primary-key", key); | ||||
|     expect(outputMock).toHaveBeenCalledWith("cache-hit", "false"); | ||||
|   | ||||
| @@ -35,14 +35,6 @@ beforeAll(() => { | ||||
|         } | ||||
|     ); | ||||
|  | ||||
|     jest.spyOn(actionUtils, "getInputAsBool").mockImplementation( | ||||
|         (name, options) => { | ||||
|             return jest | ||||
|                 .requireActual("../src/utils/actionUtils") | ||||
|                 .getInputAsBool(name, options); | ||||
|         } | ||||
|     ); | ||||
|  | ||||
|     jest.spyOn(actionUtils, "isExactKeyMatch").mockImplementation( | ||||
|         (key, cacheResult) => { | ||||
|             return jest | ||||
| @@ -103,14 +95,9 @@ test("save with valid inputs uploads a cache", async () => { | ||||
|     await run(); | ||||
|  | ||||
|     expect(saveCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith( | ||||
|         [inputPath], | ||||
|         primaryKey, | ||||
|         { | ||||
|             uploadChunkSize: 4000000 | ||||
|         }, | ||||
|         false | ||||
|     ); | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, { | ||||
|         uploadChunkSize: 4000000 | ||||
|     }); | ||||
|  | ||||
|     expect(failedMock).toHaveBeenCalledTimes(0); | ||||
| }); | ||||
|   | ||||
| @@ -32,14 +32,6 @@ beforeAll(() => { | ||||
|         } | ||||
|     ); | ||||
|  | ||||
|     jest.spyOn(actionUtils, "getInputAsBool").mockImplementation( | ||||
|         (name, options) => { | ||||
|             return jest | ||||
|                 .requireActual("../src/utils/actionUtils") | ||||
|                 .getInputAsBool(name, options); | ||||
|         } | ||||
|     ); | ||||
|  | ||||
|     jest.spyOn(actionUtils, "isExactKeyMatch").mockImplementation( | ||||
|         (key, cacheResult) => { | ||||
|             return jest | ||||
| @@ -55,7 +47,6 @@ beforeAll(() => { | ||||
| }); | ||||
|  | ||||
| beforeEach(() => { | ||||
|     jest.restoreAllMocks(); | ||||
|     process.env[Events.Key] = Events.Push; | ||||
|     process.env[RefKey] = "refs/heads/feature-branch"; | ||||
|  | ||||
| @@ -164,14 +155,9 @@ test("save on GHES with AC available", async () => { | ||||
|     await run(new StateProvider()); | ||||
|  | ||||
|     expect(saveCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith( | ||||
|         [inputPath], | ||||
|         primaryKey, | ||||
|         { | ||||
|             uploadChunkSize: 4000000 | ||||
|         }, | ||||
|         false | ||||
|     ); | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, { | ||||
|         uploadChunkSize: 4000000 | ||||
|     }); | ||||
|  | ||||
|     expect(failedMock).toHaveBeenCalledTimes(0); | ||||
| }); | ||||
| @@ -265,8 +251,7 @@ test("save with large cache outputs warning", async () => { | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith( | ||||
|         [inputPath], | ||||
|         primaryKey, | ||||
|         expect.anything(), | ||||
|         false | ||||
|         expect.anything() | ||||
|     ); | ||||
|  | ||||
|     expect(logWarningMock).toHaveBeenCalledTimes(1); | ||||
| @@ -312,8 +297,7 @@ test("save with reserve cache failure outputs warning", async () => { | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith( | ||||
|         [inputPath], | ||||
|         primaryKey, | ||||
|         expect.anything(), | ||||
|         false | ||||
|         expect.anything() | ||||
|     ); | ||||
|  | ||||
|     expect(logWarningMock).toHaveBeenCalledWith( | ||||
| @@ -355,8 +339,7 @@ test("save with server error outputs warning", async () => { | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith( | ||||
|         [inputPath], | ||||
|         primaryKey, | ||||
|         expect.anything(), | ||||
|         false | ||||
|         expect.anything() | ||||
|     ); | ||||
|  | ||||
|     expect(logWarningMock).toHaveBeenCalledTimes(1); | ||||
| @@ -395,14 +378,9 @@ test("save with valid inputs uploads a cache", async () => { | ||||
|     await run(new StateProvider()); | ||||
|  | ||||
|     expect(saveCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith( | ||||
|         [inputPath], | ||||
|         primaryKey, | ||||
|         { | ||||
|             uploadChunkSize: 4000000 | ||||
|         }, | ||||
|         false | ||||
|     ); | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, { | ||||
|         uploadChunkSize: 4000000 | ||||
|     }); | ||||
|  | ||||
|     expect(failedMock).toHaveBeenCalledTimes(0); | ||||
| }); | ||||
|   | ||||
| @@ -35,14 +35,6 @@ beforeAll(() => { | ||||
|         } | ||||
|     ); | ||||
|  | ||||
|     jest.spyOn(actionUtils, "getInputAsBool").mockImplementation( | ||||
|         (name, options) => { | ||||
|             return jest | ||||
|                 .requireActual("../src/utils/actionUtils") | ||||
|                 .getInputAsBool(name, options); | ||||
|         } | ||||
|     ); | ||||
|  | ||||
|     jest.spyOn(actionUtils, "isExactKeyMatch").mockImplementation( | ||||
|         (key, cacheResult) => { | ||||
|             return jest | ||||
| @@ -93,14 +85,9 @@ test("save with valid inputs uploads a cache", async () => { | ||||
|     await run(); | ||||
|  | ||||
|     expect(saveCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith( | ||||
|         [inputPath], | ||||
|         primaryKey, | ||||
|         { | ||||
|             uploadChunkSize: 4000000 | ||||
|         }, | ||||
|         false | ||||
|     ); | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, { | ||||
|         uploadChunkSize: 4000000 | ||||
|     }); | ||||
|  | ||||
|     expect(failedMock).toHaveBeenCalledTimes(0); | ||||
| }); | ||||
| @@ -125,14 +112,9 @@ test("save failing logs the warning message", async () => { | ||||
|     await run(); | ||||
|  | ||||
|     expect(saveCacheMock).toHaveBeenCalledTimes(1); | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith( | ||||
|         [inputPath], | ||||
|         primaryKey, | ||||
|         { | ||||
|             uploadChunkSize: 4000000 | ||||
|         }, | ||||
|         false | ||||
|     ); | ||||
|     expect(saveCacheMock).toHaveBeenCalledWith([inputPath], primaryKey, { | ||||
|         uploadChunkSize: 4000000 | ||||
|     }); | ||||
|  | ||||
|     expect(warningMock).toHaveBeenCalledTimes(1); | ||||
|     expect(warningMock).toHaveBeenCalledWith("Cache save failed."); | ||||
|   | ||||
| @@ -14,10 +14,6 @@ inputs: | ||||
|   upload-chunk-size: | ||||
|     description: 'The chunk size used to split up large files during upload, in bytes' | ||||
|     required: false | ||||
|   enableCrossOsArchive: | ||||
|     description: 'An optional boolean when enabled, allows windows runners to save or restore caches that can be restored or saved respectively on other platforms' | ||||
|     default: 'false' | ||||
|     required: false | ||||
| outputs: | ||||
|   cache-hit: | ||||
|     description: 'A boolean value to indicate an exact match was found for the primary key' | ||||
|   | ||||
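For context, the inputs declared in `action.yml` are parsed in `restoreImpl`/`saveImpl` and passed straight into the `@actions/cache` calls, as the vendored bundles below show. A rough sketch of the restore-side wiring with simplified input parsing (illustrative only; input names follow the standard `actions/cache` inputs, not this exact file):

```ts
import * as core from "@actions/core";
import * as cache from "@actions/cache";

// Sketch: multi-line inputs become string arrays, then feed restoreCache.
async function restoreStep(): Promise<void> {
    const paths = core.getInput("path", { required: true }).split("\n").filter(Boolean);
    const key = core.getInput("key", { required: true });
    const restoreKeys = core.getInput("restore-keys").split("\n").filter(Boolean);

    const cacheKey = await cache.restoreCache(paths, key, restoreKeys);
    if (!cacheKey) {
        core.info(`Cache not found for input keys: ${[key, ...restoreKeys].join(", ")}`);
    }
}
```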
							
								
								
									

69  dist/restore-only/index.js  vendored

							| @@ -3383,6 +3383,7 @@ const crypto = __importStar(__webpack_require__(417)); | ||||
| const fs = __importStar(__webpack_require__(747)); | ||||
| const url_1 = __webpack_require__(414); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| const downloadUtils_1 = __webpack_require__(251); | ||||
| const options_1 = __webpack_require__(538); | ||||
| const requestUtils_1 = __webpack_require__(899); | ||||
| @@ -3412,17 +3413,10 @@ function createHttpClient() { | ||||
|     const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token); | ||||
|     return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions()); | ||||
| } | ||||
| function getCacheVersion(paths, compressionMethod, enableCrossOsArchive = false) { | ||||
|     const components = paths; | ||||
|     // Add compression method to cache version to restore | ||||
|     // compressed cache as per compression method | ||||
|     if (compressionMethod) { | ||||
|         components.push(compressionMethod); | ||||
|     } | ||||
|     // Only check for windows platforms if enableCrossOsArchive is false | ||||
|     if (process.platform === 'win32' && !enableCrossOsArchive) { | ||||
|         components.push('windows-only'); | ||||
|     } | ||||
| function getCacheVersion(paths, compressionMethod) { | ||||
|     const components = paths.concat(!compressionMethod || compressionMethod === constants_1.CompressionMethod.Gzip | ||||
|         ? [] | ||||
|         : [compressionMethod]); | ||||
|     // Add salt to cache version to support breaking changes in cache entry | ||||
|     components.push(versionSalt); | ||||
|     return crypto | ||||
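The hunk above contrasts the two `getCacheVersion` implementations: the variant taking `enableCrossOsArchive` appends a `'windows-only'` component on Windows unless the flag is set, so the same paths hash to a different version across operating systems and cannot be restored cross-platform. A self-contained sketch of the idea (the join character and `versionSalt` value are assumptions; the hunk truncates at `return crypto`):

```ts
import * as crypto from "crypto";

const versionSalt = "1.0"; // assumed value, for illustration only

// Sketch: the cache "version" is a hash over the paths, the compression
// method, and a salt; any difference yields a different version and
// therefore a cache miss on restore.
function getCacheVersionSketch(paths: string[], compressionMethod?: string): string {
    const components = paths.concat(compressionMethod ? [compressionMethod] : []);
    components.push(versionSalt);
    return crypto.createHash("sha256").update(components.join("|")).digest("hex");
}

console.log(getCacheVersionSketch(["node_modules"], "zstd"));
```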
| @@ -3434,7 +3428,7 @@ exports.getCacheVersion = getCacheVersion; | ||||
| function getCacheEntry(keys, paths, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         const httpClient = createHttpClient(); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod); | ||||
|         const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`; | ||||
|         const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); })); | ||||
|         // Cache not found | ||||
| @@ -3497,7 +3491,7 @@ exports.downloadCache = downloadCache; | ||||
| function reserveCache(key, paths, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         const httpClient = createHttpClient(); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod); | ||||
|         const reserveCacheRequest = { | ||||
|             key, | ||||
|             version, | ||||
| @@ -4977,8 +4971,7 @@ var Inputs; | ||||
|     Inputs["Key"] = "key"; | ||||
|     Inputs["Path"] = "path"; | ||||
|     Inputs["RestoreKeys"] = "restore-keys"; | ||||
|     Inputs["UploadChunkSize"] = "upload-chunk-size"; | ||||
|     Inputs["EnableCrossOsArchive"] = "enableCrossOsArchive"; // Input for cache, restore, save action
 | ||||
|     Inputs["UploadChunkSize"] = "upload-chunk-size"; // Input for cache, save action
 | ||||
| })(Inputs = exports.Inputs || (exports.Inputs = {})); | ||||
| var Outputs; | ||||
| (function (Outputs) { | ||||
| @@ -10074,7 +10067,7 @@ var __importStar = (this && this.__importStar) || function (mod) { | ||||
|     return result; | ||||
| }; | ||||
| Object.defineProperty(exports, "__esModule", { value: true }); | ||||
| exports.isCacheFeatureAvailable = exports.getInputAsBool = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; | ||||
| exports.isCacheFeatureAvailable = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; | ||||
| const cache = __importStar(__webpack_require__(692)); | ||||
| const core = __importStar(__webpack_require__(470)); | ||||
| const constants_1 = __webpack_require__(196); | ||||
| @@ -10117,11 +10110,6 @@ function getInputAsInt(name, options) { | ||||
|     return value; | ||||
| } | ||||
| exports.getInputAsInt = getInputAsInt; | ||||
| function getInputAsBool(name, options) { | ||||
|     const result = core.getInput(name, options); | ||||
|     return result.toLowerCase() === "true"; | ||||
| } | ||||
| exports.getInputAsBool = getInputAsBool; | ||||
| function isCacheFeatureAvailable() { | ||||
|     if (cache.isFeatureAvailable()) { | ||||
|         return true; | ||||
| @@ -38229,14 +38217,12 @@ var __importStar = (this && this.__importStar) || function (mod) { | ||||
| }; | ||||
| Object.defineProperty(exports, "__esModule", { value: true }); | ||||
| const exec_1 = __webpack_require__(986); | ||||
| const core_1 = __webpack_require__(470); | ||||
| const io = __importStar(__webpack_require__(1)); | ||||
| const fs_1 = __webpack_require__(747); | ||||
| const path = __importStar(__webpack_require__(622)); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| const IS_WINDOWS = process.platform === 'win32'; | ||||
| core_1.exportVariable('MSYS', 'winsymlinks:nativestrict'); | ||||
| // Returns tar path and type: BSD or GNU | ||||
| function getTarPath() { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
| @@ -47209,6 +47195,7 @@ const path = __importStar(__webpack_require__(622)); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const cacheHttpClient = __importStar(__webpack_require__(114)); | ||||
| const tar_1 = __webpack_require__(434); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| class ValidationError extends Error { | ||||
|     constructor(message) { | ||||
|         super(message); | ||||
| @@ -47255,10 +47242,9 @@ exports.isFeatureAvailable = isFeatureAvailable; | ||||
|  * @param primaryKey an explicit key for restoring the cache | ||||
|  * @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key | ||||
|  * @param downloadOptions cache download options | ||||
|  * @param enableCrossOsArchive an optional boolean enabled to restore on windows any cache created on any platform | ||||
|  * @returns string returns the key for the cache hit, otherwise returns undefined | ||||
|  */ | ||||
| function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArchive = false) { | ||||
| function restoreCache(paths, primaryKey, restoreKeys, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         checkPaths(paths); | ||||
|         restoreKeys = restoreKeys || []; | ||||
| @@ -47271,17 +47257,31 @@ function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArch | ||||
|         for (const key of keys) { | ||||
|             checkKey(key); | ||||
|         } | ||||
|         const compressionMethod = yield utils.getCompressionMethod(); | ||||
|         let cacheEntry; | ||||
|         let compressionMethod = yield utils.getCompressionMethod(); | ||||
|         let archivePath = ''; | ||||
|         try { | ||||
|             // path are needed to compute version | ||||
|             const cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                 compressionMethod, | ||||
|                 enableCrossOsArchive | ||||
|             cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                 compressionMethod | ||||
|             }); | ||||
|             if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { | ||||
|                 // Cache not found | ||||
|                 return undefined; | ||||
|                 // This is to support the old cache entry created by gzip on windows. | ||||
|                 if (process.platform === 'win32' && | ||||
|                     compressionMethod !== constants_1.CompressionMethod.Gzip) { | ||||
|                     compressionMethod = constants_1.CompressionMethod.Gzip; | ||||
|                     cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                         compressionMethod | ||||
|                     }); | ||||
|                     if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { | ||||
|                         return undefined; | ||||
|                     } | ||||
|                     core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression."); | ||||
|                 } | ||||
|                 else { | ||||
|                     // Cache not found | ||||
|                     return undefined; | ||||
|                 } | ||||
|             } | ||||
|             archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod)); | ||||
|             core.debug(`Archive Path: ${archivePath}`); | ||||
| @@ -47324,11 +47324,10 @@ exports.restoreCache = restoreCache; | ||||
|  * | ||||
|  * @param paths a list of file paths to be cached | ||||
|  * @param key an explicit key for restoring the cache | ||||
|  * @param enableCrossOsArchive an optional boolean enabled to save cache on windows which could be restored on any platform | ||||
|  * @param options cache upload options | ||||
|  * @returns number returns cacheId if the cache was saved successfully and throws an error if save fails | ||||
|  */ | ||||
| function saveCache(paths, key, options, enableCrossOsArchive = false) { | ||||
| function saveCache(paths, key, options) { | ||||
|     var _a, _b, _c, _d, _e; | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         checkPaths(paths); | ||||
| @@ -47359,7 +47358,6 @@ function saveCache(paths, key, options, enableCrossOsArchive = false) { | ||||
|             core.debug('Reserving Cache'); | ||||
|             const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, { | ||||
|                 compressionMethod, | ||||
|                 enableCrossOsArchive, | ||||
|                 cacheSize: archiveFileSize | ||||
|             }); | ||||
|             if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) { | ||||
| @@ -50494,8 +50492,7 @@ function restoreImpl(stateProvider) { | ||||
|             const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, { | ||||
|                 required: true | ||||
|             }); | ||||
|             const enableCrossOsArchive = utils.getInputAsBool(constants_1.Inputs.EnableCrossOsArchive); | ||||
|             const cacheKey = yield cache.restoreCache(cachePaths, primaryKey, restoreKeys, {}, enableCrossOsArchive); | ||||
|             const cacheKey = yield cache.restoreCache(cachePaths, primaryKey, restoreKeys); | ||||
|             if (!cacheKey) { | ||||
|                 core.info(`Cache not found for input keys: ${[ | ||||
|                     primaryKey, | ||||
|   | ||||
							
								
								
									

69  dist/restore/index.js  vendored

							| @@ -3383,6 +3383,7 @@ const crypto = __importStar(__webpack_require__(417)); | ||||
| const fs = __importStar(__webpack_require__(747)); | ||||
| const url_1 = __webpack_require__(414); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| const downloadUtils_1 = __webpack_require__(251); | ||||
| const options_1 = __webpack_require__(538); | ||||
| const requestUtils_1 = __webpack_require__(899); | ||||
| @@ -3412,17 +3413,10 @@ function createHttpClient() { | ||||
|     const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token); | ||||
|     return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions()); | ||||
| } | ||||
| function getCacheVersion(paths, compressionMethod, enableCrossOsArchive = false) { | ||||
|     const components = paths; | ||||
|     // Add compression method to cache version to restore | ||||
|     // compressed cache as per compression method | ||||
|     if (compressionMethod) { | ||||
|         components.push(compressionMethod); | ||||
|     } | ||||
|     // Only check for windows platforms if enableCrossOsArchive is false | ||||
|     if (process.platform === 'win32' && !enableCrossOsArchive) { | ||||
|         components.push('windows-only'); | ||||
|     } | ||||
| function getCacheVersion(paths, compressionMethod) { | ||||
|     const components = paths.concat(!compressionMethod || compressionMethod === constants_1.CompressionMethod.Gzip | ||||
|         ? [] | ||||
|         : [compressionMethod]); | ||||
|     // Add salt to cache version to support breaking changes in cache entry | ||||
|     components.push(versionSalt); | ||||
|     return crypto | ||||
| @@ -3434,7 +3428,7 @@ exports.getCacheVersion = getCacheVersion; | ||||
| function getCacheEntry(keys, paths, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         const httpClient = createHttpClient(); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod); | ||||
|         const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`; | ||||
|         const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); })); | ||||
|         // Cache not found | ||||
| @@ -3497,7 +3491,7 @@ exports.downloadCache = downloadCache; | ||||
| function reserveCache(key, paths, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         const httpClient = createHttpClient(); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod); | ||||
|         const reserveCacheRequest = { | ||||
|             key, | ||||
|             version, | ||||
| @@ -4977,8 +4971,7 @@ var Inputs; | ||||
|     Inputs["Key"] = "key"; | ||||
|     Inputs["Path"] = "path"; | ||||
|     Inputs["RestoreKeys"] = "restore-keys"; | ||||
|     Inputs["UploadChunkSize"] = "upload-chunk-size"; | ||||
|     Inputs["EnableCrossOsArchive"] = "enableCrossOsArchive"; // Input for cache, restore, save action
 | ||||
|     Inputs["UploadChunkSize"] = "upload-chunk-size"; // Input for cache, save action
 | ||||
| })(Inputs = exports.Inputs || (exports.Inputs = {})); | ||||
| var Outputs; | ||||
| (function (Outputs) { | ||||
| @@ -38137,14 +38130,12 @@ var __importStar = (this && this.__importStar) || function (mod) { | ||||
| }; | ||||
| Object.defineProperty(exports, "__esModule", { value: true }); | ||||
| const exec_1 = __webpack_require__(986); | ||||
| const core_1 = __webpack_require__(470); | ||||
| const io = __importStar(__webpack_require__(1)); | ||||
| const fs_1 = __webpack_require__(747); | ||||
| const path = __importStar(__webpack_require__(622)); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| const IS_WINDOWS = process.platform === 'win32'; | ||||
| core_1.exportVariable('MSYS', 'winsymlinks:nativestrict'); | ||||
| // Returns tar path and type: BSD or GNU | ||||
| function getTarPath() { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
| @@ -38602,7 +38593,7 @@ var __importStar = (this && this.__importStar) || function (mod) { | ||||
|     return result; | ||||
| }; | ||||
| Object.defineProperty(exports, "__esModule", { value: true }); | ||||
| exports.isCacheFeatureAvailable = exports.getInputAsBool = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; | ||||
| exports.isCacheFeatureAvailable = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; | ||||
| const cache = __importStar(__webpack_require__(692)); | ||||
| const core = __importStar(__webpack_require__(470)); | ||||
| const constants_1 = __webpack_require__(196); | ||||
| @@ -38645,11 +38636,6 @@ function getInputAsInt(name, options) { | ||||
|     return value; | ||||
| } | ||||
| exports.getInputAsInt = getInputAsInt; | ||||
| function getInputAsBool(name, options) { | ||||
|     const result = core.getInput(name, options); | ||||
|     return result.toLowerCase() === "true"; | ||||
| } | ||||
| exports.getInputAsBool = getInputAsBool; | ||||
| function isCacheFeatureAvailable() { | ||||
|     if (cache.isFeatureAvailable()) { | ||||
|         return true; | ||||
| @@ -47180,6 +47166,7 @@ const path = __importStar(__webpack_require__(622)); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const cacheHttpClient = __importStar(__webpack_require__(114)); | ||||
| const tar_1 = __webpack_require__(434); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| class ValidationError extends Error { | ||||
|     constructor(message) { | ||||
|         super(message); | ||||
| @@ -47226,10 +47213,9 @@ exports.isFeatureAvailable = isFeatureAvailable; | ||||
|  * @param primaryKey an explicit key for restoring the cache | ||||
|  * @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key | ||||
|  * @param downloadOptions cache download options | ||||
|  * @param enableCrossOsArchive an optional boolean enabled to restore on windows any cache created on any platform | ||||
|  * @returns string returns the key for the cache hit, otherwise returns undefined | ||||
|  */ | ||||
| function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArchive = false) { | ||||
| function restoreCache(paths, primaryKey, restoreKeys, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         checkPaths(paths); | ||||
|         restoreKeys = restoreKeys || []; | ||||
| @@ -47242,17 +47228,31 @@ function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArch | ||||
|         for (const key of keys) { | ||||
|             checkKey(key); | ||||
|         } | ||||
|         const compressionMethod = yield utils.getCompressionMethod(); | ||||
|         let cacheEntry; | ||||
|         let compressionMethod = yield utils.getCompressionMethod(); | ||||
|         let archivePath = ''; | ||||
|         try { | ||||
|             // path are needed to compute version | ||||
|             const cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                 compressionMethod, | ||||
|                 enableCrossOsArchive | ||||
|             cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                 compressionMethod | ||||
|             }); | ||||
|             if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { | ||||
|                 // Cache not found | ||||
|                 return undefined; | ||||
|                 // This is to support the old cache entry created by gzip on windows. | ||||
|                 if (process.platform === 'win32' && | ||||
|                     compressionMethod !== constants_1.CompressionMethod.Gzip) { | ||||
|                     compressionMethod = constants_1.CompressionMethod.Gzip; | ||||
|                     cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                         compressionMethod | ||||
|                     }); | ||||
|                     if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { | ||||
|                         return undefined; | ||||
|                     } | ||||
|                     core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression."); | ||||
|                 } | ||||
|                 else { | ||||
|                     // Cache not found | ||||
|                     return undefined; | ||||
|                 } | ||||
|             } | ||||
|             archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod)); | ||||
|             core.debug(`Archive Path: ${archivePath}`); | ||||
| @@ -47295,11 +47295,10 @@ exports.restoreCache = restoreCache; | ||||
|  * | ||||
|  * @param paths a list of file paths to be cached | ||||
|  * @param key an explicit key for restoring the cache | ||||
|  * @param enableCrossOsArchive an optional boolean enabled to save cache on windows which could be restored on any platform | ||||
|  * @param options cache upload options | ||||
|  * @returns number returns cacheId if the cache was saved successfully and throws an error if save fails | ||||
|  */ | ||||
| function saveCache(paths, key, options, enableCrossOsArchive = false) { | ||||
| function saveCache(paths, key, options) { | ||||
|     var _a, _b, _c, _d, _e; | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         checkPaths(paths); | ||||
| @@ -47330,7 +47329,6 @@ function saveCache(paths, key, options, enableCrossOsArchive = false) { | ||||
|             core.debug('Reserving Cache'); | ||||
|             const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, { | ||||
|                 compressionMethod, | ||||
|                 enableCrossOsArchive, | ||||
|                 cacheSize: archiveFileSize | ||||
|             }); | ||||
|             if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) { | ||||
| @@ -50494,8 +50492,7 @@ function restoreImpl(stateProvider) { | ||||
|             const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, { | ||||
|                 required: true | ||||
|             }); | ||||
|             const enableCrossOsArchive = utils.getInputAsBool(constants_1.Inputs.EnableCrossOsArchive); | ||||
|             const cacheKey = yield cache.restoreCache(cachePaths, primaryKey, restoreKeys, {}, enableCrossOsArchive); | ||||
|             const cacheKey = yield cache.restoreCache(cachePaths, primaryKey, restoreKeys); | ||||
|             if (!cacheKey) { | ||||
|                 core.info(`Cache not found for input keys: ${[ | ||||
|                     primaryKey, | ||||
|   | ||||
							
								
								
									

71  dist/save-only/index.js  vendored

							| @@ -3439,6 +3439,7 @@ const crypto = __importStar(__webpack_require__(417)); | ||||
| const fs = __importStar(__webpack_require__(747)); | ||||
| const url_1 = __webpack_require__(835); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| const downloadUtils_1 = __webpack_require__(251); | ||||
| const options_1 = __webpack_require__(538); | ||||
| const requestUtils_1 = __webpack_require__(899); | ||||
| @@ -3468,17 +3469,10 @@ function createHttpClient() { | ||||
|     const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token); | ||||
|     return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions()); | ||||
| } | ||||
| function getCacheVersion(paths, compressionMethod, enableCrossOsArchive = false) { | ||||
|     const components = paths; | ||||
|     // Add compression method to cache version to restore | ||||
|     // compressed cache as per compression method | ||||
|     if (compressionMethod) { | ||||
|         components.push(compressionMethod); | ||||
|     } | ||||
|     // Only check for windows platforms if enableCrossOsArchive is false | ||||
|     if (process.platform === 'win32' && !enableCrossOsArchive) { | ||||
|         components.push('windows-only'); | ||||
|     } | ||||
| function getCacheVersion(paths, compressionMethod) { | ||||
|     const components = paths.concat(!compressionMethod || compressionMethod === constants_1.CompressionMethod.Gzip | ||||
|         ? [] | ||||
|         : [compressionMethod]); | ||||
|     // Add salt to cache version to support breaking changes in cache entry | ||||
|     components.push(versionSalt); | ||||
|     return crypto | ||||
| @@ -3490,7 +3484,7 @@ exports.getCacheVersion = getCacheVersion; | ||||
| function getCacheEntry(keys, paths, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         const httpClient = createHttpClient(); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod); | ||||
|         const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`; | ||||
|         const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); })); | ||||
|         // Cache not found | ||||
| @@ -3553,7 +3547,7 @@ exports.downloadCache = downloadCache; | ||||
| function reserveCache(key, paths, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         const httpClient = createHttpClient(); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod); | ||||
|         const reserveCacheRequest = { | ||||
|             key, | ||||
|             version, | ||||
| @@ -5033,8 +5027,7 @@ var Inputs; | ||||
|     Inputs["Key"] = "key"; | ||||
|     Inputs["Path"] = "path"; | ||||
|     Inputs["RestoreKeys"] = "restore-keys"; | ||||
|     Inputs["UploadChunkSize"] = "upload-chunk-size"; | ||||
|     Inputs["EnableCrossOsArchive"] = "enableCrossOsArchive"; // Input for cache, restore, save action | ||||
|     Inputs["UploadChunkSize"] = "upload-chunk-size"; // Input for cache, save action | ||||
| })(Inputs = exports.Inputs || (exports.Inputs = {})); | ||||
| var Outputs; | ||||
| (function (Outputs) { | ||||
| @@ -38188,14 +38181,12 @@ var __importStar = (this && this.__importStar) || function (mod) { | ||||
| }; | ||||
| Object.defineProperty(exports, "__esModule", { value: true }); | ||||
| const exec_1 = __webpack_require__(986); | ||||
| const core_1 = __webpack_require__(470); | ||||
| const io = __importStar(__webpack_require__(1)); | ||||
| const fs_1 = __webpack_require__(747); | ||||
| const path = __importStar(__webpack_require__(622)); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| const IS_WINDOWS = process.platform === 'win32'; | ||||
| core_1.exportVariable('MSYS', 'winsymlinks:nativestrict'); | ||||
| // Returns tar path and type: BSD or GNU | ||||
| function getTarPath() { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
| @@ -38653,7 +38644,7 @@ var __importStar = (this && this.__importStar) || function (mod) { | ||||
|     return result; | ||||
| }; | ||||
| Object.defineProperty(exports, "__esModule", { value: true }); | ||||
| exports.isCacheFeatureAvailable = exports.getInputAsBool = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; | ||||
| exports.isCacheFeatureAvailable = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; | ||||
| const cache = __importStar(__webpack_require__(692)); | ||||
| const core = __importStar(__webpack_require__(470)); | ||||
| const constants_1 = __webpack_require__(196); | ||||
| @@ -38696,11 +38687,6 @@ function getInputAsInt(name, options) { | ||||
|     return value; | ||||
| } | ||||
| exports.getInputAsInt = getInputAsInt; | ||||
| function getInputAsBool(name, options) { | ||||
|     const result = core.getInput(name, options); | ||||
|     return result.toLowerCase() === "true"; | ||||
| } | ||||
| exports.getInputAsBool = getInputAsBool; | ||||
| function isCacheFeatureAvailable() { | ||||
|     if (cache.isFeatureAvailable()) { | ||||
|         return true; | ||||
| @@ -41180,8 +41166,9 @@ function saveImpl(stateProvider) { | ||||
|             const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, { | ||||
|                 required: true | ||||
|             }); | ||||
|             const enableCrossOsArchive = utils.getInputAsBool(constants_1.Inputs.EnableCrossOsArchive); | ||||
|             cacheId = yield cache.saveCache(cachePaths, primaryKey, { uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize) }, enableCrossOsArchive); | ||||
|             cacheId = yield cache.saveCache(cachePaths, primaryKey, { | ||||
|                 uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize) | ||||
|             }); | ||||
|             if (cacheId != -1) { | ||||
|                 core.info(`Cache saved with key: ${primaryKey}`); | ||||
|             } | ||||
| @@ -47321,6 +47308,7 @@ const path = __importStar(__webpack_require__(622)); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const cacheHttpClient = __importStar(__webpack_require__(114)); | ||||
| const tar_1 = __webpack_require__(434); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| class ValidationError extends Error { | ||||
|     constructor(message) { | ||||
|         super(message); | ||||
| @@ -47367,10 +47355,9 @@ exports.isFeatureAvailable = isFeatureAvailable; | ||||
|  * @param primaryKey an explicit key for restoring the cache | ||||
|  * @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key | ||||
|  * @param downloadOptions cache download options | ||||
|  * @param enableCrossOsArchive an optional boolean enabled to restore on windows any cache created on any platform | ||||
|  * @returns string returns the key for the cache hit, otherwise returns undefined | ||||
|  */ | ||||
| function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArchive = false) { | ||||
| function restoreCache(paths, primaryKey, restoreKeys, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         checkPaths(paths); | ||||
|         restoreKeys = restoreKeys || []; | ||||
| @@ -47383,17 +47370,31 @@ function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArch | ||||
|         for (const key of keys) { | ||||
|             checkKey(key); | ||||
|         } | ||||
|         const compressionMethod = yield utils.getCompressionMethod(); | ||||
|         let cacheEntry; | ||||
|         let compressionMethod = yield utils.getCompressionMethod(); | ||||
|         let archivePath = ''; | ||||
|         try { | ||||
|             // path are needed to compute version | ||||
|             const cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                 compressionMethod, | ||||
|                 enableCrossOsArchive | ||||
|             cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                 compressionMethod | ||||
|             }); | ||||
|             if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { | ||||
|                 // Cache not found | ||||
|                 return undefined; | ||||
|                 // This is to support the old cache entry created by gzip on windows. | ||||
|                 if (process.platform === 'win32' && | ||||
|                     compressionMethod !== constants_1.CompressionMethod.Gzip) { | ||||
|                     compressionMethod = constants_1.CompressionMethod.Gzip; | ||||
|                     cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                         compressionMethod | ||||
|                     }); | ||||
|                     if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { | ||||
|                         return undefined; | ||||
|                     } | ||||
|                     core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression."); | ||||
|                 } | ||||
|                 else { | ||||
|                     // Cache not found | ||||
|                     return undefined; | ||||
|                 } | ||||
|             } | ||||
|             archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod)); | ||||
|             core.debug(`Archive Path: ${archivePath}`); | ||||
| @@ -47436,11 +47437,10 @@ exports.restoreCache = restoreCache; | ||||
|  * | ||||
|  * @param paths a list of file paths to be cached | ||||
|  * @param key an explicit key for restoring the cache | ||||
|  * @param enableCrossOsArchive an optional boolean enabled to save cache on windows which could be restored on any platform | ||||
|  * @param options cache upload options | ||||
|  * @returns number returns cacheId if the cache was saved successfully and throws an error if save fails | ||||
|  */ | ||||
| function saveCache(paths, key, options, enableCrossOsArchive = false) { | ||||
| function saveCache(paths, key, options) { | ||||
|     var _a, _b, _c, _d, _e; | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         checkPaths(paths); | ||||
| @@ -47471,7 +47471,6 @@ function saveCache(paths, key, options, enableCrossOsArchive = false) { | ||||
|             core.debug('Reserving Cache'); | ||||
|             const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, { | ||||
|                 compressionMethod, | ||||
|                 enableCrossOsArchive, | ||||
|                 cacheSize: archiveFileSize | ||||
|             }); | ||||
|             if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) { | ||||
|   | ||||
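The `getCacheVersion` hunk at the top of this file is the core of the revert: the cache version goes back to being derived only from the cached paths, the compression method (when it is not gzip), and the internal version salt, so the `windows-only` marker and the `enableCrossOsArchive` flag no longer influence the hash. As a rough illustration only (not the vendored code itself; the salt value and the `"gzip"` literal stand in for constants defined elsewhere in the bundle), the two variants compare like this:

```typescript
import * as crypto from "crypto";

// Placeholder for the versionSalt constant defined in the vendored module.
const versionSalt = "1.0";

// Reverted behaviour: paths + non-gzip compression method + salt feed the hash.
function getCacheVersion(paths: string[], compressionMethod?: string): string {
    const components = paths.concat(
        !compressionMethod || compressionMethod === "gzip" ? [] : [compressionMethod]
    );
    components.push(versionSalt);
    return crypto.createHash("sha256").update(components.join("|")).digest("hex");
}

// Removed behaviour: a "windows-only" marker was also mixed in unless
// cross-OS archives were explicitly enabled, so the same key produced
// different versions on Windows and Linux.
function getCacheVersionCrossOs(
    paths: string[],
    compressionMethod?: string,
    enableCrossOsArchive = false
): string {
    const components = [...paths];
    if (compressionMethod) {
        components.push(compressionMethod);
    }
    if (process.platform === "win32" && !enableCrossOsArchive) {
        components.push("windows-only");
    }
    components.push(versionSalt);
    return crypto.createHash("sha256").update(components.join("|")).digest("hex");
}
```

Because the version participates in cache lookup, entries written by the two variants are not interchangeable even for identical keys.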
							
								
								
									
dist/save/index.js (vendored): 71 changed lines
							| @@ -3383,6 +3383,7 @@ const crypto = __importStar(__webpack_require__(417)); | ||||
| const fs = __importStar(__webpack_require__(747)); | ||||
| const url_1 = __webpack_require__(835); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| const downloadUtils_1 = __webpack_require__(251); | ||||
| const options_1 = __webpack_require__(538); | ||||
| const requestUtils_1 = __webpack_require__(899); | ||||
| @@ -3412,17 +3413,10 @@ function createHttpClient() { | ||||
|     const bearerCredentialHandler = new auth_1.BearerCredentialHandler(token); | ||||
|     return new http_client_1.HttpClient('actions/cache', [bearerCredentialHandler], getRequestOptions()); | ||||
| } | ||||
| function getCacheVersion(paths, compressionMethod, enableCrossOsArchive = false) { | ||||
|     const components = paths; | ||||
|     // Add compression method to cache version to restore | ||||
|     // compressed cache as per compression method | ||||
|     if (compressionMethod) { | ||||
|         components.push(compressionMethod); | ||||
|     } | ||||
|     // Only check for windows platforms if enableCrossOsArchive is false | ||||
|     if (process.platform === 'win32' && !enableCrossOsArchive) { | ||||
|         components.push('windows-only'); | ||||
|     } | ||||
| function getCacheVersion(paths, compressionMethod) { | ||||
|     const components = paths.concat(!compressionMethod || compressionMethod === constants_1.CompressionMethod.Gzip | ||||
|         ? [] | ||||
|         : [compressionMethod]); | ||||
|     // Add salt to cache version to support breaking changes in cache entry | ||||
|     components.push(versionSalt); | ||||
|     return crypto | ||||
| @@ -3434,7 +3428,7 @@ exports.getCacheVersion = getCacheVersion; | ||||
| function getCacheEntry(keys, paths, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         const httpClient = createHttpClient(); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod); | ||||
|         const resource = `cache?keys=${encodeURIComponent(keys.join(','))}&version=${version}`; | ||||
|         const response = yield requestUtils_1.retryTypedResponse('getCacheEntry', () => __awaiter(this, void 0, void 0, function* () { return httpClient.getJson(getCacheApiUrl(resource)); })); | ||||
|         // Cache not found | ||||
| @@ -3497,7 +3491,7 @@ exports.downloadCache = downloadCache; | ||||
| function reserveCache(key, paths, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         const httpClient = createHttpClient(); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod, options === null || options === void 0 ? void 0 : options.enableCrossOsArchive); | ||||
|         const version = getCacheVersion(paths, options === null || options === void 0 ? void 0 : options.compressionMethod); | ||||
|         const reserveCacheRequest = { | ||||
|             key, | ||||
|             version, | ||||
| @@ -4977,8 +4971,7 @@ var Inputs; | ||||
|     Inputs["Key"] = "key"; | ||||
|     Inputs["Path"] = "path"; | ||||
|     Inputs["RestoreKeys"] = "restore-keys"; | ||||
|     Inputs["UploadChunkSize"] = "upload-chunk-size"; | ||||
|     Inputs["EnableCrossOsArchive"] = "enableCrossOsArchive"; // Input for cache, restore, save action | ||||
|     Inputs["UploadChunkSize"] = "upload-chunk-size"; // Input for cache, save action | ||||
| })(Inputs = exports.Inputs || (exports.Inputs = {})); | ||||
| var Outputs; | ||||
| (function (Outputs) { | ||||
| @@ -38132,14 +38125,12 @@ var __importStar = (this && this.__importStar) || function (mod) { | ||||
| }; | ||||
| Object.defineProperty(exports, "__esModule", { value: true }); | ||||
| const exec_1 = __webpack_require__(986); | ||||
| const core_1 = __webpack_require__(470); | ||||
| const io = __importStar(__webpack_require__(1)); | ||||
| const fs_1 = __webpack_require__(747); | ||||
| const path = __importStar(__webpack_require__(622)); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| const IS_WINDOWS = process.platform === 'win32'; | ||||
| core_1.exportVariable('MSYS', 'winsymlinks:nativestrict'); | ||||
| // Returns tar path and type: BSD or GNU | ||||
| function getTarPath() { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
| @@ -38597,7 +38588,7 @@ var __importStar = (this && this.__importStar) || function (mod) { | ||||
|     return result; | ||||
| }; | ||||
| Object.defineProperty(exports, "__esModule", { value: true }); | ||||
| exports.isCacheFeatureAvailable = exports.getInputAsBool = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; | ||||
| exports.isCacheFeatureAvailable = exports.getInputAsInt = exports.getInputAsArray = exports.isValidEvent = exports.logWarning = exports.isExactKeyMatch = exports.isGhes = void 0; | ||||
| const cache = __importStar(__webpack_require__(692)); | ||||
| const core = __importStar(__webpack_require__(470)); | ||||
| const constants_1 = __webpack_require__(196); | ||||
| @@ -38640,11 +38631,6 @@ function getInputAsInt(name, options) { | ||||
|     return value; | ||||
| } | ||||
| exports.getInputAsInt = getInputAsInt; | ||||
| function getInputAsBool(name, options) { | ||||
|     const result = core.getInput(name, options); | ||||
|     return result.toLowerCase() === "true"; | ||||
| } | ||||
| exports.getInputAsBool = getInputAsBool; | ||||
| function isCacheFeatureAvailable() { | ||||
|     if (cache.isFeatureAvailable()) { | ||||
|         return true; | ||||
| @@ -41124,8 +41110,9 @@ function saveImpl(stateProvider) { | ||||
|             const cachePaths = utils.getInputAsArray(constants_1.Inputs.Path, { | ||||
|                 required: true | ||||
|             }); | ||||
|             const enableCrossOsArchive = utils.getInputAsBool(constants_1.Inputs.EnableCrossOsArchive); | ||||
|             cacheId = yield cache.saveCache(cachePaths, primaryKey, { uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize) }, enableCrossOsArchive); | ||||
|             cacheId = yield cache.saveCache(cachePaths, primaryKey, { | ||||
|                 uploadChunkSize: utils.getInputAsInt(constants_1.Inputs.UploadChunkSize) | ||||
|             }); | ||||
|             if (cacheId != -1) { | ||||
|                 core.info(`Cache saved with key: ${primaryKey}`); | ||||
|             } | ||||
| @@ -47294,6 +47281,7 @@ const path = __importStar(__webpack_require__(622)); | ||||
| const utils = __importStar(__webpack_require__(15)); | ||||
| const cacheHttpClient = __importStar(__webpack_require__(114)); | ||||
| const tar_1 = __webpack_require__(434); | ||||
| const constants_1 = __webpack_require__(931); | ||||
| class ValidationError extends Error { | ||||
|     constructor(message) { | ||||
|         super(message); | ||||
| @@ -47340,10 +47328,9 @@ exports.isFeatureAvailable = isFeatureAvailable; | ||||
|  * @param primaryKey an explicit key for restoring the cache | ||||
|  * @param restoreKeys an optional ordered list of keys to use for restoring the cache if no cache hit occurred for key | ||||
|  * @param downloadOptions cache download options | ||||
|  * @param enableCrossOsArchive an optional boolean enabled to restore on windows any cache created on any platform | ||||
|  * @returns string returns the key for the cache hit, otherwise returns undefined | ||||
|  */ | ||||
| function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArchive = false) { | ||||
| function restoreCache(paths, primaryKey, restoreKeys, options) { | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         checkPaths(paths); | ||||
|         restoreKeys = restoreKeys || []; | ||||
| @@ -47356,17 +47343,31 @@ function restoreCache(paths, primaryKey, restoreKeys, options, enableCrossOsArch | ||||
|         for (const key of keys) { | ||||
|             checkKey(key); | ||||
|         } | ||||
|         const compressionMethod = yield utils.getCompressionMethod(); | ||||
|         let cacheEntry; | ||||
|         let compressionMethod = yield utils.getCompressionMethod(); | ||||
|         let archivePath = ''; | ||||
|         try { | ||||
|             // path are needed to compute version | ||||
|             const cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                 compressionMethod, | ||||
|                 enableCrossOsArchive | ||||
|             cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                 compressionMethod | ||||
|             }); | ||||
|             if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { | ||||
|                 // Cache not found | ||||
|                 return undefined; | ||||
|                 // This is to support the old cache entry created by gzip on windows. | ||||
|                 if (process.platform === 'win32' && | ||||
|                     compressionMethod !== constants_1.CompressionMethod.Gzip) { | ||||
|                     compressionMethod = constants_1.CompressionMethod.Gzip; | ||||
|                     cacheEntry = yield cacheHttpClient.getCacheEntry(keys, paths, { | ||||
|                         compressionMethod | ||||
|                     }); | ||||
|                     if (!(cacheEntry === null || cacheEntry === void 0 ? void 0 : cacheEntry.archiveLocation)) { | ||||
|                         return undefined; | ||||
|                     } | ||||
|                     core.info("Couldn't find cache entry with zstd compression, falling back to gzip compression."); | ||||
|                 } | ||||
|                 else { | ||||
|                     // Cache not found | ||||
|                     return undefined; | ||||
|                 } | ||||
|             } | ||||
|             archivePath = path.join(yield utils.createTempDirectory(), utils.getCacheFileName(compressionMethod)); | ||||
|             core.debug(`Archive Path: ${archivePath}`); | ||||
| @@ -47409,11 +47410,10 @@ exports.restoreCache = restoreCache; | ||||
|  * | ||||
|  * @param paths a list of file paths to be cached | ||||
|  * @param key an explicit key for restoring the cache | ||||
|  * @param enableCrossOsArchive an optional boolean enabled to save cache on windows which could be restored on any platform | ||||
|  * @param options cache upload options | ||||
|  * @returns number returns cacheId if the cache was saved successfully and throws an error if save fails | ||||
|  */ | ||||
| function saveCache(paths, key, options, enableCrossOsArchive = false) { | ||||
| function saveCache(paths, key, options) { | ||||
|     var _a, _b, _c, _d, _e; | ||||
|     return __awaiter(this, void 0, void 0, function* () { | ||||
|         checkPaths(paths); | ||||
| @@ -47444,7 +47444,6 @@ function saveCache(paths, key, options, enableCrossOsArchive = false) { | ||||
|             core.debug('Reserving Cache'); | ||||
|             const reserveCacheResponse = yield cacheHttpClient.reserveCache(key, paths, { | ||||
|                 compressionMethod, | ||||
|                 enableCrossOsArchive, | ||||
|                 cacheSize: archiveFileSize | ||||
|             }); | ||||
|             if ((_a = reserveCacheResponse === null || reserveCacheResponse === void 0 ? void 0 : reserveCacheResponse.result) === null || _a === void 0 ? void 0 : _a.cacheId) { | ||||
|   | ||||
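Both vendored bundles also gain the Windows gzip fallback shown in the `restoreCache` hunk above: if no entry exists for the zstd-based version, the lookup is retried once with gzip so caches created before the zstd switch remain restorable. A self-contained sketch of just that control flow (the `lookup` callback stands in for the real `cacheHttpClient.getCacheEntry` call, and the `"gzip"` literal for `CompressionMethod.Gzip`):

```typescript
interface CacheEntry {
    archiveLocation?: string;
}

type Lookup = (compressionMethod: string) => Promise<CacheEntry | null>;

// Mirrors the fallback shape only; error handling and download are omitted.
async function findEntryWithGzipFallback(
    lookup: Lookup,
    compressionMethod: string
): Promise<{ entry: CacheEntry; compressionMethod: string } | undefined> {
    let entry = await lookup(compressionMethod);
    if (entry?.archiveLocation) {
        return { entry, compressionMethod };
    }
    // Old Windows caches were written with gzip; retry once before reporting a miss.
    if (process.platform === "win32" && compressionMethod !== "gzip") {
        compressionMethod = "gzip";
        entry = await lookup(compressionMethod);
        if (entry?.archiveLocation) {
            return { entry, compressionMethod };
        }
    }
    return undefined; // cache not found under either compression method
}
```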
							
								
								
									
examples.md: 16 changed lines
							| @@ -38,7 +38,6 @@ | ||||
| - [Swift, Objective-C - Carthage](#swift-objective-c---carthage) | ||||
| - [Swift, Objective-C - CocoaPods](#swift-objective-c---cocoapods) | ||||
| - [Swift - Swift Package Manager](#swift---swift-package-manager) | ||||
| - [Swift - Mint](#swift---mint) | ||||
|  | ||||
| ## C# - NuGet | ||||
|  | ||||
| @@ -642,18 +641,3 @@ whenever possible: | ||||
|     restore-keys: | | ||||
|       ${{ runner.os }}-spm- | ||||
| ``` | ||||
|  | ||||
| ## Swift - Mint | ||||
|  | ||||
| ```yaml | ||||
| env: | ||||
|   MINT_PATH: .mint/lib | ||||
|   MINT_LINK_PATH: .mint/bin | ||||
| steps: | ||||
|   - uses: actions/cache@v3 | ||||
|     with: | ||||
|       path: .mint | ||||
|       key: ${{ runner.os }}-mint-${{ hashFiles('**/Mintfile') }} | ||||
|       restore-keys: | | ||||
|         ${{ runner.os }}-mint- | ||||
| ``` | ||||
|   | ||||
							
								
								
									
package-lock.json (generated): 18 changed lines
							| @@ -1,15 +1,15 @@ | ||||
| { | ||||
|   "name": "cache", | ||||
|   "version": "3.2.2", | ||||
|   "version": "3.2.1", | ||||
|   "lockfileVersion": 2, | ||||
|   "requires": true, | ||||
|   "packages": { | ||||
|     "": { | ||||
|       "name": "cache", | ||||
|       "version": "3.2.2", | ||||
|       "version": "3.2.1", | ||||
|       "license": "MIT", | ||||
|       "dependencies": { | ||||
|         "@actions/cache": "^3.1.2", | ||||
|         "@actions/cache": "^3.1.0", | ||||
|         "@actions/core": "^1.10.0", | ||||
|         "@actions/exec": "^1.1.1", | ||||
|         "@actions/io": "^1.1.2" | ||||
| @@ -36,9 +36,9 @@ | ||||
|       } | ||||
|     }, | ||||
|     "node_modules/@actions/cache": { | ||||
|       "version": "3.1.2", | ||||
|       "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.2.tgz", | ||||
|       "integrity": "sha512-3XeKcXIonfIbqvW7gPm/VLOhv1RHQ1dtTgSBCH6OUhCgSTii9bEVgu0PIms7UbLnXeMCKFzECfpbud8fJEvBbQ==", | ||||
|       "version": "3.1.0", | ||||
|       "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0.tgz", | ||||
|       "integrity": "sha512-wKGJkpK3uFTgwy+KA0fxz0H3/ZPymdi0IlyhMmyoMeWd+CIv8xVPWdGlrPDDdN9bFgve2yvEPZVaKRb43Uwtyg==", | ||||
|       "dependencies": { | ||||
|         "@actions/core": "^1.10.0", | ||||
|         "@actions/exec": "^1.0.1", | ||||
| @@ -9722,9 +9722,9 @@ | ||||
|   }, | ||||
|   "dependencies": { | ||||
|     "@actions/cache": { | ||||
|       "version": "3.1.2", | ||||
|       "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.2.tgz", | ||||
|       "integrity": "sha512-3XeKcXIonfIbqvW7gPm/VLOhv1RHQ1dtTgSBCH6OUhCgSTii9bEVgu0PIms7UbLnXeMCKFzECfpbud8fJEvBbQ==", | ||||
|       "version": "3.1.0", | ||||
|       "resolved": "https://registry.npmjs.org/@actions/cache/-/cache-3.1.0.tgz", | ||||
|       "integrity": "sha512-wKGJkpK3uFTgwy+KA0fxz0H3/ZPymdi0IlyhMmyoMeWd+CIv8xVPWdGlrPDDdN9bFgve2yvEPZVaKRb43Uwtyg==", | ||||
|       "requires": { | ||||
|         "@actions/core": "^1.10.0", | ||||
|         "@actions/exec": "^1.0.1", | ||||
|   | ||||
| @@ -1,6 +1,6 @@ | ||||
| { | ||||
|   "name": "cache", | ||||
|   "version": "3.2.2", | ||||
|   "version": "3.2.1", | ||||
|   "private": true, | ||||
|   "description": "Cache dependencies and build outputs", | ||||
|   "main": "dist/restore/index.js", | ||||
| @@ -23,7 +23,7 @@ | ||||
|   "author": "GitHub", | ||||
|   "license": "MIT", | ||||
|   "dependencies": { | ||||
|     "@actions/cache": "^3.1.2", | ||||
|     "@actions/cache": "^3.1.0", | ||||
|     "@actions/core": "^1.10.0", | ||||
|     "@actions/exec": "^1.1.1", | ||||
|     "@actions/io": "^1.1.2" | ||||
|   | ||||
| @@ -120,7 +120,7 @@ steps: | ||||
|  | ||||
| #### Reusing primary key and restored key in the save action | ||||
|  | ||||
| Usually you may want to use same `key` in both `actions/cache/restore` and `actions/cache/save` action. To achieve this, use `outputs` from the restore action to reuse the same primary key (or the key of the cache that was restored). | ||||
| Usually you may want to use same `key` in both actions/cache/restore` and `actions/cache/save` action. To achieve this, use `outputs` from the restore action to reuse the same primary key (or the key of the cache that was restored). | ||||
|  | ||||
| #### Using restore action outputs to make save action behave just like the cache action | ||||
|  | ||||
|   | ||||
| @@ -11,10 +11,6 @@ inputs: | ||||
|   restore-keys: | ||||
|     description: 'An ordered list of keys to use for restoring stale cache if no cache hit occurred for key. Note `cache-hit` returns false in this case.' | ||||
|     required: false | ||||
|   enableCrossOsArchive: | ||||
|     description: 'An optional boolean when enabled, allows windows runners to restore caches that were saved on other platforms' | ||||
|     default: 'false' | ||||
|     required: false | ||||
| outputs: | ||||
|   cache-hit: | ||||
|     description: 'A boolean value to indicate an exact match was found for the primary key' | ||||
|   | ||||
| @@ -54,7 +54,7 @@ Case 1: Where an user would want to reuse the key as it is | ||||
| ```yaml | ||||
| uses: actions/cache/save@v3 | ||||
| with: | ||||
|     key: ${{ steps.restore-cache.outputs.key }} | ||||
|     key: steps.restore-cache.output.key | ||||
| ``` | ||||
|  | ||||
| Case 2: Where the user would want to re-evaluate the key | ||||
|   | ||||
| @@ -11,10 +11,6 @@ inputs: | ||||
|   upload-chunk-size: | ||||
|     description: 'The chunk size used to split up large files during upload, in bytes' | ||||
|     required: false | ||||
|   enableCrossOsArchive: | ||||
|     description: 'An optional boolean when enabled, allows windows runners to save caches that can be restored on other platforms' | ||||
|     default: 'false' | ||||
|     required: false | ||||
| runs: | ||||
|   using: 'node16' | ||||
|   main: '../dist/save-only/index.js' | ||||
|   | ||||
| @@ -2,8 +2,7 @@ export enum Inputs { | ||||
|     Key = "key", // Input for cache, restore, save action | ||||
|     Path = "path", // Input for cache, restore, save action | ||||
|     RestoreKeys = "restore-keys", // Input for cache, restore action | ||||
|     UploadChunkSize = "upload-chunk-size", // Input for cache, save action | ||||
|     EnableCrossOsArchive = "enableCrossOsArchive" // Input for cache, restore, save action | ||||
|     UploadChunkSize = "upload-chunk-size" // Input for cache, save action | ||||
| } | ||||
|  | ||||
| export enum Outputs { | ||||
|   | ||||
| @@ -31,16 +31,11 @@ async function restoreImpl( | ||||
|         const cachePaths = utils.getInputAsArray(Inputs.Path, { | ||||
|             required: true | ||||
|         }); | ||||
|         const enableCrossOsArchive = utils.getInputAsBool( | ||||
|             Inputs.EnableCrossOsArchive | ||||
|         ); | ||||
|  | ||||
|         const cacheKey = await cache.restoreCache( | ||||
|             cachePaths, | ||||
|             primaryKey, | ||||
|             restoreKeys, | ||||
|             {}, | ||||
|             enableCrossOsArchive | ||||
|             restoreKeys | ||||
|         ); | ||||
|  | ||||
|         if (!cacheKey) { | ||||
|   | ||||
| @@ -52,16 +52,9 @@ async function saveImpl(stateProvider: IStateProvider): Promise<number | void> { | ||||
|             required: true | ||||
|         }); | ||||
|  | ||||
|         const enableCrossOsArchive = utils.getInputAsBool( | ||||
|             Inputs.EnableCrossOsArchive | ||||
|         ); | ||||
|  | ||||
|         cacheId = await cache.saveCache( | ||||
|             cachePaths, | ||||
|             primaryKey, | ||||
|             { uploadChunkSize: utils.getInputAsInt(Inputs.UploadChunkSize) }, | ||||
|             enableCrossOsArchive | ||||
|         ); | ||||
|         cacheId = await cache.saveCache(cachePaths, primaryKey, { | ||||
|             uploadChunkSize: utils.getInputAsInt(Inputs.UploadChunkSize) | ||||
|         }); | ||||
|  | ||||
|         if (cacheId != -1) { | ||||
|             core.info(`Cache saved with key: ${primaryKey}`); | ||||
|   | ||||
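In the TypeScript sources, restoreImpl.ts and saveImpl.ts now call `@actions/cache` with the plain 3.1.x signatures again, with no trailing `enableCrossOsArchive` argument. A minimal consumer-side sketch of those reverted entry points (paths, keys, and the chunk size below are illustrative values, not taken from the action):

```typescript
import * as cache from "@actions/cache";

// Illustrative values; the real action reads these from workflow inputs.
const paths = ["node_modules"];
const primaryKey = "npm-cache-abc123";
const restoreKeys = ["npm-cache-"];

async function run(): Promise<void> {
    // Reverted restore call: paths, primary key, restore keys, optional options.
    const hitKey = await cache.restoreCache(paths, primaryKey, restoreKeys);

    if (!hitKey) {
        // ...install dependencies here, then save under the primary key.
        const cacheId = await cache.saveCache(paths, primaryKey, {
            uploadChunkSize: 32 * 1024 * 1024 // optional upload option
        });
        if (cacheId !== -1) {
            console.log(`Cache saved with key: ${primaryKey}`);
        }
    }
}

run().catch(err => {
    console.error(err);
    process.exit(1);
});
```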
| @@ -52,14 +52,6 @@ export function getInputAsInt( | ||||
|     return value; | ||||
| } | ||||
|  | ||||
| export function getInputAsBool( | ||||
|     name: string, | ||||
|     options?: core.InputOptions | ||||
| ): boolean { | ||||
|     const result = core.getInput(name, options); | ||||
|     return result.toLowerCase() === "true"; | ||||
| } | ||||
|  | ||||
| export function isCacheFeatureAvailable(): boolean { | ||||
|     if (cache.isFeatureAvailable()) { | ||||
|         return true; | ||||
|   | ||||
| @@ -13,7 +13,6 @@ interface CacheInput { | ||||
|     path: string; | ||||
|     key: string; | ||||
|     restoreKeys?: string[]; | ||||
|     enableCrossOsArchive?: boolean; | ||||
| } | ||||
|  | ||||
| export function setInputs(input: CacheInput): void { | ||||
| @@ -21,11 +20,6 @@ export function setInputs(input: CacheInput): void { | ||||
|     setInput(Inputs.Key, input.key); | ||||
|     input.restoreKeys && | ||||
|         setInput(Inputs.RestoreKeys, input.restoreKeys.join("\n")); | ||||
|     input.enableCrossOsArchive !== undefined && | ||||
|         setInput( | ||||
|             Inputs.EnableCrossOsArchive, | ||||
|             input.enableCrossOsArchive.toString() | ||||
|         ); | ||||
| } | ||||
|  | ||||
| export function clearInputs(): void { | ||||
| @@ -33,5 +27,4 @@ export function clearInputs(): void { | ||||
|     delete process.env[getInputName(Inputs.Key)]; | ||||
|     delete process.env[getInputName(Inputs.RestoreKeys)]; | ||||
|     delete process.env[getInputName(Inputs.UploadChunkSize)]; | ||||
|     delete process.env[getInputName(Inputs.EnableCrossOsArchive)]; | ||||
| } | ||||
|   | ||||
| @@ -19,24 +19,6 @@ A cache today is immutable and cannot be updated. But some use cases require the | ||||
| ## Use cache across feature branches | ||||
| Reusing cache across feature branches is not allowed today to provide cache [isolation](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache). However if both feature branches are from the default branch, a good way to achieve this is to ensure that the default branch has a cache. This cache will then be consumable by both feature branches. | ||||
|  | ||||
| ## Improving cache restore performance on Windows/Using cross-os caching | ||||
| Currently, cache restore is slow on Windows due to tar being inherently slow and the compression algorithm `gzip` in use. `zstd` is the default algorithm in use on linux and macos. It was disabled on Windows due to issues with bsd tar(libarchive), the tar implementation in use on Windows.  | ||||
|  | ||||
| To improve cache restore performance, we can re-enable `zstd` as the compression algorithm using the following workaround. Add the following step to your workflow before the cache step: | ||||
|  | ||||
| ```yaml | ||||
|     - if: ${{ runner.os == 'Windows' }} | ||||
|       name: Use GNU tar | ||||
|       shell: cmd | ||||
|       run: | | ||||
|         echo "Adding GNU tar to PATH" | ||||
|         echo C:\Program Files\Git\usr\bin>>"%GITHUB_PATH%" | ||||
| ``` | ||||
|  | ||||
| The `cache` action will use GNU tar instead of bsd tar on Windows. This should work on all Github Hosted runners as it is. For self-hosted runners, please ensure you have GNU tar and `zstd` installed. | ||||
|  | ||||
| The above workaround is also needed if you wish to use cross-os caching since difference of compression algorithms will result in different cache versions for the same cache key. So the above workaround will ensure `zstd` is used for caching on all platforms thus resulting in the same cache version for the same cache key. | ||||
|  | ||||
| ## Force deletion of caches overriding default cache eviction policy | ||||
| Caches have [branch scope restriction](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#restrictions-for-accessing-a-cache) in place. This means that if caches for a specific branch are using a lot of storage quota, it may result into more frequently used caches from `default` branch getting thrashed. For example, if there are many pull requests happening on a repo and are creating caches, these cannot be used in default branch scope but will still occupy a lot of space till they get cleaned up by [eviction policy](https://docs.github.com/en/actions/using-workflows/caching-dependencies-to-speed-up-workflows#usage-limits-and-eviction-policy). But sometime we want to clean them up on a faster cadence so as to ensure default branch is not thrashing. In order to achieve this, [gh-actions-cache cli](https://github.com/actions/gh-actions-cache/) can be used to delete caches for specific branches. | ||||
|  | ||||
|   | ||||
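The surviving tip on force-deleting caches points at the gh-actions-cache CLI; as an alternative sketch (not from this repository), the same cleanup can be scripted against what I understand to be GitHub's Actions cache-management REST endpoints, here via @octokit/rest with placeholder owner, repo, and branch values:

```typescript
import { Octokit } from "@octokit/rest";

// Hedged sketch: assumes the GET/DELETE /repos/{owner}/{repo}/actions/caches
// endpoints and a token with permission to manage Actions caches.
async function deleteCachesForBranch(
    token: string,
    owner: string,
    repo: string,
    branch: string
): Promise<void> {
    const octokit = new Octokit({ auth: token });
    const ref = `refs/heads/${branch}`;

    // List cache entries scoped to the branch ref.
    const { data } = await octokit.request(
        "GET /repos/{owner}/{repo}/actions/caches",
        { owner, repo, ref }
    );

    // Delete each entry by id so the branch stops consuming cache quota.
    for (const entry of data.actions_caches) {
        if (entry.id !== undefined) {
            await octokit.request(
                "DELETE /repos/{owner}/{repo}/actions/caches/{cache_id}",
                { owner, repo, cache_id: entry.id }
            );
        }
    }
}

// Example call with placeholder values.
deleteCachesForBranch(process.env.GITHUB_TOKEN ?? "", "my-org", "my-repo", "my-feature-branch")
    .catch(err => {
        console.error(err);
        process.exit(1);
    });
```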