Compare commits

users/ashw ... v3_fix

12 Commits
| Author | SHA1 | Date |
|---|---|---|
|  | f5b55ae568 |  |
|  | 2387ae8f10 |  |
|  | 875b52c705 |  |
|  | 97ba7aa16d |  |
|  | 81375c4b58 |  |
|  | 9716b96acd |  |
|  | eeff95f289 |  |
|  | 8584e116c7 |  |
|  | 0c44ddbd87 |  |
|  | 8de84b9d78 |  |
|  | 128caf4219 |  |
|  | 2d8d0d1c9b |  |

README.md (10 changed lines)

@@ -9,7 +9,11 @@ This action allows caching dependencies and build outputs to improve workflow ex
 See ["Caching dependencies to speed up workflows"](https://help.github.com/github/automating-your-workflow-with-github-actions/caching-dependencies-to-speed-up-workflows).
 
 ## What's New
+### v3
+* Updated the minimum runner version support from node 12 -> node 16.
+
+### v2
 * Increased the cache size limit to 10 GB.
 * Added support for multiple paths, [glob patterns](https://github.com/actions/toolkit/tree/main/packages/glob), and single file caches.
 
 ```yaml
@@ -177,6 +181,12 @@ steps:
 
 Since GitHub Enterprise Server uses self-hosted runners, dependencies are typically cached on the runner by whatever dependency management tool is being used (npm, maven, etc.). This eliminates the need for explicit caching in some scenarios.
 
+## Changelog schedule and history
+
+| Status | Version | Date | Highlights |
+|:---|:---|:---|:---|
+| Published | v3.0.0 | Mar 21st, 2022 | - Updated minimum runner version support from node 12 -> node 16 <br> |
+
 ## Contributing
 We would love for you to contribute to `actions/cache`, pull requests are welcome! Please see the [CONTRIBUTING.md](CONTRIBUTING.md) for more information.

dist/restore/index.js (3 changed lines, vendored)

@@ -5519,7 +5519,8 @@ function downloadCacheStorageSDK(archiveLocation, archivePath, options) {
             //
             // If the file exceeds the buffer maximum length (~1 GB on 32-bit systems and ~2 GB
             // on 64-bit systems), split the download into multiple segments
-            const maxSegmentSize = buffer.constants.MAX_LENGTH;
+            // ~2 GB = 2147483647, beyond this, we start getting out of range error. So, capping it accordingly.
+            const maxSegmentSize = Math.min(2147483647, buffer.constants.MAX_LENGTH);
             const downloadProgress = new DownloadProgress(contentLength);
             const fd = fs.openSync(archivePath, 'w');
             try {
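
On node 16 runners, `buffer.constants.MAX_LENGTH` can exceed 2^31 - 1 bytes, so using it directly as the segment size triggers the out-of-range error mentioned in the added comment. Below is a minimal sketch of how the cap behaves; it is not the vendored code, and the constant name `TWO_GB` and the logged values are illustrative only:

```js
// Minimal sketch of the capping behaviour above (not the vendored code).
// The exact value of buffer.constants.MAX_LENGTH depends on the Node version
// and architecture; the 64-bit / node 16 figures below are assumptions.
const buffer = require('buffer');

// 2^31 - 1 bytes: larger segment sizes hit the "out of range" error
// described in the commit comment.
const TWO_GB = 2147483647;

// On 64-bit node 16, MAX_LENGTH can be larger than 2^31 - 1, so the cap matters;
// on 32-bit builds MAX_LENGTH is already below the cap and nothing changes.
const maxSegmentSize = Math.min(TWO_GB, buffer.constants.MAX_LENGTH);

console.log(`MAX_LENGTH:     ${buffer.constants.MAX_LENGTH}`);
console.log(`maxSegmentSize: ${maxSegmentSize}`); // never larger than 2147483647
```

Using `Math.min` keeps the existing behaviour on platforms where `MAX_LENGTH` is already below the cap, while bounding the segment size everywhere else.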

dist/save/index.js (3 changed lines, vendored)

@@ -5519,7 +5519,8 @@ function downloadCacheStorageSDK(archiveLocation, archivePath, options) {
             //
             // If the file exceeds the buffer maximum length (~1 GB on 32-bit systems and ~2 GB
             // on 64-bit systems), split the download into multiple segments
-            const maxSegmentSize = buffer.constants.MAX_LENGTH;
+            // ~2 GB = 2147483647, beyond this, we start getting out of range error. So, capping it accordingly.
+            const maxSegmentSize = Math.min(2147483647, buffer.constants.MAX_LENGTH);
             const downloadProgress = new DownloadProgress(contentLength);
             const fd = fs.openSync(archivePath, 'w');
             try {
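
The save bundle vendors the same change. To make the "split the download into multiple segments" comment concrete, here is a hedged sketch of writing a payload segment by segment under a `maxSegmentSize` cap; `writeInSegments` and the in-memory source buffer are illustrative stand-ins, not the @actions/cache implementation:

```js
// Illustrative only: a simplified segment-by-segment write under a size cap.
// The real code downloads each segment from remote storage; here the payload
// is simulated with an in-memory Buffer so the example is self-contained.
const fs = require('fs');

function writeInSegments(source, destPath, maxSegmentSize) {
  // Open the destination once and write one bounded segment at a time.
  const fd = fs.openSync(destPath, 'w');
  try {
    let offset = 0;
    while (offset < source.length) {
      // No single segment (and therefore no single buffer) exceeds maxSegmentSize.
      const segmentSize = Math.min(maxSegmentSize, source.length - offset);
      const segment = source.subarray(offset, offset + segmentSize);
      fs.writeSync(fd, segment, 0, segment.length, offset);
      offset += segmentSize;
    }
  } finally {
    fs.closeSync(fd);
  }
}

// Example with an artificially small segment size so the loop runs a few times.
writeInSegments(Buffer.from('hello segmented world'), 'out.bin', 8);
```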

package-lock.json (7317 changed lines, generated)

File diff suppressed because it is too large.

package.json

@@ -23,7 +23,7 @@
   "author": "GitHub",
   "license": "MIT",
   "dependencies": {
-    "@actions/cache": "^1.0.10",
+    "@actions/cache": "file:actions-cache-1.0.10.tgz",
     "@actions/core": "^1.2.6",
     "@actions/exec": "^1.1.1",
     "@actions/io": "^1.1.2"