Commit 5d64523: Initial commit
Author: Nate Przybyszewski

File tree: 7 files changed, +699 / -0 lines


.gitignore

Lines changed: 2 additions & 0 deletions
```
*.psd1
dist/
```

LICENSE

Lines changed: 7 additions & 0 deletions
Copyright 2018 Nathan Przybyszewski

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

README.md

Lines changed: 85 additions & 0 deletions
# PowerShell AWS S3 Simple Backup

This is a PowerShell module for performing simple backups to S3. Also included is a Lambda function for managing the uploaded backups. The Lambda function ages out old backups while retaining certain backups (e.g. the first backup of each year, quarter, or month) for a specified amount of time. It also posts a metric to CloudWatch for each newly uploaded backup so that an alarm can be created to alert you if backups stop.

# End-host Configuration

1. Download the script from GitHub and install it as desired. In this example we'll keep it simple by saving the entire directory to C:\backups.
2. Copy the backup_settings.template.txt file, rename it, and edit it as desired.
3. Create a backup script. The script can be executed as a Windows Scheduled Task. Depending on what you're attempting to back up, you may need to add additional commands to your script. For example, if you're backing up a MySQL database, you'll want to use mysqldump to export the database before backing up the resulting file to S3.

```
Import-Module C:\backups\powershell-aws-simple-s3-backup\AwsSimpleS3Backup.psd1

# First argument is the backup name.
# Second argument is the location to back up files from.
# Third argument is the settings file that determines backup behavior.
Backup-FilesToS3 "documents" "C:\Users\Administrator\My Documents\*" "C:\backups\powershell-aws-simple-s3-backup\backup_settings.txt"
```
# Lambda Installation

1. Upload the script from the tools/lambda directory to your AWS account's Lambda console as a Python 3.x script.
2. See the Lambda Example IAM Role Policy below for an example policy that you can use with this script.
3. Configure Lambda to invoke this function on S3 "ObjectCreated" events from your S3 bucket.
4. If desired, configure CloudWatch alarms to alert you if backups stop arriving. You will need to wait until after the first execution before the metric appears. Backups must occur at least once per day for this to be effective.
5. Currently, S3 Lifecycle Policies must be configured manually for pruning to work. The Lambda script tags each archive with a "keep-days" tag whose value indicates the number of days the archive should be retained. For each possible retention period, you must create the desired lifecycle policies. You can create policies that delete the objects, or, for longer-lived objects, transition them to Glacier and eventually delete them. For example:
   * keep-days=7: Expire the current version of the object 7 days after object creation.
   * keep-days=365: Transition to Amazon Glacier 15 days after object creation; expire the current version of the object 365 days after object creation.
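The per-tier lifecycle policies in step 5 can also be generated programmatically. Below is a hypothetical Python helper (not part of this repo; the rule IDs are invented) that builds one expiration rule per keep-days value, filtered on the tag the Lambda script applies. The rules are in the shape accepted by boto3's `put_bucket_lifecycle_configuration`; Glacier transitions and actually applying the rules are left out.

```python
# Hypothetical helper: build S3 lifecycle expiration rules keyed on the
# "keep-days" tag that the Lambda script applies to each archive.
def lifecycle_rules(retention_days):
    return [
        {
            "ID": f"expire-keep-days-{days}",  # invented rule ID
            "Filter": {"Tag": {"Key": "keep-days", "Value": str(days)}},
            "Status": "Enabled",
            "Expiration": {"Days": days},  # delete this tier after N days
        }
        for days in retention_days
    ]

rules = lifecycle_rules([3650, 730, 365, 90, 30, 15])
print(len(rules))  # → 6
```

Tag values are strings because S3 object tags are string-valued, while `Expiration.Days` must be an integer.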
# Lambda Configuration

Edit the following variables in the Lambda script to suit your requirements.

archive_bucket: Name of the S3 bucket that you are uploading backups to.

metric_namespace: Namespace to use when posting to CloudWatch. This can usually be left at the default unless you prefer something else.

tier_mapping: Should be modified with your desired retention periods. The __default mapping applies to all uploaded backups unless overridden by another. The number associated with each interval indicates how many days the first backup of that interval will be retained. For example, imagine a backup is uploaded once per week on the following days: Jan 1, Jan 8, Jan 15, Jan 22, Jan 29, Feb 5, Feb 12. The Jan 1 backup will be retained for 3650 days because it is the first backup of the year. The Feb 5 backup will be retained for 365 days because it is the first backup of the month of February. The other backups in this example will be retained for only 15 days.

```
[3650, 730, 365, 90, 30, 15]
 |     |    |    |   |   |
 |     |    |    |   |   |- Others
 |     |    |    |   |- Daily
 |     |    |    |- Weekly
 |     |    |- Monthly
 |     |- Quarterly
 |- Yearly
```
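To make the tier table concrete, here is a hypothetical Python sketch of calendar-period tiering. It assumes "first backup of a period" means no earlier upload falls in the same calendar year, quarter, month, ISO week, or day; the actual script in tools/lambda is authoritative and may classify edge cases differently.

```python
from datetime import date

# Retention tiers from the diagram above: yearly, quarterly, monthly,
# weekly, daily, then "others".
TIERS = [3650, 730, 365, 90, 30, 15]

def keep_days(d, earlier):
    """Retention (in days) for a backup dated `d`, given the dates of all
    earlier uploads. The coarsest period `d` is first in wins; anything
    else falls through to the "others" tier."""
    periods = [
        lambda x: (x.year,),                     # yearly
        lambda x: (x.year, (x.month - 1) // 3),  # quarterly
        lambda x: (x.year, x.month),             # monthly
        lambda x: x.isocalendar()[:2],           # weekly (ISO year, week)
        lambda x: (x.year, x.month, x.day),      # daily
    ]
    for tier, key in zip(TIERS, periods):
        if not any(key(e) == key(d) for e in earlier):
            return tier
    return TIERS[-1]  # "others"

print(keep_days(date(2018, 1, 1), []))                  # → 3650 (first of year)
print(keep_days(date(2018, 2, 5), [date(2018, 1, 1)]))  # → 365 (first of February)
```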
# Lambda Example IAM Role Policy

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1507002786000",
            "Effect": "Allow",
            "Action": [
                "s3:DeleteObject",
                "s3:PutObjectTagging",
                "s3:GetObjectTagging",
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::my.archive.bucket",
                "arn:aws:s3:::my.archive.bucket/*"
            ]
        },
        {
            "Sid": "Stmt1507097378000",
            "Effect": "Allow",
            "Action": [
                "cloudwatch:PutMetricData"
            ],
            "Resource": [
                "*"
            ]
        }
    ]
}
```
# Credits

PowerShell and Python scripts originally authored by Nathan Przybyszewski.
[powershell-script-module-boilerplate](https://github.com/jpoehls/powershell-script-module-boilerplate) was used as a template.

backup_settings.template.txt

Lines changed: 34 additions & 0 deletions
```
# Backup settings for AWS-S3-Backup.ps1
# Author: Nathan Przybyszewski <github.com/shibz>
# Date: 2018-03-11
#
# Copy the backup_settings.template.txt file to a new file and rename
# it to "backup_settings.txt". It must sit in the same directory as
# your AWS-S3-Backup.ps1 file. Configure the values below as desired.

# Local Backup Directory. Subdirectories will be created for each backup
# using the "name" specified with the Backup-Files command.
LocalBackupDir=C:\Backups

# Number of days to retain local backups
LocalBackupRetention=30

# Name of the event log source to use. The log source will need to be
# initiated using the following cmdlet (executed as an administrator):
# New-EventLog -LogName Application -Source "YourBackupSourceName"
LogSource=S3Backup

# Password used to encrypt archives
EncryptionPassword=some encryption password here

# S3 Configuration
S3Bucket=your.s3.bucket.name
S3Region=us-east-1
AccessKey=AKI12345678901234567
SecretKey=put your secret key here

# Enable/disable upload to S3. If disabled, local backups will continue.
CloudUpload=true

# Post extra debug logs
Debug=False
```
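The module's PowerShell parser treats this file as plain key=value lines, skipping blanks, `#` comments, and `[section]` headers. For readers scripting against the same format, here is a hypothetical Python analogue of that loop (not part of this repo):

```python
def parse_backup_settings(text):
    """Parse key=value lines, skipping blanks, '#' comments and [sections]."""
    settings = {}
    for line in text.splitlines():
        key, sep, value = line.partition("=")
        # Keep only lines that have a key, an '=', and are not comments/sections
        if key and sep and not key.startswith(("#", "[")):
            settings[key] = value
    return settings

sample = "# S3 Configuration\nS3Region=us-east-1\n\nDebug=False\n"
print(parse_backup_settings(sample))  # → {'S3Region': 'us-east-1', 'Debug': 'False'}
```

One small difference from the PowerShell version: `partition` splits only on the first `=`, so values containing `=` (such as some secret keys) survive intact, whereas the module's `[regex]::split` would truncate them.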

src/AwsSimpleS3Backup.psm1

Lines changed: 138 additions & 0 deletions
```
function Backup-FilesToS3 {
    <#
    .SYNOPSIS
    Creates backups, uploads them to AWS S3, and retains some locally

    .DESCRIPTION
    This PowerShell module is designed to make creation of backups simple
    and easy and to allow for easy uploading of those backups to S3 for
    long-term archival and disaster recovery.

    Author: Nathan Przybyszewski <github.com/shibz>
    License: MIT License

    Before using this script, you need to create your backup_settings.txt
    file using the included template. You also should run the following
    command as administrator to set up logging:

    New-EventLog -LogName Application -Source "YourBackupSourceName"

    Replace YourBackupSourceName with the actual setting you configured
    in backup_settings.txt

    .PARAMETER name
    Name for the backups. Date/time will be appended.

    .PARAMETER source
    Location to back up.

    .PARAMETER cfgFile
    Configuration file that will be used to determine where to back up to.

    .EXAMPLE
    Backup files locally and to S3 as specified by the configuration file.

    Backup-FilesToS3 "documents" "C:\Users\Administrator\My Documents\*" "C:\backups\backup_settings.txt"
    #>
    Param(
        [Parameter(Mandatory=$true)][string]$name,
        [Parameter(Mandatory=$true)][string]$source,
        [Parameter(Mandatory=$true)][string]$cfgFile
    )

    # Ensure 7-Zip is installed and the necessary backup settings file exists
    if (-not (Test-Path "$env:ProgramFiles\7-Zip\7z.exe")) {throw "$env:ProgramFiles\7-Zip\7z.exe needed"}
    if (-not (Test-Path "$cfgFile")) {throw "$cfgFile was not found. Copy backup_settings.template.txt and configure as desired. Location of your configuration must be passed to the cmdlet as a parameter."}

    # Fetch backup config file content and populate $backupSettings, skipping
    # blank lines, comments, and [section] headers
    Get-Content "$cfgFile" | ForEach-Object -Begin {$backupSettings=@{}} -Process { $k = [regex]::split($_,'='); if(($k[0].CompareTo("") -ne 0) -and ($k[0].StartsWith("[") -ne $True) -and ($k[0].StartsWith("#") -ne $True)) { $backupSettings.Add($k[0], $k[1]) } }

    # Set the logging-related variables. Debug is read as a string, so compare
    # it explicitly: the non-empty string "False" would otherwise be truthy.
    $script:logSource = $backupSettings.LogSource
    $script:debug = $backupSettings.Debug -like "true"

    # Set the AWS credentials
    Set-AWSCredential -AccessKey "$($backupSettings.AccessKey)" -SecretKey "$($backupSettings.SecretKey)"
    Set-DefaultAWSRegion -Region "$($backupSettings.S3Region)"

    $now = Get-Date
    $timestamp = Get-Date -Date $now -Format yyyy-MM-dd_H-mm-ss
    $backupDir = "$($backupSettings.LocalBackupDir)\$name"
    $backupName = "$backupDir\$name`_$timestamp.bak.7z"
    $cloudUpload = $backupSettings.CloudUpload -like "true"
    Write-BackupLog "Beginning backup for $name to $backupName.`r`nWill attempt to save:`r`n$source" 110

    if ($source -is [array]) {
        $szoutput = Zip-Files "$backupName" @source
    } else {
        $szoutput = Zip-Files "$backupName" $source
    }
    Write-BackupDebug "7Zip completed. Output was:`r`n$szoutput" 120

    if ($cloudUpload) {
        $uploadlog = Upload-Backup $backupName $backupSettings.S3Bucket "$name.7z" $now
    } else {
        $uploadlog = "Backup was not uploaded to S3"
    }

    # Prune local archives older than the configured retention period. The
    # backtick stops PowerShell from parsing "$name_" as a variable name.
    $backupMax = $now.AddDays([int]$backupSettings.LocalBackupRetention * -1)
    $oldfiles = Get-ChildItem $backupDir -Recurse -Include "$name`_*.bak.7z" | Where-Object { $_.CreationTime -le $backupMax }
    if ($oldfiles) {
        $measurement = $oldfiles | Measure-Object -Property length -Sum
        $pruneCount = $measurement.Count
        $pruneSize = [math]::Round($measurement.Sum / 1KB)
        $prunelog = "Pruning old archives.`r`nWill attempt to prune $pruneCount files totalling $pruneSize KB:`r`n$oldfiles"
        Write-BackupDebug $prunelog 140
        Remove-Item $oldfiles
    } else {
        $prunelog = "No files found for pruning"
    }

    Write-BackupLog "Backup for $name to $backupName is complete!`r`nBackup target was: $source`r`n`r`n7Zip output was:`r`n$szoutput`r`n$prunelog`r`n$uploadlog" 150
}

# Create an encrypted 7-Zip archive. The call operator (&) is required to
# invoke an executable whose path is given as a quoted string.
function Zip-Files {
    & "$env:ProgramFiles\7-Zip\7z.exe" a -mx7 -mhe -mmt -p"$($backupSettings.EncryptionPassword)" $args
}

# Write to the event log, only if debug mode is enabled
function Write-BackupDebug($message, $id) {
    if ($script:debug) {
        Write-EventLog -LogName Application -Source $script:logSource -EventId "$id" -Message "$message"
        echo $message
    }
}

# Write an informational message to the event log
function Write-BackupLog($message, $id) {
    Write-EventLog -LogName Application -Source $script:logSource -EventId "$id" -Message "$message"
    echo $message
}

# Write an error message to the event log
function Write-BackupError($message, $id) {
    Write-EventLog -LogName Application -Source $script:logSource -EventId "$id" -Message "$message" -EntryType Error
    echo $message
}

# Upload the given file to S3 under a date-based key prefix
function Upload-Backup($file, $bucket, $key, $date) {
    $prefixDate = Get-Date -Date $date -Format yyyy-MM-dd/HH-mm-ss
    $archiveKey = "$prefixDate`_$key"
    $backupLog = ""

    Write-BackupDebug "Attempting to upload $file to s3://$bucket/$archiveKey" 135
    try {
        Write-S3Object -BucketName $bucket -File $file -Key $archiveKey
        $backupLog = "Uploaded $file to s3://$bucket/$archiveKey"
    } catch {
        $errormsg = $_.Exception.Message
        Write-BackupError "Error uploading to S3! Error was:`r`n$errormsg" 235
    }
    Write-BackupDebug $backupLog 130

    return $backupLog
}

Export-ModuleMember -Function Backup-FilesToS3
```
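Upload-Backup prefixes each S3 key with the backup's date and time, which is what the Lambda script later sees for each ObjectCreated event. A hypothetical Python mirror of that naming scheme (not part of this repo) shows the resulting key shape:

```python
from datetime import datetime

def archive_key(name, when):
    """Mirror Upload-Backup's key scheme: a yyyy-MM-dd/HH-mm-ss prefix,
    an underscore, then "<name>.7z" as passed in by Backup-FilesToS3."""
    return f"{when.strftime('%Y-%m-%d/%H-%M-%S')}_{name}.7z"

print(archive_key("documents", datetime(2018, 3, 11, 14, 30, 5)))
# → 2018-03-11/14-30-05_documents.7z
```

The `/` in the prefix means backups group naturally by date in the S3 console's folder view.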
