WebFaction
Community site

Hi,

I'm the webmaster for a client who is using WebFaction, and their disk usage exceeded their 100GB limit over the weekend. I discovered about 200 backup files (two per day) going back months... While daily backups are great, isn't there a way to keep a set number of backups and have the older ones removed automatically? We'd like to keep backups for the last 30 days only and are hoping we don't have to go in periodically and delete the older ones by hand.

Thanks, Chris

asked 04 Jul '13, 13:57

cclay
12
accept rate: 0%


You can use the find command for this.

For example, if you want to delete files that are > 30 days old from a directory /home/username/backups:

find /home/username/backups -type f -ctime +30 -print0 | xargs -0 rm
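If you'd like to see which files would be removed before running it for real, you can run the find by itself first and review the list it prints:

find /home/username/backups -type f -ctime +30

(The -print0 and -0 flags make the pipeline safe for filenames that contain spaces.)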

You could run that daily via cron to keep your backup directory tidy.
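For example, a crontab entry along these lines (the 3:00 a.m. run time is just an arbitrary choice) would prune the directory once a day; add it with crontab -e:

# prune backups older than 30 days, every day at 3:00 a.m.
0 3 * * * find /home/username/backups -type f -ctime +30 -print0 | xargs -0 rm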

Hope that helps!


answered 04 Jul '13, 14:12

seanf
12.2k
accept rate: 37%

Thanks, great idea!

Chris

(04 Jul '13, 22:39) cclay

question asked: 04 Jul '13, 13:57

question was seen: 1,758 times

last updated: 04 Jul '13, 22:39

                              