DirectAdmin remote backup

I couldn't find a built-in option to create regular (daily) backups of my DirectAdmin shared hosting account (regular user, no admin/reseller) and upload them to a remote server using SFTP (for example).

There's something in the docs here, but that method seems to be unavailable for regular users: https://docs.directadmin.com/directadmin/backup-restore-migration/backup-to-remote.html

Am I missing something?

Otherwise, my idea is to create a script which triggers a manual backup over the DirectAdmin API, waits for a while, and then downloads the backup. But maybe someone already has such a script and would be willing to share it?


Comments

  • If you're a regular user and your host didn't set up or enable the backup functionality, you can only write a script that manually collects files and pushes them to a remote location. So whether the feature you linked is available depends on your host.
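
    For example, a minimal sketch of that fallback, assuming the shared host allows SSH logins and cron, and that you have key-based auth to the remote server (the target host and paths below are placeholders):

    #!/bin/bash
    # Collect the home directory into a dated archive and push it out.
    # backupuser@remote.example.com and the paths are hypothetical values.
    ARCHIVE="/tmp/home-$(date +%F).tar.gz"
    tar czf "$ARCHIVE" -C "$HOME" .
    scp "$ARCHIVE" backupuser@remote.example.com:backups/
    rm -f "$ARCHIVE"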

  • Here's what I came up with: After some digging around, I found an old but very useful thread on OGF which shows how to use the DA backup API with curl. It looked like this would only create a backup for a single domain, so I tried to find more information on the API endpoint CMD_API_SITE_BACKUP and stumbled upon this script (which likely works just as well as curl but looks cleaner to me - even though it's written in PHP :| ). The script suggests it should be invoked on the DA server, but it works just as well remotely if you change the $host variable.

    Depending on where you run the script, putting your DA password in it as $password is risky - and it won't work if you use 2FA anyway (which you should, if you don't!). Instead, create a "Login Key" in your DirectAdmin panel, limit it to calling CMD_API_SITE_BACKUP, and restrict it to your IP range. Don't even try to put an IPv6 subnet in there; the form will swallow any slashes. Put in a single address and DA will convert it to the encompassing /32 when you save it. Close enough, I guess.
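
    For reference, here's a minimal sketch of triggering the backup with curl and such a login key. The host, user, key, and select fields are placeholders/assumptions; check the checkbox names on your panel's backup form for the exact fields:

    #!/bin/bash
    # Hypothetical values throughout: replace host, user, and login key.
    # The select* fields mirror the checkboxes on the DA site-backup form
    # and may differ on your installation.
    curl -s -u 'myuser:MY_LOGIN_KEY' \
        -d 'action=backup' \
        -d 'domain=example.com' \
        -d 'select0=domain' \
        -d 'select1=database' \
        -d 'select2=email' \
        'https://da.example.com:2222/CMD_API_SITE_BACKUP'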

    Then, create an FTP user that has access to your /backups directory.

    On your remote server, put the script in place. Then, follow the instructions in the script.

    You need to have the php-curl extension installed and enabled. If you plan to call the script directly, remove the -n from the shebang in the first line; -n makes PHP skip php.ini, so the extension likely won't be loaded otherwise.
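
    A quick way to sanity-check this, assuming the standard php CLI binary is on your PATH:

    # List loaded modules with and without php.ini being read.
    php -m | grep -i curl       # should print "curl"
    php -n -m | grep -i curl    # likely empty: -n skips php.ini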

    Since I run the script remotely, I don't need an FTP upload but a download instead. The config file for ncftp looks the same as described in the script: put your DA server as host and the FTP user you created above as user/pass.
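
    Something like this should work (hypothetical host and credentials; ncftpget reads the host/user/pass lines from the file passed to -f, and the file should be chmod 600 since it holds the password):

    host da.example.com
    user webbackup
    pass s3cret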

    I've created a small script that downloads the backup from the DA server and makes sure I won't run out of disk space by keeping only the 7 most recent backups:

    #!/bin/bash
    
    FTP_SETTINGS=/home/webbackup/.ncftp_conf
    BACKUP_DIR=/home/webbackup/backups
    KEEP=6   # old backups to keep; plus today's download = 7 in total
    
    # Bail out if the backup directory is missing so we don't delete
    # files somewhere else.
    cd "$BACKUP_DIR" || exit 1
    
    # List files newest first, skip directories, and delete everything
    # beyond the $KEEP most recent ones.
    ls -tp | grep -v '/$' | tail -n +$((KEEP + 1)) | xargs -d '\n' -r rm --
    
    # Fetch today's backup; DA names it like backup-Mar-15-2024-1.tar.zst.
    ncftpget -f "$FTP_SETTINGS" "$BACKUP_DIR" "/backup-$(date +%b-%d-%Y)-1.tar.zst"
    

    In my case, creating the backup on the DA server takes only a few minutes, so I've created two cronjobs:

    5    4     *   *  *       /bin/php /home/webbackup/create_backup.php
    15   4     *   *  *       /home/webbackup/download_backup.sh
    

    I wish I had done this earlier, since OneVirt/ViridWeb decided to go out of service without prior notice while I didn't have a proper backup. It would have saved me some hours of work...

