©XSIBackup-Free: Free Backup Software for ©VMWare ©ESXi

Forum ©XSIBackup: ©VMWare ©ESXi Backup Software



#1 Re: General matters » Cron problem » 2019-01-25 11:49:57

Please, has anyone updated XSIBackup-Free? I could try upgrading to that version.

#2 Re: General matters » Cron problem » 2019-01-24 07:32:45

roberto wrote:

Many people, including ourselves, are using this cron. It is indeed working for you, as the test file demonstrates. It should not be that difficult to debug where your problem is:

1 - Backup your current crontab
2 - Call your crontab manually: # /var/spool/cron/crontabs/root
3 - Tail -f your xsibackup.log file in a different SSH window to see if something is being written there:
tail -f /vmfs/volumes/datastore1/xsi-dir/var/logs/xsibackup.log
4 - Reset permissions on your crontab:
chmod 0600 /var/spool/cron/crontabs/root
5 - Set some new execution time in the crontab.
6 - Check your tail -f window.
7 - If it still doesn't work open a new SSH window and tail -f your ESXi server's logs:
tail -f /scratch/log/syslog.log
...

And don't forget to check if there are ongoing XSIBackup processes by using ps -c, so that you don't pile processes up and clog execution.
ps -c | grep xsi
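
To avoid piling up processes, a simple lock-file guard around the cron'd job can help. This is only a sketch, not part of XSIBackup: the lock path is made up, and `touch` is not an atomic lock, so this is best-effort.

```shell
# Best-effort guard against overlapping cron runs (illustration only;
# /tmp/xsibackup-demo.lock is a made-up path, and touch is not atomic).
LOCK=/tmp/xsibackup-demo.lock
if [ -e "$LOCK" ]; then
    echo "previous run still active, skipping"
else
    touch "$LOCK"
    echo "running backup job"   # the real xsibackup call would go here
    rm -f "$LOCK"
fi
```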

Hello, I tried it and this is what I see:
[root@localhost:~] /var/spool/cron/crontabs/root
/var/spool/cron/crontabs/root: line 2: 1: not found
/var/spool/cron/crontabs/root: line 3: 1: not found
/var/spool/cron/crontabs/root: line 4: 0: not found
/var/spool/cron/crontabs/root: line 5: */5: not found
/var/spool/cron/crontabs/root: line 6: 00: not found
/var/spool/cron/crontabs/root: line 7: 07: not found
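
Those "not found" lines come from the shell, not from cron: a crontab is data read by crond, not an executable script, so running the file directly makes the shell try to execute each time field ("1", "*/5", "00", ...) as a command. A throwaway file reproduces it (sketch; the path is made up):

```shell
# A crontab is configuration for crond, not a shell script. Executing it
# directly makes the shell treat each cron time field as a command name,
# which is exactly what produced the "1: not found" errors above.
printf '%s\n' '1    1    *    *    *    /sbin/tmpwatch.py' > /tmp/fake-crontab
sh /tmp/fake-crontab   # fails with an error like "1: not found"
```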

I tried changing it to:
#min hour day mon dow command
1    1    *   *   *   /sbin/tmpwatch.py
1    *    *   *   *   /sbin/auto-backup.sh
0    *    *   *   *   /usr/lib/vmware/vmksummary/log-heartbeat.py
*/5  *    *   *   *   /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
00   1    *   *   *   localcli storage core device purge
07 14 * * * "/vmfs/volumes/datastore1/xsi-dir/jobs/001"

or

#min hour day mon dow command
1    1    *   *   *   /sbin/tmpwatch.py
1    *    *   *   *   /sbin/auto-backup.sh
0    *    *   *   *   /usr/lib/vmware/vmksummary/log-heartbeat.py
*/5  *    *   *   *   /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
00   1    *   *   *   localcli storage core device purge
07 14 * * * "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/jobs/001"

Neither one works. Where is the problem? Permissions are 0777.

[root@localhost:~] tail -f /scratch/log/syslog.log
2019-01-24T07:46:54Z sftp-server[2184711]: open "/var/spool/cron/crontabs/root" flags WRITE,CREATE,TRUNCATE mode 0666
2019-01-24T07:46:54Z sftp-server[2184711]: close "/var/spool/cron/crontabs/root" bytes read 0 written 405
2019-01-24T07:46:54Z sftp-server[2184711]: set "/var/spool/cron/crontabs/root" modtime 20190124-07:46:54
2019-01-24T07:46:54Z sftp-server[2184711]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir"
2019-01-24T07:46:54Z sftp-server[2184711]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir"
2019-01-24T07:48:20Z sftp-server[2184711]: open "/var/spool/cron/crontabs/root" flags WRITE,CREATE,TRUNCATE mode 0666
2019-01-24T07:48:20Z sftp-server[2184711]: close "/var/spool/cron/crontabs/root" bytes read 0 written 405
2019-01-24T07:48:20Z sftp-server[2184711]: set "/var/spool/cron/crontabs/root" modtime 20190124-07:48:20
2019-01-24T07:48:20Z sftp-server[2184711]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir"
2019-01-24T07:48:20Z sftp-server[2184711]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir"

[root@localhost:~] ps -c | grep xsi
2185011  2185011  grep                            grep xsi


-----------------------------------------------------------------------

2019-01-24T08:05:01Z crond[2122361]: crond: USER root pid 2185012 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-01-24T08:05:01Z syslog[2185015]: starting hostd probing.
2019-01-24T08:05:18Z sftp-server[2184711]: open "/var/spool/cron/crontabs/root" flags WRITE,CREATE,TRUNCATE mode 0666
2019-01-24T08:05:18Z sftp-server[2184711]: close "/var/spool/cron/crontabs/root" bytes read 0 written 380
2019-01-24T08:05:18Z sftp-server[2184711]: set "/var/spool/cron/crontabs/root" modtime 20190124-08:05:18
2019-01-24T08:05:18Z sftp-server[2184711]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir"
2019-01-24T08:05:18Z sftp-server[2184711]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir"
2019-01-24T08:09:32Z sftp-server[2184711]: open "/var/spool/cron/crontabs/root" flags WRITE,CREATE,TRUNCATE mode 0666
2019-01-24T08:09:32Z sftp-server[2184711]: close "/var/spool/cron/crontabs/root" bytes read 0 written 380
2019-01-24T08:09:32Z sftp-server[2184711]: set "/var/spool/cron/crontabs/root" modtime 20190124-08:09:32
2019-01-24T08:09:32Z sftp-server[2184711]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir"
2019-01-24T08:09:32Z sftp-server[2184711]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir"
2019-01-24T08:10:01Z crond[2122361]: crond: USER root pid 2185049 cmd /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
2019-01-24T08:10:01Z syslog[2185052]: starting hostd probing.
2019-01-24T08:11:48Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var"
2019-01-24T08:11:48Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var"
2019-01-24T08:11:49Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:11:49Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:11:59Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:11:59Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:00Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:00Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:16Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:16Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:16Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:16Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:17Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:17Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:17Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:17Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:18Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:18Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:19Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:19Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:28Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:28Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:28Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:28Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:29Z sftp-server[2184666]: opendir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"
2019-01-24T08:12:29Z sftp-server[2184666]: closedir "/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir/var/logs"

#3 Re: General matters » Cron problem » 2019-01-23 09:07:38

I have exactly this problem and have not resolved it yet. I'm already thinking about downgrading VMware to ESXi 6 and XSIBackup to version 9.

#4 Re: General matters » Cron problem » 2019-01-21 14:48:02

Hyrules wrote:

I'm having the same issue with the latest XSI backup free and VMWare ESXI 6.7. Can't get the backup to run with the cron.

Have you tried anything? Did you find something that could be done?

#5 Re: General matters » Cron problem » 2019-01-21 14:15:13

admin wrote:

ESXi Cron daemon can get picky, follow the ©VMWare ©ESXi cron troubleshooting guide: https://33hops.com/xsibackup-cron-troubleshooting.html

What else could I try? Writing the date to a file works, but cron job 001 does not start. :(

#6 General matters » Cron problem » 2019-01-17 13:47:58

paulee
Replies: 22

My cron does not run my XSIBackup job 001.

I tried this solution:
for p in $(ps -c | grep -v grep | grep 'busybox crond' | awk '{print $1}'); \
do kill -9 $p;done; \
/usr/lib/vmware/busybox/bin/busybox crond; \
ps -c | grep -v grep | grep 'busybox crond'

All I see is this:

[root@localhost:/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir] for p in $(ps -c | grep -v grep | grep 'busybox crond' | awk '{print $1}'); \
> do kill -9 $p;done; \
> /usr/lib/vmware/busybox/bin/busybox crond; \
> ps -c | grep -v grep | grep 'busybox crond'
2122361  2122361  busybox                         /usr/lib/vmware/busybox/bin/busybox crond
[root@localhost:/vmfs/volumes/5c0a2ccd-1cc20aa1-0de9-d4ae52ce0e70/xsi-dir]

And cron still doesn't work. :(

So, make sure that you have the entries that you want there. You can check the cron is working by just adding a line like this to your crontab at: /var/spool/cron/crontabs/root

*/1 * * * * echo "$(date)" >> /tmp/my-cron-test.txt
You should then see a line being recorded there every minute; you can watch it with this command in a different window:

tail -f /tmp/my-cron-test.txt

I see this:

[root@localhost:~] tail -f /tmp/my-cron-test.txt
Thu Jan 17 13:47:01 UTC 2019
Thu Jan 17 13:48:01 UTC 2019
Thu Jan 17 13:49:01 UTC 2019
Thu Jan 17 13:50:01 UTC 2019

/var/spool/cron/crontabs/root
#min hour day mon dow command
1    1    *   *   *   /sbin/tmpwatch.py
1    *    *   *   *   /sbin/auto-backup.sh
0    *    *   *   *   /usr/lib/vmware/vmksummary/log-heartbeat.py
*/5  *    *   *   *   /bin/hostd-probe.sh ++group=host/vim/vmvisor/hostd-probe/stats/sh
00   1    *   *   *   localcli storage core device purge

50 13 * * * "/vmfs/volumes/datastore1/xsi-dir/jobs/001"
*/1 * * * * echo "$(date)" >> /tmp/my-cron-test.txt


Does cron on VMware have a log file somewhere?
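
From what the logs pasted earlier show, crond messages on ESXi land in syslog (the `crond[...]` lines in /scratch/log/syslog.log) rather than in a dedicated file. Live, you would run `grep crond /scratch/log/syslog.log`; as a self-contained sketch on a sample taken from the output above:

```shell
# crond entries can be filtered out of syslog; "crontabs" does not match
# the pattern 'crond', so only the two crond lines count here.
grep -c 'crond' <<'EOF'
2019-01-24T08:05:01Z crond[2122361]: crond: USER root pid 2185012 cmd /bin/hostd-probe.sh
2019-01-24T08:05:18Z sftp-server[2184711]: open "/var/spool/cron/crontabs/root"
2019-01-24T08:10:01Z crond[2122361]: crond: USER root pid 2185049 cmd /bin/hostd-probe.sh
EOF
# prints 2
```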

#7 Re: General matters » No space » 2019-01-16 10:43:33

The backup folder is now named 201901160924. Is that the correct folder name, or is it wrong?
--backup-point=/vmfs/volumes/synology/zalohy/$( date +%Y%m%d%H%M ) \
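
For what it's worth, that name matches the format string exactly: `%Y%m%d%H%M` has no dash, while the job quoted in the earlier post used `%Y%m%d-%H%M`. A quick check with a fixed timestamp (GNU date's `-d` assumed; ESXi's busybox date may differ):

```shell
# The two format strings differ only by the dash; both are valid folder
# names. Using a fixed instant (2019-01-16 09:24 UTC) for reproducibility:
date -u -d '@1547630640' +%Y%m%d%H%M    # 201901160924 (no dash)
date -u -d '@1547630640' +%Y%m%d-%H%M   # 20190116-0924 (with dash)
```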

#8 Re: General matters » No space » 2019-01-15 07:19:38

OK, I received this email:
----------------------------------
Done hot backup (id: 001) using vmkfstools (no compression)
The backup room has been limited to 690 Gb.
Available room in device /vmfs/volumes/synology/zalohy/20190114-1554 before backup: 132 Gb.
Sparse size on disk of the selected virtual machines: 160 Gb.
Needed room in device /vmfs/volumes/synology/zalohy/20190114-1554 for backup: 160 Gb.
(Id)VM Name    State    Size (Gb)    Stop    Copy    Start    Time (min)    Speed (mb/s)
(1) voip-ustredna    ON    10/ 10    NO (hot backup)    OK    -    1    97/ 97
(3) server    ON    150/ 150    NO (hot backup)    KO!    -    63    40/ 40
The eldest folders were deleted to make room:
Error MKROOM01: cannot make 162G of room, only 130G can be made available
Last error raised for the above VM:
ERROR CLVMKFS1, details: [server] error: vmkfstools error, details: Failed to clone disk: There is not enough space on the file system for the selected operation (13).
Available space in device /vmfs/volumes/synology/zalohy/20190114-1554 after backup: 130 Gb.
Complete backup elapsed time: 65 min
Backup of ESXi configuration is not available in XSIBACKUP-FREE
Get XSIBACKUP-PRO at https://33hops.com
• [ Mon Jan 14 15:56:03 UTC 2019 ] ERROR (MKROOM01), details Error: cannot make 162G of room, only 130G can be made available

• [ Mon Jan 14 16:59:34 UTC 2019 ] ERROR (CLVMKFS1), details [server] error: vmkfstools error, details: Failed to clone disk: There is not enough space on the file system for the selected operation (13).
--------------------------------------------------


and i have cron job

----------------------------------
"/vmfs/volumes/datastore1/xsi-dir/xsibackup" \
--backup-prog=vmkfstools \
--certify-backup=yes \
--backup-point=/vmfs/volumes/synology/zalohy/$( date +%Y%m%d-%H%M ) \
--backup-type=Custom \
--backup-vms="voip-ustredna,server" \
--backup-how=Hot \
--remote-xsipath=/vmfs/volumes/datastore1/xsi-dir \
--use-smtp=1 \
--mail-to=admin@gmail.com \
--backup-id=001 \
--backup-room=690 \
--description="Zalohovanie" \
--override=xsibakfilter \
--on-success="backupId->001" \
--on-error="backupId->001" \
--exec=yes >> "/vmfs/volumes/datastore1/xsi-dir/var/logs/xsibackup.log"
---------------------------------------

I have backup room configured, but the script cannot delete the old backups and I don't know why. When I delete them manually, everything proceeds. But the script cannot delete the old ones by itself.
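
The report says "the eldest folders were deleted to make room"; with timestamped names like the ones this job creates, the eldest backup is simply the lexicographically smallest folder name. A sketch of that selection (illustration only, not XSIBackup's actual code; the demo path is made up):

```shell
# Timestamped folder names (%Y%m%d-%H%M) sort chronologically as plain
# strings, so the eldest backup is the first name in sorted order.
# Demo folders only; not XSIBackup's real pruning logic.
BACKUP_POINT=/tmp/zalohy-demo
mkdir -p "$BACKUP_POINT/20190110-0100" "$BACKUP_POINT/20190112-0100" "$BACKUP_POINT/20190114-1554"
eldest=$(ls "$BACKUP_POINT" | sort | head -n 1)
echo "$eldest"   # prints 20190110-0100, the candidate for deletion
```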

#9 Re: General matters » No space » 2019-01-14 09:59:19

As the report shows, the limit is 690 GB, but the script is not able to erase old backups to free up additional capacity. Why can it not clear them? Where could my mistake be?

#10 Re: General matters » No space » 2018-12-31 09:37:06

Why doesn't the backup room work? :( Where can I manage the backup room? Thank you.

#11 General matters » No space » 2018-12-13 07:20:28

paulee
Replies: 9

Hello, please, how can I manage disk space for jobs?
Last error raised for the above VM:
ERROR CLVMKFS1, details: [server] error: vmkfstools error, details: Failed to clone disk: There is not enough space on the file system for the selected operation (13).

I have this job:

"/vmfs/volumes/datastore1/xsi-dir/xsibackup" \
--certify-backup=yes \
--backup-point=/vmfs/volumes/synology/zalohy/$( date +%Y%m%d-%H%M ) \
--backup-type=Custom \
--backup-vms="voip-ustredna,server" \
--backup-how=Hot \
--remote-xsipath=/vmfs/volumes/datastore1/xsi-dir \
--use-smtp=1 \
--backup-room=690 \
This backup-room setting doesn't work. :(
