Jonathon Posted June 7, 2016

Hey everyone,

I just started looking at using the automated backup in Blesta. I have created and tested the SFTP account to another server, and that works just fine. The web user can execute mysqldump; I added some debug code and su'd to that user to test the command line it produces, and it worked just fine.

What is happening is that the system gets a gateway timeout when forcing or downloading a backup via the web. When I stop the process in buildDump(), I can see the database .sql file and it is 100% good. When I let it try to compress the file is where it goes wrong: I get a zero-byte .gz file and it just times out. If I comment out the gzipping of the file and just have the function return the uncompressed .sql file, SFTP fails, with no additional information given as to why; but when I do a check of the SFTP connection in the settings, that comes back as good.

Any thoughts or help would be appreciated.

Jonathon M
OnehostingPlan.com
Michael Posted June 7, 2016

May I ask why you are adding and removing .gz? You don't need to touch the files on the SFTP server, since Blesta should gzip them already.
Jonathon Posted June 7, 2016 (Author)

If I comment out the gzipping of the file, I do not get the nginx timeout and I can see the .sql file. The core backup.php file is the same; by letting it skip the gzip step, I can see that the mysqldump has completed, but it looks to be hanging in the gzip function. If I return the .sql file as the file variable, the SFTP upload then fails; but if I do an on-demand download, I can get the .sql file, just not a .gz file.

Hope that helps. I'm just trying to break down the process to see where and why it is failing.

Summary, two parts:
1) The .sql-to-.gz compression creates a zero-byte file.
2) SFTP fails with the .sql file, with no other details.
Michael Posted June 7, 2016

(quoting Jonathon's reply above in full) So you're editing core files to get it to work on your server? I would use a better web host.
Jonathon Posted June 8, 2016 (Author)

Editing core files to find out why it will not run as-is? Yes. It's not a host thing, as it is one of our custom-built Ubuntu boxes.
cloudrck Posted June 8, 2016

Check your web server logs. If the database is too large, you may have to tweak settings to compensate for the time it takes Blesta to execute the command. You should have the domain set up with its own error log file under nginx.

To be honest, it might be better to create a shell script that does the backup and rsyncs it over SSH. I'm looking to implement this on my setup, since I'm having issues with Blesta's automated backups.
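That backup-plus-rsync approach can be sketched as a small standalone script run from cron, entirely outside Blesta. To keep this thread in one language, it is rendered here as a PHP CLI sketch rather than a shell script; every path, credential, and hostname below is a placeholder:

    <?php
    // Standalone backup sketch, run from cron outside Blesta.
    // All paths, credentials, and hostnames are placeholders.
    $dump = '/var/backups/blesta_' . date('Y-m-d\THis\Z') . '.sql.gz';

    // Dump and compress in one pipeline; piping avoids a large temporary .sql file.
    // Note: the pipeline's exit status is gzip's, not mysqldump's.
    exec('mysqldump --host=localhost --user=blesta --password=secret blesta'
        . ' | gzip -9 > ' . escapeshellarg($dump), $out, $status);
    if ($status !== 0) {
        fwrite(STDERR, "mysqldump/gzip failed with status $status\n");
        exit(1);
    }

    // Ship the archive offsite with rsync over SSH, as suggested above
    exec('rsync -e ssh ' . escapeshellarg($dump) . ' backup@example.com:/backups/', $out, $status);
    exit($status === 0 ? 0 : 1);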
Paul Posted June 8, 2016

Does your server have the tar command? For example, can you run something like:

    tar -cvzf archive.tar.gz /path/to/something/*
Jonathon Posted June 9, 2016 (Author)

Hi Paul,

If I change the code in /app/models/backup.php as follows, it does work as expected:

    private function buildDump()
    {
        $db_info = Configure::get("Database.profile");
        Loader::loadComponents($this, array("SettingsCollection"));
        $temp = $this->SettingsCollection->fetchSystemSetting(null, "temp_dir");
        $temp_dir = (isset($temp['value']) ? $temp['value'] : null);

        $this->Input->setRules($this->getRules());
        $vars = array(
            'temp_dir' => $temp_dir,
            'db_info' => $db_info
        );

        if ($this->Input->validates($vars)) {
            // ISO 8601
            $file = $db_info['database'] . "_" . date("Y-m-d\THis\Z");
            $file_name = $temp_dir . $file . ".sql";

            // $test_command = "mysqldump --host=" . escapeshellarg($db_info['host']) . " --user=" . escapeshellarg($db_info['user']) . " --password=" . escapes$
            //echo $test_command;
            //file_put_contents("mysql_dump.txt", $test_command);

            exec("mysqldump --host=" . escapeshellarg($db_info['host'])
                . " --user=" . escapeshellarg($db_info['user'])
                . " --password=" . escapeshellarg($db_info['pass'])
                . " " . escapeshellarg($db_info['database'])
                . " > " . escapeshellarg($file_name));

            // GZip the file if possible
            if (function_exists("gzopen")) {
                $chunk_size = 4096;
                $compress_file_name = $file_name . ".gz";

                // Compress as much as possible
                $gz = gzopen($compress_file_name, "w9");
                $fh = fopen($file_name, 'rb');

                // Read from the original and write in chunks to preserve memory
                //while (!feof($fh)) {
                //    $data = fread($fh, $chunk_size);
                //    if ($data)
                //        gzwrite($gz, $data);
                //}
                //unset($data);
                gzwrite($gz, $fh);
                $compressed = gzclose($gz);

                // Remove the original data file
                if ($compressed) {
                    unlink($file_name);
                    return $compress_file_name;
                }
            }
            return $file_name;
        }
    }

I know that, long term, reading the full file into memory without chunking will cause issues. I'm not looking to modify the core, just to understand what part of the function is failing.

Please note: this may be part of a bug, but the function being checked for (gzopen) is not the same function being used to do the compression (gzwrite/gzclose).

Yes, I can tar files on the command line with no issue, even when I su to the www user running the site.

Thanks for looking.
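Two things stand out in that modified block, offered as a hypothesis rather than a confirmed diagnosis. First, gzwrite() expects a string, so gzwrite($gz, $fh) passes a file handle, raises a warning (silent when error reporting is 0), and writes nothing; the "working" result is most likely a valid but nearly empty .gz that uploads quickly. Second, if fopen() returns false (a temp_dir permission problem, say), feof(false) on PHP 5.x returns NULL, which is falsy, so the core loop's while (!feof($fh)) spins forever while writing nothing, which matches both the zero-byte .gz and the gateway timeout. A defensive version of the chunked copy, using a hypothetical helper name, might look like this:

    <?php
    // Minimal sketch of a guarded chunked compression step, assuming the
    // same inputs as buildDump(). compressSqlDump() is a hypothetical name.
    function compressSqlDump($file_name)
    {
        $compress_file_name = $file_name . ".gz";

        $fh = fopen($file_name, 'rb');
        if ($fh === false) {
            return false; // bail out instead of looping forever on feof(false)
        }

        $gz = gzopen($compress_file_name, "w9");
        if ($gz === false) {
            fclose($fh);
            return false;
        }

        // Chunked copy keeps memory use flat regardless of dump size
        while (!feof($fh)) {
            $data = fread($fh, 4096);
            if ($data === false) {
                break; // read error: stop rather than spin
            }
            if ($data !== '') {
                gzwrite($gz, $data);
            }
        }

        fclose($fh);
        return gzclose($gz) ? $compress_file_name : false;
    }

The only change from the core loop is the guard clauses; the chunked read itself is untouched.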
Paul Posted June 10, 2016

That's interesting; it sounds as if feof is returning true immediately, rather than at EOF. What version of PHP is in your CLI environment?
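Given the PHP version question, the quickest way to see how feof() fails on a given build is a throwaway script run with the same CLI binary the cron uses; the path below is a deliberate placeholder:

    <?php
    // Check what the stream functions do when fopen() has failed;
    // the path is deliberately nonexistent.
    $fh = fopen('/nonexistent/dump.sql', 'rb'); // false, plus a warning

    var_dump($fh);              // bool(false)
    var_dump(feof($fh));        // NULL on PHP 5.x (falsy), so !feof($fh) stays true
    var_dump(fread($fh, 4096)); // NULL as well: no data would ever reach the .gz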
Kurogane Posted October 29, 2016

I have the same issue; I'm using PHP 5.6. When I click Force Offsite Backup it fails, but if I click Download Backup it gives me a backup without any issue. Is there any way to run the backup via CLI to see why it is failing? The web server and PHP error logs show nothing.
Michael Posted October 29, 2016

(quoting Kurogane's post above) Does the settings test work?
Kurogane Posted October 29, 2016

Yes, but the remote backup gives me a zero-byte .gz file. And yes, I have mysqldump and it works fine via the console; I just want to run the backup via CLI to see the actual error and why it is failing.
Paul Posted October 31, 2016

I would enable error reporting in /config/blesta.php (change the error reporting value from "0" to "-1"). Then disable the cron temporarily, under Settings > System > Automation, and run it manually when the next backup should be processed. It may output more errors. If the issue occurs only when your CLI/cron environment runs it, and not via the web, then you can execute the same cron command via SSH/CLI instead.
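For reference, the change Paul describes looks roughly like the following; the exact contents of /config/blesta.php vary by Blesta version, and the install path in the comment is a placeholder:

    <?php
    // /config/blesta.php (excerpt only; surrounding settings omitted)

    // Error reporting: 0 in production; -1 surfaces everything while debugging
    Configure::errorReporting(-1);

    // With the automated cron disabled, the task can then be run manually
    // over SSH with the same command normally placed in the crontab:
    //   /usr/bin/php /path/to/blesta/index.php cron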
Kurogane Posted November 1, 2016

What time does the backup run? I set it to 1 day, but I can't find what time. The backup does not work via the web ("Force Offsite Backup") or via cron; the only thing that works is "Download Backup" via the web.
Kurogane Posted November 2, 2016

After a day, the cron run reports no errors, but SFTP still gives me a zero-byte .gz file:

    Attempting to run all tasks for Company.
    Attempting to renew services and create invoices.
    The create invoices task has completed.
    Attempting to apply credits to open invoices.
    There are no invoices to which credits may be applied.
    The apply credits task has completed.
    Attempting to auto debit open invoices.
    The auto debit invoices task has completed.
    Attempting to deliver invoices scheduled for delivery.
    No invoices are scheduled to be delivered.
    The deliver invoices task has completed.
    Attempting to provision paid pending services.
    The paid pending services task has completed.
    Attempting to suspend past due services.
    The suspend services task has completed.
    Attempting to unsuspend paid suspended services.
    The unsuspend services task has completed.
    Attempting to cancel scheduled services.
    The cancel scheduled services task has completed.
    Attempting to process service changes.
    The process service changes task has completed.
    Attempting to process renewing services.
    The process renewing services task has completed.
    Attempting to send payment reminders.
    The payment reminders task has completed.
    Attempting plugin cron for order accept_paid_orders.
    Finished plugin cron for order accept_paid_orders.
    Attempting plugin cron for support_manager poll_tickets.
    Finished plugin cron for support_manager poll_tickets.
    Attempting plugin cron for support_manager close_tickets.
    Finished plugin cron for support_manager close_tickets.
    Attempting to clean up old logs.
    0 old Gateway logs have been deleted.
    0 old Module logs have been deleted.
    The clean logs task has completed.
    All tasks have been completed.
    Attempting to run all system tasks.
    Attempting to validate the license.
    The license validation task has completed.
    Attempting to backup the database to AmazonS3.
    The backup completed successfully.
    The AmazonS3 database backup task has completed.
    Attempting to backup the database via SFTP.
    The backup completed successfully.
    The SFTP database backup task has completed.
    All system tasks have been completed.