e30_dk Posted August 11, 2014
While transferring some data from PrestaShop 1.5 to PrestaShop 1.6.0.9, I unfortunately deleted all the products in 1.5, and I have been trying to restore my backup, which is 66,711,265 bytes. I tried to import it directly through phpMyAdmin, but after a while it told me the file was too big. As far as I know this must be possible; what good is a backup if you cannot reload it after an accident? Can anyone tell me how to restore my backup? Thanks in advance, e30_dk
dioniz Posted August 11, 2014
You should ask your host to do it for you.
bellini13 Posted August 11, 2014
You can logically split the backup file into multiple smaller files. Most of that large data is likely in the connection tables, which you might even consider skipping if you don't care about your statistics in the back office.
e30_dk Posted August 11, 2014 (Author)
I am not sure how to divide the database into smaller units. Can I just edit the backup into several smaller files and then import each of them into the database separately? I don't care about the statistics, as long as I get the database running again; I have already deleted the approximately 300 items that will need updating for PrestaShop 1.6.0.9.
bellini13 Posted August 11, 2014
You would need to open the SQL file in a text editor and split it logically into smaller files by copying and pasting sections into other text files. You cannot split the file arbitrarily; it must be split at logical boundaries. Think of the backup file as two sections, repeated once per table in the database:
1) The first section is the "CREATE TABLE" statement
2) The second section is the "INSERT INTO" statements for that table
Those two sections then repeat for each table, and I would expect you to have about 250 tables. So you do not want to split the file in the middle of one of those sections. Sorry, it's not that easy to explain in a forum post.
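The manual split described above can also be scripted. Here is a minimal, hypothetical sketch (the function name and file-naming scheme are mine) that cuts a dump into one file per table; it assumes each table's section starts with a line beginning with "CREATE TABLE", as in a typical phpMyAdmin export:

```python
def split_dump(src, prefix="part"):
    """Split a SQL dump into one file per table.

    Sketch only: assumes every table section starts with a
    "CREATE TABLE" line; verify this against your own dump first.
    """
    parts, part = [], []
    with open(src, encoding="utf-8") as f:
        for line in f:
            # start a new chunk at every CREATE TABLE boundary
            if line.lstrip().upper().startswith("CREATE TABLE") and part:
                parts.append(part)
                part = []
            part.append(line)
    if part:
        parts.append(part)
    names = []
    for i, chunk in enumerate(parts):
        # part_000.sql holds the dump header before the first table
        name = f"{prefix}_{i:03d}.sql"
        with open(name, "w", encoding="utf-8") as out:
            out.writelines(chunk)
        names.append(name)
    return names
```

Each resulting file can then be imported separately through phpMyAdmin, staying under its upload limit.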
cedricfontaine Posted August 11, 2014
BigDump is a simple PHP script that does everything for you: http://www.ozerov.de/bigdump/usage/
bellini13 Posted August 11, 2014
Unfortunately, Note 1 on their website is what will probably cause this to fail, since phpMyAdmin exports usually use extended inserts:
"Note 1: BigDump will fail processing large tables containing extended inserts. An extended insert contains all table entries within one SQL query. BigDump isn't able to split such SQL queries. In most cases BigDump will stop if some query includes too many lines. But if PHP complains that the allowed memory size is exhausted, or that the MySQL server has gone away, your dump probably also contains extended inserts. Please turn off extended inserts when exporting the database from phpMyAdmin. If you only have a dump file with extended inserts, please ask for our support service in order to convert it into a file usable by BigDump."
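If re-exporting without extended inserts is not an option (as here, where the source data is gone), the dump can be pre-processed instead. A rough, hypothetical sketch (function name mine) that breaks one extended INSERT into per-row statements, assuming the character sequence "),(" never occurs inside quoted data; real dumps can violate that assumption, so check the output row count:

```python
import re

def explode_extended_insert(stmt):
    """Turn one extended INSERT into a list of single-row INSERTs.

    Naive sketch: assumes "),(" never appears inside a quoted value.
    """
    m = re.match(r"(?is)^\s*(INSERT INTO\s+\S+\s+VALUES\s*)\((.*)\);\s*$", stmt)
    if not m:
        return [stmt]  # not an extended insert we recognise; pass through
    head, body = m.group(1), m.group(2)
    # split the value list at each row boundary
    rows = body.split("),(")
    return [f"{head}({row});" for row in rows]
```

Running every statement of the dump through this would produce a file BigDump can process row by row.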
e30_dk Posted August 25, 2014 (Author)
Here is the result of my contact with BigDump:
"BigDump is currently not able to restore a single dump file with multiple databases inside (switched by the USE statement). BigDump is also not able to restore a single specific database from a dump file containing multiple databases."
So what now?
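One way around that limitation, as a rough sketch, is to filter the dump down to a single database before feeding it to BigDump. The function below (name and assumptions mine, not from BigDump) tracks USE statements and keeps only the lines belonging to the database you want; it assumes each USE statement sits on its own line, as phpMyAdmin writes it:

```python
def extract_database(src, dst, wanted):
    """Copy only one database's statements out of a multi-database dump.

    Sketch only: assumes each "USE `name`;" statement is on its own line.
    """
    current = None
    with open(src, encoding="utf-8") as fin, \
         open(dst, "w", encoding="utf-8") as fout:
        for line in fin:
            stripped = line.strip().rstrip(";")
            if stripped.upper().startswith("USE "):
                # remember which database the following lines belong to
                current = stripped[4:].strip().strip("`")
                continue  # drop the USE line itself
            if current == wanted:
                fout.write(line)
```

The resulting single-database file no longer contains any USE statements, which is the form BigDump says it can restore.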