Running out of memory using Database

Hi,
I am trying to migrate a database of products from WordPress / WooCommerce to Kirby. The biggest table, wp_posts, is 600MB+.
When I try to access this table from Kirby, I run into this error:

Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 24576 bytes) in /Users/gvocale/Sites/wordpress-export/kirby/toolkit/lib/database.php on line 318

I have increased the memory limit in php.ini to 4096MB, still without any luck.

The code I’m trying to run is:

<?php foreach($wp_posts->limit(50) as $wp_post): ?>
<?php echo $wp_post->title() ?>
<?php endforeach ?>

What can I do?

Have you tried to set it to unlimited with -1? (Never do this in production.)
Have you tried with a smaller subset of data?
Has your setting actually increased the memory limit? The error still reports 1073741824 bytes, which is 1GB, so it doesn’t look like the 4096MB value has been picked up.
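If you only need this for the migration script itself, you could raise the limit at runtime instead of in php.ini. A minimal sketch, only sensible for a one-off run on your local machine:

<?php

// One-off local migration run only: lift the memory limit for this script.
// -1 means unlimited; never do this in production.
ini_set('memory_limit', '-1');

// Check that the new value actually took effect before re-running the import.
var_dump(ini_get('memory_limit'));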

50 is quite a high limit. What happens if you paginate them? I’m guessing that will hit the database in chunks rather than fetching the whole 50 at once, meaning less memory use, but that’s just a hunch. Something like the sketch below is what I mean.
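This is a rough sketch using plain PDO against the WordPress database rather than the Kirby toolkit; the connection details and the chunk size are placeholders, not anything Kirby-specific:

<?php

// Placeholder connection details - adjust to your own WordPress database.
$pdo = new PDO('mysql:host=localhost;dbname=wordpress;charset=utf8mb4', 'user', 'password');

$chunkSize = 50;
$offset    = 0;

do {
    // Pull one chunk at a time instead of loading the whole table.
    // $chunkSize and $offset are integers we control, so interpolating them is safe here.
    $rows = $pdo->query("SELECT ID, post_title FROM wp_posts ORDER BY ID LIMIT $chunkSize OFFSET $offset")
                ->fetchAll(PDO::FETCH_ASSOC);

    foreach ($rows as $row) {
        // Hand each row over to whatever creates the Kirby content.
        echo $row['post_title'], PHP_EOL;
    }

    $offset += $chunkSize;
} while (count($rows) === $chunkSize);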

Do you have to keep the data as a database? @texnixe made a plugin for CSV import which will create Kirby articles from a spreadsheet. You could go down that route.

This is for importing from a database to Kirby. So the data has to come out of the database in the first place…

Well, yes, it does… the original poster is struggling to get the database data working directly, so I figured doing it in batches via CSV was a potential solution. It’s easy enough to get a CSV export via SqlPro; phpMyAdmin will likely fall over with data that large.

Well, yes, you are right, I didn’t think about exporting to a different format; maybe I shouldn’t post in the middle of the night.

Nevertheless, it should be possible to do this right from the database; I have successfully done it multiple times in the past (with smaller databases, though). But as I already suggested above, try with smaller chunks.

@gvocale Is that piece of code you quoted above really the code that causes the memory error? The line of code it refers to in the database class seems to suggest it is rather the query that is causing this. Maybe reduce the number of articles in the query by setting a from-to date or something like this.
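For example, a narrower query along these lines; the column names come from the standard wp_posts schema, but the post type and date range are just placeholders:

<?php

// Placeholder connection - adjust to your own WordPress database.
$pdo = new PDO('mysql:host=localhost;dbname=wordpress;charset=utf8mb4', 'user', 'password');

// Only pull a limited slice of wp_posts per run, e.g. one post type and date range.
$stmt = $pdo->prepare(
    'SELECT ID, post_title FROM wp_posts
     WHERE post_type = :type AND post_date BETWEEN :from AND :to'
);
$stmt->execute([
    'type' => 'product',
    'from' => '2016-01-01 00:00:00',
    'to'   => '2016-06-30 23:59:59',
]);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    echo $row['post_title'], PHP_EOL;
}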

Probably the best memory-saving option would be to fetch the data row by row instead of loading it all into a variable with fetchAll().
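In plain PDO terms the idea looks roughly like this; the connection details are placeholders, and whether you go through the Kirby toolkit or straight to PDO is up to you:

<?php

// Placeholder connection - adjust to your own WordPress database.
$pdo = new PDO('mysql:host=localhost;dbname=wordpress;charset=utf8mb4', 'user', 'password');

// Ask the MySQL driver not to buffer the whole result set in PHP memory.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT ID, post_title FROM wp_posts');

// fetch() returns one row per call instead of materialising everything like fetchAll().
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    echo $row['post_title'], PHP_EOL;
}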

This seems to have fixed it for now! Thank you :)

I’ll mark this as solved for the moment. I guess you are only doing this on localhost. For a one-off, setting the memory to unlimited is OK, just make sure to set it back afterward.

The best practice solution in the long run for dealing with that much data would be the row by row fetching I suggested above, however.

Feel free to reopen if you encounter any issues.