A project I'm working on has 300k failed jobs. When I try to run `artisan queue:retry`, it fails with the following error:
PHP Fatal error: Allowed memory size of 536870912 bytes exhausted
After looking into how the RetryCommand works, I realized that rather than querying just the IDs from the failed_jobs table, it loads the entire table into memory and then plucks the ID column. No matter how high I set the memory limit, the server won't be able to handle loading all 300k failed jobs into memory.
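For reference, the relevant logic in `RetryCommand` looks roughly like this (paraphrased from the framework source; the exact code may differ between Laravel versions):

```php
// Paraphrase of Illuminate\Queue\Console\RetryCommand::getJobIds().
protected function getJobIds()
{
    $ids = (array) $this->argument('id');

    if (count($ids) === 1 && $ids[0] === 'all') {
        // all() hydrates every failed_jobs row (payload, exception trace,
        // and all other columns) and only then plucks the id column in PHP.
        $ids = Arr::pluck($this->laravel['queue.failer']->all(), 'id');
    }

    return $ids;
}
```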
This is a very inefficient way to load the failed job IDs. Any thoughts on how this could be improved without breaking anything?
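One possible direction, shown here only as a sketch assuming the database failed-job provider and the default failed_jobs table, would be to select just the id column instead of hydrating whole records:

```php
use Illuminate\Support\Facades\DB;

// Sketch: fetch only the IDs of the failed jobs. Selecting a single column
// keeps memory proportional to the number of IDs rather than to full rows.
function failedJobIds(): array
{
    return DB::table('failed_jobs')
        ->orderBy('id')
        ->pluck('id')
        ->all();
}
```

A method along these lines could live on the failed-job provider and be called by the command when retrying everything, without changing behavior for callers that still need full records.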
Steps To Reproduce
Populate the failed_jobs table with 300k failed jobs
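A rough way to seed that many rows (hypothetical snippet; the column names match the default failed_jobs migration in recent Laravel versions):

```php
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Str;

// Hypothetical seeder: bulk-insert 300k placeholder failed jobs in chunks
// so the seeding itself stays well within the memory limit.
$rows = [];

for ($i = 1; $i <= 300000; $i++) {
    $rows[] = [
        'uuid'       => (string) Str::uuid(),
        'connection' => 'database',
        'queue'      => 'default',
        'payload'    => '{}',
        'exception'  => 'Example exception',
        'failed_at'  => now(),
    ];

    // Flush every 1,000 rows.
    if (count($rows) === 1000) {
        DB::table('failed_jobs')->insert($rows);
        $rows = [];
    }
}
```

Running `php artisan queue:retry all` (or equivalent) against a table seeded like this should reproduce the memory exhaustion described above.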