The free shipping bug wasn't related to your problem, Lev'...
The issue we've got is with the Windows path and, initially at least, the way the system checks whether the feed is being generated in the cache directory or not... Windows paths use the backslash as the directory separator, and since backslash is also the regex escape character, that's just stunningly unhelpful when it comes to regex.
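To illustrate the problem: a naive check that splices a Windows cache path straight into a regex pattern will misfire, because every backslash in the path gets read as an escape. A hedged sketch of the safe version (function name and paths here are hypothetical, not the mod's actual code) uses preg_quote() to escape the path first:

```php
<?php
// Hypothetical sketch of a cache-directory check that survives Windows
// paths. preg_quote() escapes backslashes (and the pattern delimiter),
// so 'C:\inetpub\...' is matched literally instead of as regex escapes.

function isInCacheDir($feedPath, $cacheDir)
{
    $pattern = '#^' . preg_quote($cacheDir, '#') . '#i';
    return (bool) preg_match($pattern, $feedPath);
}
```

So `isInCacheDir('C:\\inetpub\\store\\cache\\feed.xml', 'C:\\inetpub\\store\\cache')` matches, where the unquoted version would have treated `\s` and friends as regex metacharacters.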
// Free shipping? (prodfreeshipping == 1 means the product ships free)
if (isset($row['prodfreeshipping']) && $row['prodfreeshipping'] == 1) {
    $entry[] = "<g:shipping><g:price><![CDATA[0]]></g:price></g:shipping>\n</entry>";
}
// Not free shipping, so get a quote if we're in a relevant country.
elseif (in_array($countryiso, array('AU', 'US', 'GB', 'DE', 'FR'))) {
ADDED: Free shipping and Fixed shipping product settings now interpreted correctly
ADDED: More verbose information about products skipped and why
FIXED: Resolved path check for Windows hosted ISC stores
Version 1.4.7 : Revision 469-470
UPDATED: Resolved incompatibility with 6.1.7 onwards
UPDATED: 6.1.6 version of shoppingcomparison.categorybox.tpl included
ADDED: Now pulls data from 6.1.7 google product search table if available
UPDATED: US Google merchant taxonomy updated
PENDING: Does NOT yet pull the Google product type from 6.1.7, so stores on those versions onwards need to set the product for product comparison AND enter something on the Google Product Search tab to ensure the product saves.
Thank you for your work. The script works, and I'm exporting to US_froogle-export.xml
The problem is it loads SUPER fast at the beginning, around 100 products per second, but then gets slower and slower. Right now it seems to be taking 2 seconds per product.
And I have 220k products, so that would be 440,000 seconds / 3600 = ~122 hours (about 5 days).
Do you have any idea about this?
levinthan9 wrote:Thank you for your work. The script works, and I'm exporting to US_froogle-export.xml
Glad we fixed most of it...
The problem is it loads SUPER fast at the beginning, around 100 products per second, but then gets slower and slower. Right now it seems to be taking 2 seconds per product.
And I have 220k products, so that would be 440,000 seconds / 3600 = ~122 hours (about 5 days).
Do you have any idea about this?
I suspect it's an operating system issue, in that Windows is probably loading the whole file into memory and appending the new data to the end, so the bigger the file gets, the more memory each append consumes...
It'd be useful to know how big the feed file is now, as an indicator...
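If the slowdown really is the whole feed being held or rewritten per product, the usual fix is to stream each entry straight to disk as it's built. A minimal sketch, assuming an `$entry` array of XML fragments like the snippet above (the function name here is hypothetical, not the mod's actual code):

```php
<?php
// Hypothetical sketch: append each product's entry to the feed file as it
// is generated, rather than accumulating the whole feed in memory first.
// 'ab' opens in binary append mode, so nothing already on disk is rewritten
// and memory use stays flat however many products are exported.

function writeEntry($feed, array $entry)
{
    // $entry is the per-product array of XML fragments the script builds
    fwrite($feed, implode('', $entry) . "\n");
}

$feed = fopen('US_froogle-export.xml', 'ab');
```

With this pattern the cost per product stays constant, so a 220k-product export shouldn't decay from 100 products/sec down to 2 sec/product.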
Thanks for the kind reply. Do you think it's because of an Apache/script time or memory limitation? If so, do you know where I can edit my Apache config to lift that limit? I have 32GB of memory and a 6-core CPU, so I don't think I'll have any problem with a high load on Apache/MySQL.
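For what it's worth, the limits in question normally live in PHP rather than Apache: `max_execution_time` and `memory_limit` in php.ini (the file's location varies by host). They can also be lifted from inside the script itself; a hedged sketch, with the values as illustrative assumptions:

```php
<?php
// Lift PHP's own limits for one long-running export request.
// These override php.ini for this script only; edit php.ini
// (max_execution_time, memory_limit) to change them globally.
set_time_limit(0);                // 0 = no execution time limit
ini_set('memory_limit', '1024M'); // raise as needed; -1 = unlimited
```

That said, if the real cause is the feed growing in memory per product, raising `memory_limit` only delays the slowdown rather than fixing it.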