2026-05-02 07:30:17 [scrapy.utils.log] INFO: Scrapy 2.12.0 started (bot: zomato_order_history)
2026-05-02 07:30:17 [scrapy.utils.log] INFO: Versions: lxml 5.3.0.0, libxml2 2.12.9, cssselect 1.2.0, parsel 1.9.1, w3lib 2.2.1, Twisted 24.11.0, Python 3.10.12 (main, Mar 3 2026, 11:56:32) [GCC 11.4.0], pyOpenSSL 24.3.0 (OpenSSL 3.4.0 22 Oct 2024), cryptography 44.0.0, Platform Linux-6.8.0-1044-aws-aarch64-with-glibc2.35
2026-05-02 07:30:17 [zomato_discounts] INFO: Dynamic attribute _job = 107_2026-05-02T07_30_02
2026-05-02 07:30:17 [zomato_discounts] INFO: Dynamic attribute scheduled_job_id = 7382
2026-05-02 07:30:17 [zomato_discounts] INFO: Dynamic attribute SCRAPEOPS_JOB_NAME = 107
2026-05-02 07:30:17 [scrapy.addons] INFO: Enabled addons: []
2026-05-02 07:30:17 [scrapy.extensions.telnet] INFO: Telnet Password: 51d263c92aecbf81
2026-05-02 07:30:17 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats',
 'scrapeops_scrapy.extension.ScrapeOpsMonitor']
2026-05-02 07:30:17 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'zomato_order_history',
 'CONCURRENT_REQUESTS': 8,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_FILE': '/home/ubuntu/restaverse_spiders/logs/zomato_order_history/zomato_discounts/107_2026-05-02T07_30_02.log',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'zomato_order_history.spiders',
 'RETRY_HTTP_CODES': [500, 502, 503, 504],
 'RETRY_TIMES': 1,
 'SPIDER_MODULES': ['zomato_order_history.spiders'],
 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor'}
2026-05-02 07:30:33 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
 'zomato_order_history.middlewares.ZomatoCentralRateLimitMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy_user_agents.middlewares.RandomUserAgentMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapeops_scrapy.middleware.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2026-05-02 07:30:33 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2026-05-02 07:30:33 [scrapy.middleware] INFO: Enabled item pipelines:
['zomato_order_history.pipelines.ZomatoOrderHistoryPipeline']
2026-05-02 07:30:33 [scrapy.core.engine] INFO: Spider opened
2026-05-02 07:30:33 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2026-05-02 07:30:34 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6029
2026-05-02 07:30:47 [zomato_discounts] INFO: Job merchant-api-gateway::mx_order_history_download_v4::v1::16099137037729797498::1777707000000 still in progress. Poll attempt 1
2026-05-02 07:31:33 [scrapy.extensions.logstats] INFO: Crawled 6 pages (at 6 pages/min), scraped 0 items (at 0 items/min)
2026-05-02 07:31:49 [scrapy.utils.log] INFO: Scrapy 2.12.0 started (bot: zomato_order_history)
2026-05-02 07:31:49 [scrapy.utils.log] INFO: Versions: lxml 5.3.0.0, libxml2 2.12.9, cssselect 1.2.0, parsel 1.9.1, w3lib 2.2.1, Twisted 24.11.0, Python 3.10.12 (main, Mar 3 2026, 11:56:32) [GCC 11.4.0], pyOpenSSL 24.3.0 (OpenSSL 3.4.0 22 Oct 2024), cryptography 44.0.0, Platform Linux-6.8.0-1044-aws-aarch64-with-glibc2.35
2026-05-02 07:31:49 [zomato_discounts] INFO: Dynamic attribute _job = 107_2026-05-02T07_30_02
2026-05-02 07:31:49 [zomato_discounts] INFO: Dynamic attribute scheduled_job_id = 11264
2026-05-02 07:31:49 [zomato_discounts] INFO: Dynamic attribute SCRAPEOPS_JOB_NAME = 107
2026-05-02 07:31:49 [scrapy.addons] INFO: Enabled addons: []
2026-05-02 07:31:49 [scrapy.extensions.telnet] INFO: Telnet Password: adc902647d28b362
2026-05-02 07:31:49 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats',
 'scrapeops_scrapy.extension.ScrapeOpsMonitor']
2026-05-02 07:31:49 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'zomato_order_history',
 'CONCURRENT_REQUESTS': 8,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_FILE': '/home/ubuntu/restaverse_spiders/logs/zomato_order_history/zomato_discounts/107_2026-05-02T07_30_02.log',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'zomato_order_history.spiders',
 'RETRY_HTTP_CODES': [500, 502, 503, 504],
 'RETRY_TIMES': 1,
 'SPIDER_MODULES': ['zomato_order_history.spiders'],
 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor'}
2026-05-02 07:31:54 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
 'zomato_order_history.middlewares.ZomatoCentralRateLimitMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy_user_agents.middlewares.RandomUserAgentMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapeops_scrapy.middleware.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2026-05-02 07:31:54 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2026-05-02 07:31:54 [scrapy.middleware] INFO: Enabled item pipelines:
['zomato_order_history.pipelines.ZomatoOrderHistoryPipeline']
2026-05-02 07:31:54 [scrapy.core.engine] INFO: Spider opened
2026-05-02 07:31:54 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2026-05-02 07:31:54 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6025
2026-05-02 07:32:05 [py.warnings] WARNING: /home/ubuntu/restaverse_spiders/eggs/zomato_order_history/1776486973.egg/zomato_order_history/spiders/discounts.py:378: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
  df_delivered["Discount construct"] = df_delivered[
2026-05-02 07:32:05 [scrapy.core.engine] INFO: Closing spider (finished)
2026-05-02 07:32:05 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'client_id': '107',
 'downloader/request_bytes': 24137,
 'downloader/request_count': 6,
 'downloader/request_method_count/GET': 5,
 'downloader/request_method_count/POST': 1,
 'downloader/response_bytes': 138465,
 'downloader/response_count': 6,
 'downloader/response_status_count/200': 6,
 'elapsed_time_seconds': 11.234863,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2026, 5, 2, 7, 32, 5, 324941, tzinfo=datetime.timezone.utc),
 'httpcompression/response_bytes': 106836,
 'httpcompression/response_count': 5,
 'item_scraped_count': 1,
 'items_per_minute': None,
 'log_count/INFO': 10,
 'log_count/WARNING': 1,
 'memusage/max': 114905088,
 'memusage/startup': 114905088,
 'request_depth_max': 5,
 'response_received_count': 6,
 'responses_per_minute': None,
 'scheduler/dequeued': 6,
 'scheduler/dequeued/memory': 6,
 'scheduler/enqueued': 6,
 'scheduler/enqueued/memory': 6,
 'start_time': datetime.datetime(2026, 5, 2, 7, 31, 54, 90078, tzinfo=datetime.timezone.utc)}
2026-05-02 07:32:05 [scrapy.core.engine] INFO: Spider closed (finished)
2026-05-02 07:32:07 [scrapy.core.downloader.handlers.http11] WARNING: Got data loss in
https://prod-jumbo-insights-service.s3.ap-southeast-1.amazonaws.com/merchant-api-gateway/mx_order_history_download_v4/v1/merchant-api-gateway%3A%3Amx_order_history_download_v4%3A%3Av1%3A%3A16099137037729797498%3A%3A1777707000000/order_history_20260501_20260501.zip?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Checksum-Mode=ENABLED&X-Amz-Credential=ASIA5RVPIJ4E6CMMLNW3%2F20260502%2Fap-southeast-1%2Fs3%2Faws4_request&X-Amz-Date=20260502T073054Z&X-Amz-Expires=21600&X-Amz-Security-Token=IQoJb3JpZ2luX2VjEG8aDmFwLXNvdXRoZWFzdC0xIkcwRQIgJvJOAjfS481INgsSTLulwk5omemVcvvXwPJuiiOO4VECIQDaEUf4EjQXRQ%2BcYqrNr0aYuVvumHaLzHzrLCT6KwgphSr%2BAwg4EAAaDDkzMTMwMTcwNzUyOSIMF0NkYGRHBZotaRn%2BKtsDuyCBWRrG8mFXwUYmVzmwb%2FkdawFHskXDlZy5TWvcNJzA8o%2BfXv%2BtVNLTq55Y0ftRTNWJMT7j9kzwkO04p0g%2F5rLDpqr1sLUebzgC5Xun4kHXiBDJL%2BHmWSovG%2FtmbUpfe2JEUJY4LgimIT2LwVgE2P1ItsQCim8dTbbO6%2Fq0WWs7mfi5oXSCdy%2BPkB1f5H2mJLS1HybUVMq%2FOKCbUqhan3E2jSHdT6MV1Z6r2Sla3PGpwdJPLIFyiEDiMJnStNvMU%2FOrd%2B%2BTK34Z5WmCkL9x02rOSP3zJrnLNuXW3piWX5z6%2FzZCA7%2BwLcJ%2FcTKi0YbOUihyBjHE%2BXNmo1S0tNElvPDiF2u8lwitSrRvuF9YVARuSxDW4L%2FRbxJAyKpmnFj9C0G4eHXkRj%2FOpvUDWEgwA%2B3P36eNLe51HLzdiOhNKyIop7I%2FhRMVd3RqKuY9Qo99xkAv6jX9gCOLHaWVUCESvqU79YzoPeHFg5KOWLKVHCeHAQmuAuMiHaMDRm0JnBPR%2BrK5wTBCIKjszH0JI7aNUEB3H6MxMffZw%2BeOi1XeZI1k1A%2BJN%2BTSSN%2FWlPWhFNV8ehqn2X2ImCgD%2FlBCAcBQxiMVqCgYShRJ8IXaGyfZw0h%2FVHP5243zbx6nzTClv9bPBjqlAY4XHVl86kPri%2FhuJl9n%2FjRhoKknB7Oo7vQDOoKH44qLl6kg0vk3PRP6Pe7PjuGj3toN7vVYEJ%2FPzHbiAOF52gPfidRIyIdoAkT1PfDr5MrKoV5%2B19v43mRmDNoZ7T4p%2BEc2Fvy2ds0wAkCxHXwQ1%2BcoEI5oUrjKw8UCAE7c6pKI%2F0ZmvCEwQ%2B4BukxzOZPSlVymieKAlqje%2Fb7rMGFFQUu6qxcnsQ%3D%3D&X-Amz-SignedHeaders=host&x-id=GetObject&X-Amz-Signature=f716214b42da854029d8e20919697d56a3a903a140557f0f31030f052beb1ec8. 
If you want to process broken responses set the setting DOWNLOAD_FAIL_ON_DATALOSS = False -- This message won't be shown in further requests
2026-05-02 07:32:10 [py.warnings] WARNING: /home/ubuntu/restaverse_spiders/eggs/zomato_order_history/1776486973.egg/zomato_order_history/spiders/discounts.py:378: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
  df_delivered["Discount construct"] = df_delivered[
2026-05-02 07:32:10 [scrapy.core.engine] INFO: Closing spider (finished)
2026-05-02 07:32:10 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'client_id': '107',
 'downloader/exception_count': 1,
 'downloader/exception_type_count/twisted.web._newclient.ResponseFailed': 1,
 'downloader/request_bytes': 31130,
 'downloader/request_count': 8,
 'downloader/request_method_count/GET': 7,
 'downloader/request_method_count/POST': 1,
 'downloader/response_bytes': 138359,
 'downloader/response_count': 7,
 'downloader/response_status_count/200': 7,
 'elapsed_time_seconds': 97.237643,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2026, 5, 2, 7, 32, 10, 315056, tzinfo=datetime.timezone.utc),
 'httpcompression/response_bytes': 104928,
 'httpcompression/response_count': 4,
 'item_scraped_count': 1,
 'items_per_minute': None,
 'log_count/INFO': 12,
 'log_count/WARNING': 2,
 'memusage/max': 127361024,
 'memusage/startup': 114909184,
 'request_depth_max': 6,
 'response_received_count': 7,
 'responses_per_minute': None,
 'retry/count': 1,
 'retry/reason_count/twisted.web._newclient.ResponseFailed': 1,
 'scheduler/dequeued': 8,
 'scheduler/dequeued/memory': 8,
 'scheduler/enqueued': 8,
 'scheduler/enqueued/memory': 8,
 'start_time': datetime.datetime(2026, 5, 2, 7, 30, 33, 77413, tzinfo=datetime.timezone.utc)}
2026-05-02 07:32:10 [scrapy.core.engine] INFO: Spider closed (finished)
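The SettingWithCopyWarning logged at discounts.py:378 comes from assigning a new column on a filtered slice of a DataFrame. A minimal sketch of the fix pandas suggests: materialize the slice with `.copy()` before adding the derived column. Only `df_delivered` and the "Discount construct" column name appear in the log; the frame contents and the `status`/`discount` columns below are hypothetical stand-ins.

```python
import pandas as pd

# Hypothetical stand-in for the order-history frame built in discounts.py;
# the data and column names are illustrative, not taken from the spider.
df = pd.DataFrame({
    "status": ["DELIVERED", "CANCELLED", "DELIVERED"],
    "discount": [50, 0, 30],
})

# Problematic pattern (triggers SettingWithCopyWarning): assigning on a slice.
#   df_delivered = df[df["status"] == "DELIVERED"]
#   df_delivered["Discount construct"] = ...

# Fix: take an explicit copy of the slice, then assign on the copy.
df_delivered = df[df["status"] == "DELIVERED"].copy()
df_delivered["Discount construct"] = df_delivered["discount"].astype(str) + "% off"

print(df_delivered["Discount construct"].tolist())  # ['50% off', '30% off']
```

Assigning via `df.loc[mask, "Discount construct"] = ...` on the parent frame is the other option the warning text offers; `.copy()` is usually the simpler change when the slice is kept around, as `df_delivered` appears to be here.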
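The data-loss warning on the S3 zip download was retried once per the second stats dump (retry/count: 1), and that run still finished with one item scraped. If truncated responses should instead be passed through to the spider, Scrapy's own message names the setting; a sketch of the relevant settings.py fragment, shown alongside the retry values already visible in the "Overridden settings" entry (whether to accept partial zip archives is a judgment call, since a truncated zip may not be extractable):

```python
# settings.py fragment: accept responses whose body is shorter than the
# advertised Content-Length instead of treating them as download failures.
DOWNLOAD_FAIL_ON_DATALOSS = False

# Existing retry policy from this log's "Overridden settings" entry:
RETRY_HTTP_CODES = [500, 502, 503, 504]
RETRY_TIMES = 1
```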