2026-03-05 07:31:05 [scrapy.utils.log] INFO: Scrapy 2.12.0 started (bot: zomato_order_history)
2026-03-05 07:31:05 [scrapy.utils.log] INFO: Versions: lxml 5.3.0.0, libxml2 2.12.9, cssselect 1.2.0, parsel 1.9.1, w3lib 2.2.1, Twisted 24.11.0, Python 3.10.12 (main, Jan 26 2026, 14:55:28) [GCC 11.4.0], pyOpenSSL 24.3.0 (OpenSSL 3.4.0 22 Oct 2024), cryptography 44.0.0, Platform Linux-6.8.0-1044-aws-aarch64-with-glibc2.35
2026-03-05 07:31:05 [zomato_discounts] INFO: Dynamic attribute _job = 3_2026-03-05T07_30_02
2026-03-05 07:31:05 [zomato_discounts] INFO: Dynamic attribute scheduled_job_id = 7368
2026-03-05 07:31:05 [zomato_discounts] INFO: Dynamic attribute SCRAPEOPS_JOB_NAME = 3
2026-03-05 07:31:05 [scrapy.addons] INFO: Enabled addons: []
2026-03-05 07:31:05 [scrapy.extensions.telnet] INFO: Telnet Password: 485b767a3bc83537
2026-03-05 07:31:05 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats',
 'scrapeops_scrapy.extension.ScrapeOpsMonitor']
2026-03-05 07:31:05 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'zomato_order_history',
 'CONCURRENT_REQUESTS': 8,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_FILE': '/home/ubuntu/restaverse_spiders/logs/zomato_order_history/zomato_discounts/3_2026-03-05T07_30_02.log',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'zomato_order_history.spiders',
 'RETRY_HTTP_CODES': [500, 502, 503, 504],
 'RETRY_TIMES': 1,
 'SPIDER_MODULES': ['zomato_order_history.spiders'],
 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor'}
2026-03-05 07:31:11 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
 'zomato_order_history.middlewares.ZomatoCentralRateLimitMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy_user_agents.middlewares.RandomUserAgentMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapeops_scrapy.middleware.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2026-03-05 07:31:11 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2026-03-05 07:31:11 [scrapy.middleware] INFO: Enabled item pipelines:
['zomato_order_history.pipelines.ZomatoOrderHistoryPipeline']
2026-03-05 07:31:11 [scrapy.core.engine] INFO: Spider opened
2026-03-05 07:31:11 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2026-03-05 07:31:12 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6026
2026-03-05 07:32:11 [scrapy.extensions.logstats] INFO: Crawled 5 pages (at 5 pages/min), scraped 0 items (at 0 items/min)
2026-03-05 07:33:08 [scrapy.core.downloader.handlers.http11] WARNING: Got data loss in
https://prod-jumbo-insights-service.s3.ap-southeast-1.amazonaws.com/merchant-api-gateway/mx_order_history_download_v4/v1/merchant-api-gateway%3A%3Amx_order_history_download_v4%3A%3Av1%3A%3A6449272297541959878%3A%3A1772695800000/order_history_20260304_20260304.zip?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Checksum-Mode=ENABLED&X-Amz-Credential=ASIA5RVPIJ4ERAGBHVZE%2F20260305%2Fap-southeast-1%2Fs3%2Faws4_request&X-Amz-Date=20260305T073137Z&X-Amz-Expires=7200&X-Amz-Security-Token=IQoJb3JpZ2luX2VjEPz%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaDmFwLXNvdXRoZWFzdC0xIkcwRQIhAIdLnEk3uEdETRDSX%2Fc%2FQObu5yWtRDJ9Nk%2BrsyYerJDoAiBQjLH8Og1pcPtX9eXxfqKywb8LvT5Sp6ZAueMueVpvOyqHBAjF%2F%2F%2F%2F%2F%2F%2F%2F%2F%2F8BEAAaDDkzMTMwMTcwNzUyOSIMiWW74huFQmRtOUYgKtsDBYWhBV1BYUL1oljIhfZWgCzfwBEr2L8CIHyipqE2ZSnlmotzV9FhT3dmjR54ssr5kLNaqqFVaLsEVtvauk37qdsLUE2Q%2BgVtPR7TMbGGCKYAfrylhS4ZovXrU%2FlATW0rlRDjVZLlV7DIhBQr0VG7H3HlSSbJdfsnFiCRbjYkbp%2FoCgWSRuQbpHHzX12L2z3uR3uDs6WX6J6HotuM4zD7ZvFkXL9luPcKi8VtkZ1ZWAqUEgU9Q1ebIOwZGD5m9XeELOxyeH2v2TRwCiV9gidlXtQxD%2B7Ha2KavLBZvjVS4UDWq6eMtW4K66rJEYH1Rn2zv27TyDlbQwVDoChML5wKe7Cpp7ak45YF0EEXaefIG9Apz%2FFzQMTlpttng2SVhtFJD1Bq6%2FKNDV2pLQGfe3MTPk%2FWI%2BnOEtYzKO4ptC3IJohSt5bwpkixEcvUDAgJN7ILtU%2FQ25tPmwYIkF0Uz1mjqYmva9GyCn8gc9ccfNl8wip%2BJO7zhkBO5h%2FscodiBF9S4wL6eHJmcUskK1eAUDhvJp%2FA2%2B8K2UabORNUmXPlH6M1xxgU85WQjgFJcFord1I9%2FFdZqwJX%2FH%2FG9MaoK%2FyNvkIROVQYXRgRT2wnADydGOqpsn15a9YdLt3eXDCQ8aPNBjqlAf3PHgQTDU9CPFi63vaHPoHM5N1pohQP3urRGEvTBHjztWeFSo%2BcmG%2FSw%2BD4ZDfur%2BxZgPkEa%2FfYqZEiXZQmfTk0nF5Mcu%2FPvU3bK8KHpX3UvzDryTCfMFRQTElpI5PqXxJGrH6d6e7yI1i3XCNEniH96gNYtgNCH%2BqTl5N7oIQ%2BfCHO5o5KBHpnNgft8A9YD8KJ9qhALE%2FKAEXfFRksKuLSUIIRpw%3D%3D&X-Amz-SignedHeaders=host&x-id=GetObject&X-Amz-Signature=c4539ed465f065380029f34544ab7a120fec5b6f53351e22236d88d11a95a8e7. 
If you want to process broken responses set the setting DOWNLOAD_FAIL_ON_DATALOSS = False -- This message won't be shown in further requests
2026-03-05 07:33:11 [py.warnings] WARNING: /home/ubuntu/restaverse_spiders/eggs/zomato_order_history/1770610074.egg/zomato_order_history/spiders/discounts.py:378: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead
See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
  df_delivered["Discount construct"] = df_delivered[
2026-03-05 07:33:11 [scrapy.core.engine] INFO: Closing spider (finished)
2026-03-05 07:33:12 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'client_id': '3',
 'downloader/exception_count': 1,
 'downloader/exception_type_count/twisted.web._newclient.ResponseFailed': 1,
 'downloader/request_bytes': 28247,
 'downloader/request_count': 7,
 'downloader/request_method_count/GET': 6,
 'downloader/request_method_count/POST': 1,
 'downloader/response_bytes': 553303,
 'downloader/response_count': 6,
 'downloader/response_status_count/200': 6,
 'elapsed_time_seconds': 119.832071,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2026, 3, 5, 7, 33, 11, 757356, tzinfo=datetime.timezone.utc),
 'httpcompression/response_bytes': 318108,
 'httpcompression/response_count': 5,
 'item_scraped_count': 1,
 'items_per_minute': None,
 'log_count/INFO': 11,
 'log_count/WARNING': 2,
 'memusage/max': 128249856,
 'memusage/startup': 114880512,
 'request_depth_max': 5,
 'response_received_count': 6,
 'responses_per_minute': None,
 'retry/count': 1,
 'retry/reason_count/twisted.web._newclient.ResponseFailed': 1,
 'scheduler/dequeued': 7,
 'scheduler/dequeued/memory': 7,
 'scheduler/enqueued': 7,
 'scheduler/enqueued/memory': 7,
 'start_time': datetime.datetime(2026, 3, 5, 7, 31, 11, 925285, tzinfo=datetime.timezone.utc)}
2026-03-05 07:33:12 [scrapy.core.engine] INFO: Spider closed (finished)
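Note on the "Got data loss" warning: the message itself names the remedy, Scrapy's `DOWNLOAD_FAIL_ON_DATALOSS` setting. A minimal sketch of the two documented ways to accept truncated responses, shown here as plain assignments rather than a full Scrapy project (whether silently accepting a truncated S3 zip is safe for this pipeline is a separate question; the response is flagged `dataloss` either way):

```python
# Sketch only. Option 1 -- project-wide, in settings.py:
DOWNLOAD_FAIL_ON_DATALOSS = False  # broken responses are passed to the spider instead of raising ResponseFailed

# Option 2 -- per request, via the documented Request.meta key, keeping the
# global default strict for everything else:
request_meta = {"download_fail_on_dataloss": False}
```

The per-request form is usually preferable here, since only the presigned S3 download seems affected while the other six requests completed normally.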
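Note on the `SettingWithCopyWarning` from `discounts.py:378`: the echoed source line is truncated, but the pattern it points at is assigning a new column into a filtered slice. A hedged sketch of the usual fix, taking an explicit `.copy()` of the slice before assigning (the column names and filter below are illustrative, not the spider's actual code):

```python
import pandas as pd

# Illustrative stand-in for the downloaded order history; the real
# df_delivered and the right-hand side of the truncated assignment are unknown.
orders = pd.DataFrame({"status": ["Delivered", "Cancelled", "Delivered"],
                       "discount": [50, 0, 30]})

# Filtering may return a view; assigning into it triggers the warning.
# An explicit copy makes df_delivered an independent frame, so the
# "Discount construct" assignment is unambiguous:
df_delivered = orders[orders["status"] == "Delivered"].copy()
df_delivered["Discount construct"] = df_delivered["discount"].astype(str) + " off"
```

Alternatively, `orders.loc[mask, "Discount construct"] = ...` writes through to the original frame in one step, which is what the warning's `.loc[row_indexer,col_indexer]` hint suggests.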