2026-03-02 07:31:00 [scrapy.utils.log] INFO: Scrapy 2.12.0 started (bot: zomato_order_history)
2026-03-02 07:31:00 [scrapy.utils.log] INFO: Versions: lxml 5.3.0.0, libxml2 2.12.9, cssselect 1.2.0, parsel 1.9.1, w3lib 2.2.1, Twisted 24.11.0, Python 3.10.12 (main, Jan 26 2026, 14:55:28) [GCC 11.4.0], pyOpenSSL 24.3.0 (OpenSSL 3.4.0 22 Oct 2024), cryptography 44.0.0, Platform Linux-6.8.0-1044-aws-aarch64-with-glibc2.35
2026-03-02 07:31:00 [zomato_discounts] INFO: Dynamic attribute _job = 70_2026-03-02T07_30_02
2026-03-02 07:31:00 [zomato_discounts] INFO: Dynamic attribute scheduled_job_id = 7373
2026-03-02 07:31:00 [zomato_discounts] INFO: Dynamic attribute SCRAPEOPS_JOB_NAME = 70
2026-03-02 07:31:00 [scrapy.addons] INFO: Enabled addons: []
2026-03-02 07:31:00 [scrapy.extensions.telnet] INFO: Telnet Password: ca87ca0a0cb5b71c
2026-03-02 07:31:00 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats',
 'scrapeops_scrapy.extension.ScrapeOpsMonitor']
2026-03-02 07:31:00 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'zomato_order_history',
 'CONCURRENT_REQUESTS': 8,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_FILE': '/home/ubuntu/restaverse_spiders/logs/zomato_order_history/zomato_discounts/70_2026-03-02T07_30_02.log',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'zomato_order_history.spiders',
 'RETRY_HTTP_CODES': [500, 502, 503, 504],
 'RETRY_TIMES': 1,
 'SPIDER_MODULES': ['zomato_order_history.spiders'],
 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor'}
2026-03-02 07:31:10 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
 'zomato_order_history.middlewares.ZomatoCentralRateLimitMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy_user_agents.middlewares.RandomUserAgentMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapeops_scrapy.middleware.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2026-03-02 07:31:10 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2026-03-02 07:31:10 [scrapy.middleware] INFO: Enabled item pipelines:
['zomato_order_history.pipelines.ZomatoOrderHistoryPipeline']
2026-03-02 07:31:10 [scrapy.core.engine] INFO: Spider opened
2026-03-02 07:31:10 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2026-03-02 07:31:11 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6027
2026-03-02 07:32:11 [scrapy.extensions.logstats] INFO: Crawled 5 pages (at 5 pages/min), scraped 0 items (at 0 items/min)
2026-03-02 07:32:40 [scrapy.core.downloader.handlers.http11] WARNING: Got data loss in https://prod-jumbo-insights-service.s3.ap-southeast-1.amazonaws.com/merchant-api-gateway/mx_order_history_download_v4/v1/merchant-api-gateway%3A%3Amx_order_history_download_v4%3A%3Av1%3A%3A13522919050836274993%3A%3A1772436600000/order_history_20260301_20260301.zip?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Checksum-Mode=ENABLED&X-Amz-Credential=ASIA5RVPIJ4ES2JKFK65%2F20260302%2Fap-southeast-1%2Fs3%2Faws4_request&X-Amz-Date=20260302T073121Z&X-Amz-Expires=7200&X-Amz-Security-Token=IQoJb3JpZ2luX2VjELL%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaDmFwLXNvdXRoZWFzdC0xIkcwRQIgKBqu2VEUqOkHieZXtCBgaWzDgKON06H%2B4QqiIr9YRvsCIQC2AaCK5qe0KSZVkNS7%2BMbC9u4PACdFmTbPDNs7AQSsCCr%2BAwh7EAAaDDkzMTMwMTcwNzUyOSIMK%2B8XZMFfGA5RfT1FKtsD2wGu3Uqy0gn9hK0TniQiNCDWxzHQMkjFv6WoeJ2VBaGaePyyfA94g1kLoMbhl6nsqij185NBfpm5j%2F2GzLvBGqNwOlqEnwrxC5RFELOvZs2QN4cbsHquOZWqncGI%2FxTenQgjhgAuhS%2FhiIruoCjNt9W3a5RyHfrvz9mxgc4EuaqA0xAnGPxep0mk0xhNJa60duLI5jUDTLhtXrNPwhKtl5bX84AodvnSQg8SoH3KAGkyRbHYAAiJpc2TtJ47DzJ0v6o%2BNamRbwiYPVUAksuy%2FNWeNY3mM%2F%2FYcD6fHPMhE7OUWb%2B0bFY0tkeXs5jOt8t46Xc8LcX3jE0xZm2h24XwCYvpuDVO93RekYJP%2FBcHOLHWG56NIqgeRzYkeQ6C4CPYl26pAGRLSSTuqfRwbN1YtKnzBg7%2BXC7xdnJ2WVSqYyVEOHBoEK8DsFsiqS1XKU0lLCdWq2SPFSmHqJ625p8Xeousa72W00ZfmQ%2F6e8xgF5G78vINoqU%2FsgAey5mR%2FVNEYiRVDmOr%2F6J1miHuO1EOIPjXrVbXVfPqfVpYPcXosOn%2BHjApF7%2F%2FB3wx5k9DjIzkT%2F7Fl76jtfrtq8WTYgC3%2BLrwh7Cdi0stYL8%2FnQc3PAb3rEk%2FsLlvlqYkDDDX3pPNBjqlAdqe29VSHo0X48NwqUqel%2B%2BeEhfk637tBSTVIVnqF7idulZ%2F5%2BJ08j%2Fg4Kc1WdU%2BNPh%2Fa4UijTo7DnFyf20F1bvKXtKxmxg%2BE%2FIVqbokk8qEM2NuL%2BF6soIsuNKK%2BcmRN8Jp51ceMwvJ7i5b6Ly8NR%2FsekLjI8S4AxUDQbQB%2B%2Bfn32VrmfXA%2F0rmHPbrhFN%2Bw9HHwww1J2T2qkTIXA5xsF1DzbGsXA%3D%3D&X-Amz-SignedHeaders=host&x-id=GetObject&X-Amz-Signature=071e3caef1fce53415c33ecce86cd14a15f979e2ef32d7a01dc489fd91d51fb0. If you want to process broken responses set the setting DOWNLOAD_FAIL_ON_DATALOSS = False -- This message won't be shown in further requests
2026-03-02 07:32:56 [py.warnings] WARNING: /home/ubuntu/restaverse_spiders/eggs/zomato_order_history/1770610074.egg/zomato_order_history/spiders/discounts.py:378: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
  df_delivered["Discount construct"] = df_delivered[
2026-03-02 07:32:56 [scrapy.core.engine] INFO: Closing spider (finished)
2026-03-02 07:32:57 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'client_id': '70',
 'downloader/exception_count': 1,
 'downloader/exception_type_count/twisted.web._newclient.ResponseFailed': 1,
 'downloader/request_bytes': 27667,
 'downloader/request_count': 7,
 'downloader/request_method_count/GET': 6,
 'downloader/request_method_count/POST': 1,
 'downloader/response_bytes': 792686,
 'downloader/response_count': 6,
 'downloader/response_status_count/200': 6,
 'elapsed_time_seconds': 105.987497,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2026, 3, 2, 7, 32, 56, 888944, tzinfo=datetime.timezone.utc),
 'httpcompression/response_bytes': 318866,
 'httpcompression/response_count': 5,
 'item_scraped_count': 1,
 'items_per_minute': None,
 'log_count/INFO': 11,
 'log_count/WARNING': 2,
 'memusage/max': 128266240,
 'memusage/startup': 114765824,
 'request_depth_max': 5,
 'response_received_count': 6,
 'responses_per_minute': None,
 'retry/count': 1,
 'retry/reason_count/twisted.web._newclient.ResponseFailed': 1,
 'scheduler/dequeued': 7,
 'scheduler/dequeued/memory': 7,
 'scheduler/enqueued': 7,
 'scheduler/enqueued/memory': 7,
 'start_time': datetime.datetime(2026, 3, 2, 7, 31, 10, 901447, tzinfo=datetime.timezone.utc)}
2026-03-02 07:32:57 [scrapy.core.engine] INFO: Spider closed (finished)
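
Editor's note on the data-loss warning at 07:32:40: the truncated S3 download was retried once (retry/count: 1) and the run still finished. If broken responses should instead be accepted and validated downstream, Scrapy's DOWNLOAD_FAIL_ON_DATALOSS setting (named in the warning itself) can be relaxed. A minimal sketch, assuming the project wants this only for the ZIP download rather than globally; the spider method name below is illustrative, not the project's actual code:

```python
# settings.py (sketch): accept responses whose body is shorter than the
# declared Content-Length instead of failing with ResponseFailed.
# Trade-off: a truncated ZIP may be corrupt, so validate before parsing.
DOWNLOAD_FAIL_ON_DATALOSS = False
```

Scrapy also supports opting in per request via the `download_fail_on_dataloss` key in `Request.meta`, which keeps the strict behaviour for every other request:

```python
# Hypothetical spider snippet: relax data-loss handling for one request only.
import scrapy


def request_order_history_zip(self, presigned_url):
    return scrapy.Request(
        presigned_url,
        meta={"download_fail_on_dataloss": False},
        callback=self.parse_order_history_zip,  # illustrative callback name
    )
```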
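
Editor's note on the SettingWithCopyWarning at discounts.py:378: the logged source line (`df_delivered["Discount construct"] = df_delivered[`, truncated by the logger) assigns a column to a slice of another DataFrame, which pandas may treat as a view. A self-contained sketch of the pattern and the two standard fixes; `orders`, the column names, and the discount rule here are illustrative stand-ins, not the spider's actual data:

```python
import pandas as pd

orders = pd.DataFrame({
    "status": ["DELIVERED", "CANCELLED", "DELIVERED"],
    "discount": [40.0, 0.0, 60.0],
})

# Warning-triggering pattern: df_delivered is a slice of `orders`, so the
# chained assignment below may write to a temporary copy and be lost.
#   df_delivered = orders[orders["status"] == "DELIVERED"]
#   df_delivered["Discount construct"] = ...

# Fix 1: take an explicit copy before adding columns to the slice.
df_delivered = orders[orders["status"] == "DELIVERED"].copy()
df_delivered["Discount construct"] = df_delivered["discount"].apply(
    lambda d: "FLAT" if d > 0 else "NONE"
)

# Fix 2: write through .loc on the original frame, as the warning suggests;
# rows not matched by the mask get NaN in the new column.
orders.loc[orders["status"] == "DELIVERED", "Discount construct"] = "FLAT"
```

Fix 1 is the right choice when `df_delivered` lives on as an independent frame (as the spider's name suggests); Fix 2 is for annotating the original frame in place.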