2026-01-06 07:31:39 [scrapy.utils.log] INFO: Scrapy 2.12.0 started (bot: zomato_order_history)
2026-01-06 07:31:39 [scrapy.utils.log] INFO: Versions: lxml 5.3.0.0, libxml2 2.12.9, cssselect 1.2.0, parsel 1.9.1, w3lib 2.2.1, Twisted 24.11.0, Python 3.10.12 (main, Nov 4 2025, 08:48:33) [GCC 11.4.0], pyOpenSSL 24.3.0 (OpenSSL 3.4.0 22 Oct 2024), cryptography 44.0.0, Platform Linux-6.8.0-1044-aws-aarch64-with-glibc2.35
2026-01-06 07:31:39 [zomato_discounts] INFO: Dynamic attribute _job = 99_2026-01-06T07_30_02
2026-01-06 07:31:39 [zomato_discounts] INFO: Dynamic attribute scheduled_job_id = 7379
2026-01-06 07:31:39 [zomato_discounts] INFO: Dynamic attribute SCRAPEOPS_JOB_NAME = 99
2026-01-06 07:31:39 [scrapy.addons] INFO: Enabled addons: []
2026-01-06 07:31:39 [scrapy.extensions.telnet] INFO: Telnet Password: 5044723b8e8bf771
2026-01-06 07:31:39 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.memusage.MemoryUsage', 'scrapy.extensions.logstats.LogStats', 'scrapy_extensions.extension.BandwidthLoggerExtension', 'scrapeops_scrapy.extension.ScrapeOpsMonitor']
2026-01-06 07:31:39 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'zomato_order_history', 'CONCURRENT_REQUESTS_PER_DOMAIN': 1, 'DOWNLOAD_DELAY': 0.1, 'FEED_EXPORT_ENCODING': 'utf-8', 'LOG_FILE': '/home/ubuntu/restaverse_spiders/logs/zomato_order_history/zomato_discounts/99_2026-01-06T07_30_02.log', 'LOG_LEVEL': 'INFO', 'NEWSPIDER_MODULE': 'zomato_order_history.spiders', 'SPIDER_MODULES': ['zomato_order_history.spiders'], 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor'}
2026-01-06 07:31:39 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware', 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapeops_scrapy.middleware.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2026-01-06 07:31:39 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2026-01-06 07:31:39 [scrapy.middleware] INFO: Enabled item pipelines: ['zomato_order_history.pipelines.ZomatoOrderHistoryPipeline']
2026-01-06 07:31:39 [scrapy.core.engine] INFO: Spider opened
2026-01-06 07:31:39 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2026-01-06 07:31:55 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2026-01-06 07:32:39 [scrapy.extensions.logstats] INFO: Crawled 5 pages (at 5 pages/min), scraped 0 items (at 0 items/min)
2026-01-06 07:33:12 [scrapy.core.downloader.handlers.http11] WARNING: Got data loss in https://prod-jumbo-insights-service.s3.ap-southeast-1.amazonaws.com/merchant-api-gateway/mx_order_history_download_v4/v1/merchant-api-gateway%3A%3Amx_order_history_download_v4%3A%3Av1%3A%3A133547546177441667%3A%3A1767684600000/order_history_20260105_20260105.zip?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Checksum-Mode=ENABLED&X-Amz-Credential=ASIA5RVPIJ4E5INMPOEW%2F20260106%2Fap-southeast-1%2Fs3%2Faws4_request&X-Amz-Date=20260106T073204Z&X-Amz-Expires=7200&X-Amz-Security-Token=IQoJb3JpZ2luX2VjEIz%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaDmFwLXNvdXRoZWFzdC0xIkcwRQIgW63rmSlcIt4a2PUWualY6eCi0hYpmA%2FK2NeGg4T5WOcCIQDIhCwpAgbWoDeNp6XVUMPnmXRby0RimNeNkUn7l2vbsir%2BAwhVEAAaDDkzMTMwMTcwNzUyOSIMr1CvoHF%2FGpjn1EQhKtsD20v94PnQEoYPEiyQS017VuAAd5G6CidTiSaYL1sntL7iYotlJfuM4qLLDGZUeNQTp22eMrkDsWwKW%2BvWNYu6SEsoCOAHWw8vU9fIC7mvOX67lf%2FC7a%2Fm1E%2FOUadogidVc4JhogAi5RVgjx%2By44l9marjBb1y3KWNpl9zQvB8jG5GxJijO6LSVlIJKQZ45CACbQ%2FcBgHe52UlFOe3MNcWvkPvN6TzqZyS6dDVb%2B%2FigxZhQ7R6D5wrcBIm2eJ6T%2FWJccX7YCw%2B8pA%2BRplKn3moff1Pzn0KOIOPnQqq3BhcKznWFLesAFCTd4pPcldi9nf9tHhaUexWHg0Di4IiPJ4NdY8vsjmsoEogICAm6JzIws51Ueybv2dqmXaG5EwayGLcLyHQhqE8GTRwZwVKyj7z5XGxsWvM9hW2wPe01M7J%2FcJg70onUADbY1jJzzahlHQZlzwPSYQB7S6Ig8ki0LWycjaqqIwxQVycLVQsEP5a5W6weLVWwSYT%2BGQa%2FS17XgyASVfV%2Byl16S6Wa8Zl2q9DfKzvRwVAYhf73fnC5WKVObB9XVNTelP%2FBLDSfGoaL%2FkNuULNtkzrXT9L9H4S2OkvbsAISw4wNbG%2Fmj88u2SrfrYgl410OpTUppd5SzCQk%2FLKBjqlAe4BWkmGPQ7zxY6raH%2Fh%2F6%2Fe6MnIhqNtcoJ0l7cAnMdNfg6ah69Kz9ATxM8aBRaOGLOcRvRhDpBeY39W8kANnXEvtdXVkNMiGXHSSYRDhPnIxRAOYLPQbs7E7NK8DmRbI7ZSyHeaGLFk7ObSHyQaOhpSCNy5UzL9F8pUhuIXXTgRo3N4DDSeu%2FziLAkaIi4LGG%2BHfRW%2B5NjzChoDueDdidp1ZlHFeQ%3D%3D&X-Amz-SignedHeaders=host&x-id=GetObject&X-Amz-Signature=a089c456f560c13830dbb3c24e3e2250a7b7e3f5c4e2081ed481188aa22cca7e. If you want to process broken responses set the setting DOWNLOAD_FAIL_ON_DATALOSS = False -- This message won't be shown in further requests
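The warning above is Scrapy reporting that the S3 response body came up shorter than the advertised Content-Length, so the download was failed (and, per the stats at the end of this log, retried). The message itself names the escape hatch: DOWNLOAD_FAIL_ON_DATALOSS = False keeps the truncated response instead of raising ResponseFailed. A minimal sketch of how that could be wired in follows; the settings-module path and the spider class name are assumptions, not taken from this project, and a partial ZIP would still have to be usable for this to help.

# Sketch only, not this project's code: applying the setting named in the warning.
# Project-wide, e.g. in the settings module implied by BOT_NAME (zomato_order_history/settings.py, path assumed):
DOWNLOAD_FAIL_ON_DATALOSS = False

# Or scoped to this spider alone via custom_settings (class name below is assumed):
import scrapy

class ZomatoDiscountsSpider(scrapy.Spider):
    name = "zomato_discounts"
    custom_settings = {"DOWNLOAD_FAIL_ON_DATALOSS": False}

    def parse(self, response):
        # With the setting off, a truncated body is delivered and flagged rather than discarded.
        if "dataloss" in response.flags:
            self.logger.warning("Partial download kept for %s", response.url)

In this run the retry middleware appears to have recovered anyway (retry/count 2 in the closing stats, all 6 responses returning 200, one item scraped), so the override is only worth considering when a partial archive is still processable.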
2026-01-06 07:33:39 [scrapy.extensions.logstats] INFO: Crawled 5 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2026-01-06 07:34:21 [scrapy.core.downloader.handlers.http11] WARNING: Got data loss in https://prod-jumbo-insights-service.s3.ap-southeast-1.amazonaws.com/merchant-api-gateway/mx_order_history_download_v4/v1/merchant-api-gateway%3A%3Amx_order_history_download_v4%3A%3Av1%3A%3A133547546177441667%3A%3A1767684600000/order_history_20260105_20260105.zip?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Checksum-Mode=ENABLED&X-Amz-Credential=ASIA5RVPIJ4E5INMPOEW%2F20260106%2Fap-southeast-1%2Fs3%2Faws4_request&X-Amz-Date=20260106T073204Z&X-Amz-Expires=7200&X-Amz-Security-Token=IQoJb3JpZ2luX2VjEIz%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaDmFwLXNvdXRoZWFzdC0xIkcwRQIgW63rmSlcIt4a2PUWualY6eCi0hYpmA%2FK2NeGg4T5WOcCIQDIhCwpAgbWoDeNp6XVUMPnmXRby0RimNeNkUn7l2vbsir%2BAwhVEAAaDDkzMTMwMTcwNzUyOSIMr1CvoHF%2FGpjn1EQhKtsD20v94PnQEoYPEiyQS017VuAAd5G6CidTiSaYL1sntL7iYotlJfuM4qLLDGZUeNQTp22eMrkDsWwKW%2BvWNYu6SEsoCOAHWw8vU9fIC7mvOX67lf%2FC7a%2Fm1E%2FOUadogidVc4JhogAi5RVgjx%2By44l9marjBb1y3KWNpl9zQvB8jG5GxJijO6LSVlIJKQZ45CACbQ%2FcBgHe52UlFOe3MNcWvkPvN6TzqZyS6dDVb%2B%2FigxZhQ7R6D5wrcBIm2eJ6T%2FWJccX7YCw%2B8pA%2BRplKn3moff1Pzn0KOIOPnQqq3BhcKznWFLesAFCTd4pPcldi9nf9tHhaUexWHg0Di4IiPJ4NdY8vsjmsoEogICAm6JzIws51Ueybv2dqmXaG5EwayGLcLyHQhqE8GTRwZwVKyj7z5XGxsWvM9hW2wPe01M7J%2FcJg70onUADbY1jJzzahlHQZlzwPSYQB7S6Ig8ki0LWycjaqqIwxQVycLVQsEP5a5W6weLVWwSYT%2BGQa%2FS17XgyASVfV%2Byl16S6Wa8Zl2q9DfKzvRwVAYhf73fnC5WKVObB9XVNTelP%2FBLDSfGoaL%2FkNuULNtkzrXT9L9H4S2OkvbsAISw4wNbG%2Fmj88u2SrfrYgl410OpTUppd5SzCQk%2FLKBjqlAe4BWkmGPQ7zxY6raH%2Fh%2F6%2Fe6MnIhqNtcoJ0l7cAnMdNfg6ah69Kz9ATxM8aBRaOGLOcRvRhDpBeY39W8kANnXEvtdXVkNMiGXHSSYRDhPnIxRAOYLPQbs7E7NK8DmRbI7ZSyHeaGLFk7ObSHyQaOhpSCNy5UzL9F8pUhuIXXTgRo3N4DDSeu%2FziLAkaIi4LGG%2BHfRW%2B5NjzChoDueDdidp1ZlHFeQ%3D%3D&X-Amz-SignedHeaders=host&x-id=GetObject&X-Amz-Signature=a089c456f560c13830dbb3c24e3e2250a7b7e3f5c4e2081ed481188aa22cca7e. If you want to process broken responses set the setting DOWNLOAD_FAIL_ON_DATALOSS = False -- This message won't be shown in further requests
2026-01-06 07:34:26 [py.warnings] WARNING: /home/ubuntu/restaverse_spiders/eggs/zomato_order_history/1765970632.egg/zomato_order_history/spiders/discounts.py:329: SettingWithCopyWarning: A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead
See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
  df_delivered['Discount construct'] = df_delivered['Discount construct'].fillna("Restaurant discount (Flat offs, Freebies & others)")
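The SettingWithCopyWarning above means the assignment at discounts.py:329 writes into df_delivered while pandas suspects that frame is a view of a slice of another DataFrame, so the fillna() result may silently fail to stick. A small self-contained sketch of the usual remedy follows; only the fillna() line comes from the log, while the stand-in data, the 'Order status' column, and the way df_delivered is derived are assumptions.

import pandas as pd

# Stand-in data; the real frame is read from the downloaded order_history ZIP (columns assumed).
df = pd.DataFrame({
    "Order status": ["DELIVERED", "CANCELLED", "DELIVERED"],
    "Discount construct": ["Coupon discount", None, None],
})

# Taking an explicit copy of the filtered slice makes df_delivered own its data,
# which is the usual way to silence this warning for a filtered sub-frame.
df_delivered = df[df["Order status"] == "DELIVERED"].copy()

# The line flagged at discounts.py:329, now unambiguous. The .loc form the warning
# suggests targets chained indexing and would read:
# df_delivered.loc[:, "Discount construct"] = df_delivered["Discount construct"].fillna(...)
df_delivered["Discount construct"] = df_delivered["Discount construct"].fillna(
    "Restaurant discount (Flat offs, Freebies & others)"
)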
2026-01-06 07:34:26 [scrapy.core.engine] INFO: Closing spider (finished)
2026-01-06 07:34:26 [zomato_discounts] INFO: Logger Payload: {'run_id': '41e2f966-eee0-4d3d-93f2-10ca745472cc', 'timestamp': '2026-01-06T07:34:26Z', 'spider': 'zomato_discounts', 'client_id': '99', 'domain': 'www.zomato.com', 'bytes_sent': 0, 'bytes_received': 581, 'duration_seconds': 166.83, 'host': 'ip-172-31-16-168', 'status': 'finished'}
2026-01-06 07:34:26 [zomato_discounts] INFO: Logger Payload: {'run_id': '41e2f966-eee0-4d3d-93f2-10ca745472cc', 'timestamp': '2026-01-06T07:34:26Z', 'spider': 'zomato_discounts', 'client_id': '99', 'domain': 'api.zomato.com', 'bytes_sent': 3303, 'bytes_received': 281766, 'duration_seconds': 166.83, 'host': 'ip-172-31-16-168', 'status': 'finished'}
2026-01-06 07:34:26 [zomato_discounts] INFO: Logger Payload: {'run_id': '41e2f966-eee0-4d3d-93f2-10ca745472cc', 'timestamp': '2026-01-06T07:34:26Z', 'spider': 'zomato_discounts', 'client_id': '99', 'domain': 'prod-jumbo-insights-service.s3.ap-southeast-1.amazonaws.com', 'bytes_sent': 0, 'bytes_received': 462217, 'duration_seconds': 166.83, 'host': 'ip-172-31-16-168', 'status': 'finished'}
2026-01-06 07:34:27 [scrapy.statscollectors] INFO: Dumping Scrapy stats: {'client_id': '99', 'downloader/exception_count': 2, 'downloader/exception_type_count/twisted.web._newclient.ResponseFailed': 2, 'downloader/request_bytes': 31574, 'downloader/request_count': 8, 'downloader/request_method_count/GET': 7, 'downloader/request_method_count/POST': 1, 'downloader/response_bytes': 753937, 'downloader/response_count': 6, 'downloader/response_status_count/200': 6, 'elapsed_time_seconds': 166.830142, 'finish_reason': 'finished', 'finish_time': datetime.datetime(2026, 1, 6, 7, 34, 26, 625369, tzinfo=datetime.timezone.utc), 'httpcompression/response_bytes': 581, 'httpcompression/response_count': 2, 'item_scraped_count': 1, 'items_per_minute': None, 'log_count/INFO': 15, 'log_count/WARNING': 3, 'memusage/max': 125378560, 'memusage/startup': 112271360, 'request_depth_max': 5, 'response_received_count': 6, 'responses_per_minute': None, 'retry/count': 2, 'retry/reason_count/twisted.web._newclient.ResponseFailed': 2, 'scheduler/dequeued': 8, 'scheduler/dequeued/memory': 8, 'scheduler/enqueued': 8, 'scheduler/enqueued/memory': 8, 'start_time': datetime.datetime(2026, 1, 6, 7, 31, 39, 795227, tzinfo=datetime.timezone.utc)}
2026-01-06 07:34:27 [scrapy.core.engine] INFO: Spider closed (finished)