2026-03-01 07:31:03 [scrapy.utils.log] INFO: Scrapy 2.12.0 started (bot: zomato_order_history)
2026-03-01 07:31:03 [scrapy.utils.log] INFO: Versions: lxml 5.3.0.0, libxml2 2.12.9, cssselect 1.2.0, parsel 1.9.1, w3lib 2.2.1, Twisted 24.11.0, Python 3.10.12 (main, Jan 26 2026, 14:55:28) [GCC 11.4.0], pyOpenSSL 24.3.0 (OpenSSL 3.4.0 22 Oct 2024), cryptography 44.0.0, Platform Linux-6.8.0-1044-aws-aarch64-with-glibc2.35
2026-03-01 07:31:03 [zomato_discounts] INFO: Dynamic attribute _job = 147_2026-03-01T07_30_01
2026-03-01 07:31:03 [zomato_discounts] INFO: Dynamic attribute scheduled_job_id = 7388
2026-03-01 07:31:03 [zomato_discounts] INFO: Dynamic attribute SCRAPEOPS_JOB_NAME = 147
2026-03-01 07:31:03 [scrapy.addons] INFO: Enabled addons:
[]
2026-03-01 07:31:03 [scrapy.extensions.telnet] INFO: Telnet Password: efa229bd5ad74355
2026-03-01 07:31:03 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats',
 'scrapeops_scrapy.extension.ScrapeOpsMonitor']
2026-03-01 07:31:03 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'zomato_order_history',
 'CONCURRENT_REQUESTS': 8,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_FILE': '/home/ubuntu/restaverse_spiders/logs/zomato_order_history/zomato_discounts/147_2026-03-01T07_30_01.log',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'zomato_order_history.spiders',
 'RETRY_HTTP_CODES': [500, 502, 503, 504],
 'RETRY_TIMES': 1,
 'SPIDER_MODULES': ['zomato_order_history.spiders'],
 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor'}
2026-03-01 07:31:06 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
 'zomato_order_history.middlewares.ZomatoCentralRateLimitMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy_user_agents.middlewares.RandomUserAgentMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapeops_scrapy.middleware.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2026-03-01 07:31:06 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2026-03-01 07:31:06 [scrapy.middleware] INFO: Enabled item pipelines:
['zomato_order_history.pipelines.ZomatoOrderHistoryPipeline']
2026-03-01 07:31:06 [scrapy.core.engine] INFO: Spider opened
2026-03-01 07:31:06 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2026-03-01 07:31:07 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6026
2026-03-01 07:32:07 [scrapy.extensions.logstats] INFO: Crawled 5 pages (at 5 pages/min), scraped 0 items (at 0 items/min)
2026-03-01 07:32:35 [scrapy.core.downloader.handlers.http11] WARNING: Got data loss in
https://prod-jumbo-insights-service.s3.ap-southeast-1.amazonaws.com/merchant-api-gateway/mx_order_history_download_v4/v1/merchant-api-gateway%3A%3Amx_order_history_download_v4%3A%3Av1%3A%3A5612212744106289195%3A%3A1772350200000/order_history_20260228_20260228.zip?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Checksum-Mode=ENABLED&X-Amz-Credential=ASIA5RVPIJ4EWM5XPKAO%2F20260301%2Fap-southeast-1%2Fs3%2Faws4_request&X-Amz-Date=20260301T073118Z&X-Amz-Expires=7200&X-Amz-Security-Token=IQoJb3JpZ2luX2VjEJv%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaDmFwLXNvdXRoZWFzdC0xIkcwRQIhAI8EN2Tl4AlEzKDBnFCvWcdsPo4%2B9PiHaAeKEDytyVgQAiBlTrbOjZ26fLeCXQQW5U2ZG57p9mXxd%2BYtbMgZ0kI%2Fhir%2BAwhkEAAaDDkzMTMwMTcwNzUyOSIMWNOO3d9kHnCL45VaKtsDxPsIJIM8%2FPlYRPrgEyPfpY5Nwr0YEVmj0V1INksPkdEv0iDK0a22FhQGzqOn4xuUAlCouDkswl2evCpUJlHOw4rc0BlTp6LNUZBrGbNbxCmGy%2BXZ721bvzmPva3aGRX36h8QmdN9XopLSgh%2FhdTEhOap6URgCLsCnouYvY6hbrxkuTybDX7YrBxItv4TpVHTyMUoC%2Bs2RCK%2B7g8lUxYQ%2BqMIrw3l%2Fkgx%2FihWlpc93Gz3rsVfHpqhX8Se%2Bbq%2FHD9hwXUdd1xPjpu2jqWj5iVwvv%2FCyBpoiR5gTpuRDc1jVjvg2PG7r%2F%2BDqXj9Ucc2xIyQkQssjHHCOlTvv%2F4N7BNA5VS%2BkOPUDF3kEbljxGHVyTJctaoq7DWoQTldU5vq3FCpYcZtbrsCGUitnuvuIampJXXU4NbPFDgLd3PEI%2BbObTyjdhke6jdg%2Bf2Ygpl3RQfv%2BPCY%2Bp3XU1bZB0brPR5uP44f8rwjefoe62nkTnMZuOfyCe35iLS4CCu8PXAfzOZVvLJkACmAyRHZFYpyr48otPID04aw%2Fd2ZDWVokQz5gfh92SQFY%2FCTUqsPSFXLw26kmBVhCPfgYbyCwDfkTML85eVNPLH%2FWFkclYcDGgHQKgLLUTQK9a022%2BnvCzCzyI7NBjqlAUWQx9P7xdK9xc7xt6QCLZmXnyzHy9llMcUgRmX7p3KuMf0ppMhhpjy35I6SB3pAblq3IoXpgPzPv%2FhXh9sXP79B76DxDG21ypMFlbP2ooBTrifmFWWGGvnuWzQLCSFfcIUv4mSxUWasMPl2WevDrr7HfX852wCZPfi48YQMC0G7hIKBFSv9nmKJnRc5XYSCy2jSPdUaXr6ZMGTU%2FurCF5pV4vq5kw%3D%3D&X-Amz-SignedHeaders=host&x-id=GetObject&X-Amz-Signature=188850ea3bb419e6e076b824b136c371f2c3d6aed5e8adca9dc3effb0455e7c5. 
If you want to process broken responses set the setting DOWNLOAD_FAIL_ON_DATALOSS = False -- This message won't be shown in further requests
2026-03-01 07:32:57 [py.warnings] WARNING: /home/ubuntu/restaverse_spiders/eggs/zomato_order_history/1770610074.egg/zomato_order_history/spiders/discounts.py:378: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
  df_delivered["Discount construct"] = df_delivered[
2026-03-01 07:32:57 [scrapy.core.engine] INFO: Closing spider (finished)
2026-03-01 07:32:57 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'client_id': '147',
 'downloader/exception_count': 1,
 'downloader/exception_type_count/twisted.web._newclient.ResponseFailed': 1,
 'downloader/request_bytes': 28071,
 'downloader/request_count': 7,
 'downloader/request_method_count/GET': 6,
 'downloader/request_method_count/POST': 1,
 'downloader/response_bytes': 694445,
 'downloader/response_count': 6,
 'downloader/response_status_count/200': 6,
 'elapsed_time_seconds': 110.619219,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2026, 3, 1, 7, 32, 57, 375414, tzinfo=datetime.timezone.utc),
 'httpcompression/response_bytes': 318865,
 'httpcompression/response_count': 5,
 'item_scraped_count': 1,
 'items_per_minute': None,
 'log_count/INFO': 11,
 'log_count/WARNING': 2,
 'memusage/max': 128274432,
 'memusage/startup': 114905088,
 'request_depth_max': 5,
 'response_received_count': 6,
 'responses_per_minute': None,
 'retry/count': 1,
 'retry/reason_count/twisted.web._newclient.ResponseFailed': 1,
 'scheduler/dequeued': 7,
 'scheduler/dequeued/memory': 7,
 'scheduler/enqueued': 7,
 'scheduler/enqueued/memory': 7,
 'start_time': datetime.datetime(2026, 3, 1, 7, 31, 6, 756195, tzinfo=datetime.timezone.utc)}
2026-03-01 07:32:57 [scrapy.core.engine] INFO: Spider closed (finished)
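Note on the data-loss warning above: the stats dump (`downloader/exception_type_count/twisted.web._newclient.ResponseFailed: 1`, `retry/count: 1`) shows the truncated S3 zip download was treated as a failure and retried once. Scrapy's own log message names the remedy; a minimal sketch of the two standard ways to apply it (settings name is from the log itself; the `Request.meta` key is Scrapy's documented per-request equivalent — the URL variable here is hypothetical):

```python
# In settings.py: accept truncated response bodies globally instead of
# raising ResponseFailed, as the warning in this log suggests.
DOWNLOAD_FAIL_ON_DATALOSS = False

# Or per request, leaving the global default (True) intact. Responses that
# lost data arrive flagged with "dataloss" in response.flags.
#
#     yield scrapy.Request(
#         presigned_s3_url,  # hypothetical; e.g. the order_history zip URL
#         meta={"download_fail_on_dataloss": False},
#         callback=self.parse_zip,
#     )
```

Whether to flip this depends on the payload: a partially downloaded zip is usually unusable, so for this spider letting the retry middleware re-request the presigned URL (the current behaviour) is arguably the right call.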
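Note on the `SettingWithCopyWarning` at `discounts.py:378`: the traceback fragment shows a new column (`"Discount construct"`) being assigned on `df_delivered`, which is presumably a filtered slice of a larger orders DataFrame. A minimal sketch of the problem and the usual fix — the surrounding column names and filter condition are hypothetical, only `df_delivered` and `"Discount construct"` come from the log:

```python
import pandas as pd

# Hypothetical stand-in for the spider's orders DataFrame.
orders = pd.DataFrame({
    "status": ["DELIVERED", "CANCELLED", "DELIVERED"],
    "discount": ["FLAT50", None, "BOGO"],
})

# This pattern triggers SettingWithCopyWarning: the boolean-mask slice may
# be a view, so pandas cannot tell whether the assignment should propagate.
#
#     df_delivered = orders[orders["status"] == "DELIVERED"]
#     df_delivered["Discount construct"] = df_delivered["discount"].fillna("NONE")

# Fix: take an explicit .copy() so the new column is written to an
# independent DataFrame and the warning disappears.
df_delivered = orders[orders["status"] == "DELIVERED"].copy()
df_delivered["Discount construct"] = df_delivered["discount"].fillna("NONE")
```

If the assignment instead needs to land back in the original frame, the warning's own suggestion applies: `orders.loc[mask, "Discount construct"] = ...` in a single indexing step.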