2026-04-29 16:28:48 [scrapy.utils.log] INFO: Scrapy 2.12.0 started (bot: swiggy)
2026-04-29 16:28:48 [scrapy.utils.log] INFO: Versions: lxml 5.3.0.0, libxml2 2.12.9, cssselect 1.2.0, parsel 1.9.1, w3lib 2.2.1, Twisted 24.11.0, Python 3.10.12 (main, Mar 3 2026, 11:56:32) [GCC 11.4.0], pyOpenSSL 24.3.0 (OpenSSL 3.4.0 22 Oct 2024), cryptography 44.0.0, Platform Linux-6.8.0-1044-aws-aarch64-with-glibc2.35
2026-04-29 16:28:48 [twisted] CRITICAL: Unhandled error in Deferred:
2026-04-29 16:28:48 [twisted] CRITICAL:
Traceback (most recent call last):
  File "/home/ubuntu/restaverse_spiders/venv/lib/python3.10/site-packages/twisted/internet/defer.py", line 2017, in _inlineCallbacks
    result = context.run(gen.send, result)
  File "/home/ubuntu/restaverse_spiders/venv/lib/python3.10/site-packages/scrapy/crawler.py", line 149, in crawl
    self.spider = self._create_spider(*args, **kwargs)
  File "/home/ubuntu/restaverse_spiders/venv/lib/python3.10/site-packages/scrapy/crawler.py", line 163, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
  File "/home/ubuntu/restaverse_spiders/venv/lib/python3.10/site-packages/scrapy/spiders/__init__.py", line 66, in from_crawler
    spider = cls(*args, **kwargs)
TypeError: SwiggyReviews.__init__() got an unexpected keyword argument '_job'
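The traceback shows that Scrapy's `from_crawler` forwards all scheduler keywords straight into `cls(*args, **kwargs)`, and the `_job` keyword is what scrapyd passes to every spider it schedules. The likely cause is that `SwiggyReviews` defines a custom `__init__` with only named parameters and no `**kwargs`. Below is a minimal dependency-free sketch of the failure and the fix; the `restaurant_id` parameter is a hypothetical stand-in for whatever arguments the real spider takes, and in an actual `scrapy.Spider` subclass the extra keywords should be forwarded via `super().__init__(*args, **kwargs)`.

```python
# Sketch of the '_job' TypeError, without a Scrapy dependency.
# scrapyd schedules spiders as cls(*args, **kwargs) and always
# includes a '_job' keyword identifying the scheduled job.

class BrokenSpider:
    # Hypothetical spider signature: only named parameters,
    # so any unexpected keyword (e.g. '_job') raises TypeError.
    def __init__(self, restaurant_id=None):
        self.restaurant_id = restaurant_id


class FixedSpider:
    # Accept *args/**kwargs so extra keywords like '_job' are
    # tolerated; a real scrapy.Spider subclass would also call
    # super().__init__(*args, **kwargs) here.
    def __init__(self, restaurant_id=None, *args, **kwargs):
        self.restaurant_id = restaurant_id


# Reproduce the log's error, then show the fixed signature working.
try:
    BrokenSpider(restaurant_id="123", _job="local-job-id")
except TypeError as exc:
    print("broken:", exc)

spider = FixedSpider(restaurant_id="123", _job="local-job-id")
print("fixed:", spider.restaurant_id)
```

With the `**kwargs`-tolerant signature in place, scrapyd's `schedule.json` call reaches `from_crawler` without the `TypeError`, and the crawl starts normally.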