I have a Django project based on a cookiecutter template, and I've been trying to get Celery working with it. I set everything up according to the Celery docs, and when I run a task, Django's output reports success, but if I check the terminal running Celery, the Celery log doesn't even show that it received the task. I run it with celery -A proj_name worker -l DEBUG; I also tried INFO, but same thing. The task doesn't show up on the Flower dashboard either. I'm using django-celery-results with Redis/Postgres backends, and neither gets populated with results.
I'm not really sure what's going on, but as far as I can tell, Celery never receives the task at all, regardless of what Django's logs say. Also, when I try to print the task's status with AsyncResult, it always shows PENDING, even though Django again reports success.
Here is my celery.py:
```python
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj_name.config.settings.local")

app = Celery("proj_name")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
```
And my Celery-related settings:
```python
if USE_TZ:
    CELERY_TIMEZONE = TIME_ZONE
CELERY_BROKER_URL = env("CELERY_BROKER_URL", default="redis://localhost:6379/0")
# CELERY_RESULT_BACKEND = f"db+postgresql://{env('POSTGRES_DB_USER')}:{env('POSTGRES_DB_PWD')}@localhost/{env('POSTGRES_DB_NAME')}"
CELERY_RESULT_BACKEND = CELERY_BROKER_URL
CELERY_CACHE_BACKEND = 'django-cache'
CELERY_TASK_TRACK_STARTED = True
CELERY_RESULT_EXTENDED = True
CELERY_RESULT_BACKEND_ALWAYS_RETRY = True
CELERY_RESULT_BACKEND_MAX_RETRIES = 10
CELERY_ACCEPT_CONTENT = ["application/json"]
CELERY_TASK_SERIALIZER = "json"
CELERY_RESULT_SERIALIZER = "json"
CELERY_TASK_TIME_LIMIT = 5 * 60
CELERY_TASK_SOFT_TIME_LIMIT = 60
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"
CELERY_WORKER_SEND_TASK_EVENTS = True
CELERY_TASK_SEND_SENT_EVENT = True
CELERY_TASK_ALWAYS_EAGER = True
CELERY_TASK_EAGER_PROPAGATES = True
```
My task is defined like this:
```python
@app.task(bind=True)
def send_email_task(self, email_action, recipient_email="", context={}, attachments=[], is_sender_email_dynamic=False, dynamic_cc_emails=[]):
    print('before task', self.request.id, self.request)
    print(self.AsyncResult(self.request.id).state)
    # insert async func call to send email
    print('after task', self.AsyncResult(self.request.id).state, self.request)
```
And a sample Celery line from the Django logs:
```
INFO 2023-07-24 03:01:54,640 trace 7804 140679160931904 Task email_services.tasks.send_email_task[5fbbc289-f337-415c-8fad-3100042c422a] succeeded in 0.08931779300019116s: None
```
The output of celery -A proj_name worker -l DEBUG:
```
(venv_name) ➜ proj_name git:(main) ✗ celery -A proj_name worker -l DEBUG
[2023-07-24 02:39:31,265: DEBUG/MainProcess] | Worker: Preparing bootsteps.
[2023-07-24 02:39:31,266: DEBUG/MainProcess] | Worker: Building graph...
[2023-07-24 02:39:31,266: DEBUG/MainProcess] | Worker: New boot order: {Beat, Timer, Hub, Pool, Autoscaler, StateDB, Consumer}
[2023-07-24 02:39:31,268: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
[2023-07-24 02:39:31,268: DEBUG/MainProcess] | Consumer: Building graph...
[2023-07-24 02:39:31,274: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Mingle, Gossip, Heart, Agent, Tasks, Control, event loop}

 -------------- celery@DESKTOP-SGH5F1FL v5.3.1 (emerald-rush)
--- ***** -----
-- ******* ---- Linux-5.10.16.3-microsoft-standard-WSL2-x86_64-with-glibc2.35 2023-07-24 02:39:31
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         proj_name:0x7fe89adf1180
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     redis://localhost:6379/0
- *** --- * --- .> concurrency: 20 (prefork)
-- ******* ---- .> task events: ON
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celery.accumulate
  . celery.backend_cleanup
  . celery.chain
  . celery.chord
  . celery.chord_unlock
  . celery.chunks
  . celery.group
  . celery.map
  . celery.starmap
  . email_services.tasks.send_email_task
  . proj_name.celery.debug_task

[2023-07-24 02:39:31,280: DEBUG/MainProcess] | Worker: Starting Hub
[2023-07-24 02:39:31,280: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:31,280: DEBUG/MainProcess] | Worker: Starting Pool
[2023-07-24 02:39:32,539: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:32,539: DEBUG/MainProcess] | Worker: Starting Consumer
[2023-07-24 02:39:32,539: DEBUG/MainProcess] | Consumer: Starting Connection
[2023-07-24 02:39:32,541: WARNING/MainProcess] /home/<name>/.pyenv/versions/3.10.10/envs/<proj_name>/lib/python3.10/site-packages/celery/worker/consumer/consumer.py:498: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(
[2023-07-24 02:39:32,545: INFO/MainProcess] Connected to redis://localhost:6379/0
[2023-07-24 02:39:32,545: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:32,545: DEBUG/MainProcess] | Consumer: Starting Events
[2023-07-24 02:39:32,545: WARNING/MainProcess] /home/<name>/.pyenv/versions/3.10.10/envs/proj_name/lib/python3.10/site-packages/celery/worker/consumer/consumer.py:498: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(
[2023-07-24 02:39:32,546: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:32,546: DEBUG/MainProcess] | Consumer: Starting Mingle
[2023-07-24 02:39:32,546: INFO/MainProcess] mingle: searching for neighbors
[2023-07-24 02:39:33,552: INFO/MainProcess] mingle: all alone
[2023-07-24 02:39:33,552: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:33,553: DEBUG/MainProcess] | Consumer: Starting Gossip
[2023-07-24 02:39:33,555: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:33,555: DEBUG/MainProcess] | Consumer: Starting Heart
[2023-07-24 02:39:33,557: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:33,557: DEBUG/MainProcess] | Consumer: Starting Tasks
[2023-07-24 02:39:33,560: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:33,560: DEBUG/MainProcess] | Consumer: Starting Control
[2023-07-24 02:39:33,562: DEBUG/MainProcess] ^-- substep ok
[2023-07-24 02:39:33,562: DEBUG/MainProcess] | Consumer: Starting event loop
[2023-07-24 02:39:33,562: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2023-07-24 02:39:33,562: INFO/MainProcess] celery@DESKTOP-SGH5F1FL ready.
[2023-07-24 02:39:33,563: DEBUG/MainProcess] basic.qos: prefetch_count->80
```
Source: "Celery task succeeds in Django, but doesn't show up in Flower or the Celery logs" (https://m.huajiangbk.com/newsview104474.html)