I'm currently using dask to parallelize some PDAL pipelines, but I'm facing a problem. I'm using dask distributed, and on small treatments my code works perfectly. But as soon as I try to process bigger files, my workers are killed. Here is the behaviour of my workers before they are killed:

```
16:57:23,145 - distributed.scheduler - WARNING - Worker failed to heartbeat within 300 seconds. Closing:
Traceback (most recent call last):
  File "D:\applications\miniconda3\envs\pdal_env\lib\runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "D:\applications\miniconda3\envs\pdal_env\lib\runpy.py", line 87, in _run_code
  File "D:\applications\miniconda3\envs\pdal_env\Scripts\pdal-parallelizer.exe\__main__.py", line 7, in <module>
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\click\core.py", line 1130, in __call__
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\click\core.py", line 1055, in main
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\click\core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\click\core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\click\core.py", line 760, in invoke
  File "D:\applications\miniconda3\envs\pdal_env\lib\site-packages\pdal_parallelizer\__main__.py", line 42, in process_pipelines
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\dask\base.py", line 602, in compute
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\distributed\client.py", line 3000, in get
    results = self.gather(packed, asynchronous=asynchronous, direct=direct)
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\distributed\client.py", line 2174, in gather
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\distributed\utils.py", line 336, in sync
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\distributed\utils.py", line 403, in sync
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\distributed\utils.py", line 376, in f
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\tornado\gen.py", line 762, in run
  File "C:\Users\gguidirontani\AppData\Roaming\Python\Python39\site-packages\distributed\client.py", line 2037, in _gather
16:57:27,304 - distributed.nanny - WARNING - Worker process still alive after 3.9999965667724613 seconds, killing
16:57:27,304 - distributed.nanny - WARNING - Worker process still alive after 3.9999969482421878 seconds, killing
16:57:27,306 - distributed.nanny - WARNING - Worker process still alive after 3.9999959945678714 seconds, killing
```

It seems the workers were killed because they were unable to heartbeat within 300 seconds. I read the documentation and it helped me understand a little better: I guess a worker cannot heartbeat while it is busy with a treatment that takes more than 300 seconds. My first idea was to increase the time between heartbeats so the worker can finish its treatment and then heartbeat, but I suspect that will not fix the problem once I process much bigger data.

On a separate note, some notes on changing `airflow.cfg`: `cd` into the directory you want to configure and edit `airflow.cfg`:

```
# modified by jjin.choi
# Path to the folder containing Airflow plugins
sql_alchemy_conn = postgresql+psycopg2://airflow: :5432/airflow
# sql_alchemy_conn = sqlite:////user/athome/utils/airflow/airflow.db
```

If the `kubernetes` module is not installed, warnings like these appear:

```
WARNING Could not import DAGs in example_kubernetes_executor_config.py: No module named 'kubernetes'
WARNING Install kubernetes dependencies with: pip install apache-airflow
```

More generally, if a required module is missing, an error like the one below occurs, so search PyPI for the package (e.g. httpx) and install it:

```
pkg_resources.DistributionNotFound: The 'httpx' distribution was not found and is required by apache-airflow
```
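The same `pkg_resources` machinery that raises `DistributionNotFound` above can be used to pre-check dependencies before starting Airflow. A small sketch; `missing_distributions` is a hypothetical helper of my own, not an Airflow API:

```python
import pkg_resources  # provided by setuptools

def missing_distributions(names):
    """Return the subset of `names` that is not installed."""
    missing = []
    for name in names:
        try:
            # require() raises DistributionNotFound for absent packages,
            # the same error Airflow surfaces for httpx in the note above.
            pkg_resources.require(name)
        except pkg_resources.DistributionNotFound:
            missing.append(name)
    return missing

print(missing_distributions(["setuptools", "surely-not-installed-xyz"]))
```

Anything the helper reports can then be installed from PyPI as described above.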
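The `sql_alchemy_conn` entry shown above is a standard SQLAlchemy URL, so it can be sanity-checked outside Airflow. A minimal sketch; the in-memory SQLite DSN is a stand-in, since the post elides the password/host of the real postgresql+psycopg2 URL:

```python
from sqlalchemy import create_engine, text

# Placeholder DSN for illustration; substitute the real
# sql_alchemy_conn value from airflow.cfg here.
dsn = "sqlite:///:memory:"

engine = create_engine(dsn)
with engine.connect() as conn:
    # A trivial round-trip query: succeeds only if the DSN is
    # well-formed and the database is reachable.
    value = conn.execute(text("SELECT 1")).scalar()
print(value)
```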
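As for the dask heartbeat problem above: the 300-second cutoff is the scheduler's `distributed.scheduler.worker-ttl` setting, which can be raised through dask's configuration. A minimal sketch of that workaround; note that pdal-parallelizer creates its own cluster, so in practice the setting may need to go into a dask config file or the `DASK_DISTRIBUTED__SCHEDULER__WORKER_TTL` environment variable before launching it:

```python
import dask

# Raise the scheduler's worker time-to-live from its 300 s default,
# so workers busy with a long treatment are not declared dead.
# As noted above, this only widens the window; it does not help
# once individual tasks grow even longer.
dask.config.set({"distributed.scheduler.worker-ttl": "3600s"})

print(dask.config.get("distributed.scheduler.worker-ttl"))
```

The more robust fix the post hints at is splitting the work into smaller tasks so no single task approaches the TTL.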