Call target_link_options_shared_lib in executorch-config.cmake #12320
Conversation
Stack from ghstack (oldest at bottom):
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/12320
Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 Cancelled Job, 3 Unrelated Failures

As of commit de72b49 with merge base bf4acb3:

CANCELLED JOB - The following job was cancelled. Please retry:
FLAKY - The following job failed but was likely due to flakiness present on trunk:
BROKEN TRUNK - The following jobs failed but were present on the merge base:
👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.
Customers need the flags this sets, otherwise the built operator libraries won't actually register operators.

Testing: I'm not sure that we have automated coverage for this file, so I'm sending a first version adding an intentional break.

ghstack-source-id: d3a9d52
ghstack-comment-id: 3054284839
Pull-Request: #12320
[ghstack-poisoned]
size regression seems to be several KB, need to investigate
Binary size tests are passing. Looks like a lot of jobs were cancelled and we just need to retry.
Thank you, this makes sense. We can also remove the redundant target_link_options_shared_lib calls in a lot of subdirectories.
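For illustration, a hypothetical before/after for one such subdirectory, assuming `executorch-config.cmake` now attaches the whole-archive link options to the exported targets (the consumer target name here is made up):

```cmake
# Hypothetical subdirectory CMakeLists.txt fragment; names are illustrative.

# Before: every consumer had to re-apply the force-load/whole-archive flags.
# target_link_options_shared_lib(portable_ops_lib)

# After: just link the target; the interface link options attached in
# executorch-config.cmake already propagate to the final executable.
target_link_libraries(my_example_runner PRIVATE executorch portable_ops_lib)
```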
This is the error that the redundant linking causes: https://github.com/pytorch/executorch/actions/runs/16207706949/job/45761626062
After pytorch/executorch#12320 we started to add `--whole-archive` to libraries in `executorch-config.cmake`. This means that when we `find_package(executorch)` and link `${EXECUTORCH_LIBRARIES}`, we now actually link the portable kernels: `portable_ops_lib`. This behavior change is breaking because, outside of experimental, when we build the `llama_runner` library we also link `optimized_native_cpu_ops_lib` (which contains exactly the same ops but different kernels): https://github.com/pytorch/executorch/blob/main/examples/models/llama/CMakeLists.txt#L95-L101

This results in a runtime error:

```
+ ./cmake-out/examples/models/llama/llama_main --model_path=model.pte --tokenizer_path=tokenizer.bin '--prompt=Once upon a time,'
I tokenizers:regex.cpp:27] Registering override fallback regex
E 00:00:00.000453 executorch:operator_registry.cpp:89] Re-registering aten::_cdist_forward.out, from NOT_SUPPORTED
I 00:00:00.000494 executorch:operator_registry.cpp:90] key: (null), is_fallback: true
F 00:00:00.000497 executorch:operator_registry.cpp:114] In function register_kernels(), assert failed (false): Kernel registration failed with error 18, see error log for details.
```

Like in this job: https://github.com/pytorch/executorch/actions/runs/16207706949/job/45761626062

To fix this, we change the logic from linking `${EXECUTORCH_LIBRARIES}` (which includes both the core runtime and the kernel ops lib) to linking only `executorch_core`, because the kernel ops lib will be linked outside anyway.
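For illustration, a minimal consumer-side sketch of that fix, assuming the installed package exports an `executorch_core` target and that exactly one kernel library should supply the operator registrations (the project layout and file names here are assumptions):

```cmake
# Sketch only; adjust target names and sources to your build.
cmake_minimum_required(VERSION 3.19)
project(llama_runner_example CXX)

# Pull in the installed ExecuTorch package.
find_package(executorch CONFIG REQUIRED)

add_executable(llama_main main.cpp)

# Link only the core runtime rather than ${EXECUTORCH_LIBRARIES}, so the
# package does not also force-load portable_ops_lib behind our back ...
target_link_libraries(llama_main PRIVATE executorch_core)

# ... and let exactly one kernel library register the operators, avoiding
# the "Re-registering aten::..." failure shown in the log above.
target_link_libraries(llama_main PRIVATE optimized_native_cpu_ops_lib)
```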
Yep, agreed. We should do that after we have CMake top-level targets ready.
Customers need the flags this sets (`--whole-archive` or `-force_load`); otherwise the built operator libraries won't actually register operators.

Testing: ~~I'm not sure that we have automated coverage for this file, so I'm sending a first version adding an intentional break.~~ We have automated coverage for this file.
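For context on why these flags are needed: the kernel libraries register their operators from static initializers, and a plain static-library link drops object files that nothing references, so those initializers never run. Below is a rough, hypothetical sketch of the kind of thing a helper like `target_link_options_shared_lib` has to do; it is not ExecuTorch's actual implementation, and the function name is illustrative:

```cmake
# Hypothetical helper; a sketch only, not the real target_link_options_shared_lib.
function(force_load_whole_archive target_name)
  if(APPLE)
    # Apple's ld: -force_load keeps every object file of the archive.
    target_link_options(${target_name} INTERFACE
      "LINKER:-force_load,$<TARGET_FILE:${target_name}>")
  else()
    # GNU ld / lld: bracket the archive with --whole-archive / --no-whole-archive.
    target_link_options(${target_name} INTERFACE
      "LINKER:--whole-archive,$<TARGET_FILE:${target_name}>,--no-whole-archive")
  endif()
endfunction()

# Applied to an operator library target (assumed to exist), the options travel
# with the target, so any executable linking it keeps the registration code.
force_load_whole_archive(portable_ops_lib)
```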