Description
This relates to the SynoCommunity spksrc project, which builds and packages various open source software to run on Synology NAS devices.
I've been running into an issue where Rust code cross-compiled for the powerpc arch (more specifically qoriq) segfaults. The exact same code builds and runs perfectly fine for all other archs (armv5, armv7, aarch64, x86_64, i686). Ref: SynoCommunity/spksrc#5847, SynoCommunity/spksrc#5684
I've taken two different approaches:
- Using `RUSTFLAGS` to reproduce the CFLAGS we use: `-mcpu=8548 -mhard-float -mfloat-gprs=double` (SynoCommunity/spksrc@90ad41d)
- Building a tier 3 `powerpc-unknown-linux-gnuspe` target <<-- testing code only in my local branch for now
Option 1
Using the default powerpc-unknown-linux-gnu target along with RUSTFLAGS = -Ctarget-cpu=e500 leads to the exact same result: a segfault at startup. I may not be using the right RUSTFLAGS?
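For reference, a minimal sketch of what option 1 amounts to when run by hand (the -Clinker value is an assumption on my side, based on the Synology toolchain prefix; in spksrc this is driven from the Makefiles instead):
export RUSTFLAGS="-Ctarget-cpu=e500 -Clinker=powerpc-e500v2-linux-gnuspe-gcc"
cargo build --target powerpc-unknown-linux-gnu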
Option 2
As for option 2, I feel I'm digging my own hole, as I'm unable to build a powerpc-unknown-linux-gnuspe target using the Synology-provided toolchain and toolkit. I'm able to build up to stage1 and stage2, but unable to get a fully working target using cargo along with either my stage1 or stage2 builds... (clearly there is something I'm not fully understanding). Here's what I have so far:
git clone --depth 1 https://github.com/rust-lang/rust.git
./x setup compiler
# stage1 build, with the Synology cross-toolchain on PATH
PATH="$(WORK_DIR)/$(TC_TARGET)/bin:$${PATH}" ./x build --target $(RUST_TARGET)
rustup toolchain link powerpc-stage1 $(WORK_DIR)/rust/build/host/stage1
# stage2 build
PATH="$(WORK_DIR)/$(TC_TARGET)/bin:$${PATH}" ./x build --stage 2 --target $(RUST_TARGET)
rustup toolchain link powerpc-stage2 $(WORK_DIR)/rust/build/host/stage2
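For completeness, the config.toml wiring I'm assuming has to go along with the commands above so that ./x picks up the Synology cross tools for the target (only the gcc name is confirmed on my side; the g++/ar names are guesses following the same prefix):
echo '[target.powerpc-unknown-linux-gnuspe]' >> $(WORK_DIR)/rust/config.toml
echo 'cc = "powerpc-e500v2-linux-gnuspe-gcc"' >> $(WORK_DIR)/rust/config.toml
echo 'cxx = "powerpc-e500v2-linux-gnuspe-g++"' >> $(WORK_DIR)/rust/config.toml
echo 'ar = "powerpc-e500v2-linux-gnuspe-ar"' >> $(WORK_DIR)/rust/config.toml
echo 'linker = "powerpc-e500v2-linux-gnuspe-gcc"' >> $(WORK_DIR)/rust/config.toml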
--->> Up to this point all working OK <<---
Where it then fails:
rustup override set nightly
echo "[llvm]" >> $(WORK_DIR)/rust/config.toml
echo "allow-old-toolchain = true" >> $(WORK_DIR)/rust/config.toml <<-- Presuming as `powerpc-e500v2-linux-gnuspe-gcc` is v4.9.3
PATH="$(WORK_DIR)/$(TC_TARGET)/bin:$${PATH}" POWERPC_UNKNOWN_LINUX_GNUSPE_OPENSSL_DIR=$(WORK_DIR)/../../../toolkit/syno-$(ARCH)-$(TCVERSION)/work/usr/ RUST_BACKTRACE=full cargo +$(firstword $(subst -, ,$(RUST_TARGET)))-stage1 build -Zbuild-std=core,alloc --target powerpc-unknown-linux-gnuspe
The error relates to LLVM (even though it was built successfully during both the stage1 and stage2 builds):
error: failed to run custom build command for `rustc_llvm v0.0.0 (/home/spksrc/qoriq-debug/spksrc/toolchain/syno-qoriq-6.2.4/work/rust/compiler/rustc_llvm)`
Caused by:
process didn't exit successfully: `/home/spksrc/qoriq-debug/spksrc/toolchain/syno-qoriq-6.2.4/work/rust/target/debug/build/rustc_llvm-dc7ac9c0f6cecf54/build-script-build` (exit status: 101)
--- stdout
cargo:rustc-check-cfg=values(llvm_component,"ipo")
cargo:rustc-check-cfg=values(llvm_component,"bitreader")
cargo:rustc-check-cfg=values(llvm_component,"bitwriter")
cargo:rustc-check-cfg=values(llvm_component,"linker")
cargo:rustc-check-cfg=values(llvm_component,"asmparser")
cargo:rustc-check-cfg=values(llvm_component,"lto")
cargo:rustc-check-cfg=values(llvm_component,"coverage")
cargo:rustc-check-cfg=values(llvm_component,"instrumentation")
cargo:rustc-check-cfg=values(llvm_component,"x86")
cargo:rustc-check-cfg=values(llvm_component,"arm")
cargo:rustc-check-cfg=values(llvm_component,"aarch64")
cargo:rustc-check-cfg=values(llvm_component,"amdgpu")
cargo:rustc-check-cfg=values(llvm_component,"avr")
cargo:rustc-check-cfg=values(llvm_component,"loongarch")
cargo:rustc-check-cfg=values(llvm_component,"m68k")
cargo:rustc-check-cfg=values(llvm_component,"csky")
cargo:rustc-check-cfg=values(llvm_component,"mips")
cargo:rustc-check-cfg=values(llvm_component,"powerpc")
cargo:rustc-check-cfg=values(llvm_component,"systemz")
cargo:rustc-check-cfg=values(llvm_component,"jsbackend")
cargo:rustc-check-cfg=values(llvm_component,"webassembly")
cargo:rustc-check-cfg=values(llvm_component,"msp430")
cargo:rustc-check-cfg=values(llvm_component,"sparc")
cargo:rustc-check-cfg=values(llvm_component,"nvptx")
cargo:rustc-check-cfg=values(llvm_component,"hexagon")
cargo:rustc-check-cfg=values(llvm_component,"riscv")
cargo:rustc-check-cfg=values(llvm_component,"bpf")
cargo:rerun-if-env-changed=RUST_CHECK
cargo:rerun-if-env-changed=REAL_LIBRARY_PATH_VAR
--- stderr
thread 'main' panicked at compiler/rustc_llvm/build.rs:51:59:
REAL_LIBRARY_PATH_VAR
stack backtrace:
0: rust_begin_unwind
1: core::panicking::panic_fmt
2: core::option::expect_failed
3: core::option::Option<T>::expect
at /home/spksrc/qoriq-debug/spksrc/toolchain/syno-qoriq-6.2.4/work/rust/library/core/src/option.rs:888:21
4: build_script_build::restore_library_path
at ./build.rs:51:15
5: build_script_build::main
at ./build.rs:113:5
6: core::ops::function::FnOnce::call_once
at /home/spksrc/qoriq-debug/spksrc/toolchain/syno-qoriq-6.2.4/work/rust/library/core/src/ops/function.rs:250:5
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
warning: build failed, waiting for other jobs to finish...
make[2]: *** [../../mk/spksrc.tc-rust.mk:64: rustc_target] Error 101
/home/spksrc/qoriq-debug/spksrc/cross/bat/work-qoriq-6.2.4/tc_vars.mk:1: *** An error occured while setting up the toolchain, please check the messages above. Stop.
make[1]: Leaving directory '/home/spksrc/qoriq-debug/spksrc/cross/bat'
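If I read the backtrace correctly, restore_library_path() in compiler/rustc_llvm/build.rs expects environment variables that ./x normally exports before it drives cargo, so calling cargo directly leaves them unset. A sketch of what I think supplying them by hand would look like (variable names taken from the panic above; the llvm-config path is an assumption based on my build tree, and I haven't verified this actually gets any further):
PATH="$(WORK_DIR)/$(TC_TARGET)/bin:$${PATH}" \
REAL_LIBRARY_PATH_VAR=LD_LIBRARY_PATH \
REAL_LIBRARY_PATH="$${LD_LIBRARY_PATH}" \
LLVM_CONFIG=$(WORK_DIR)/rust/build/host/llvm/bin/llvm-config \
cargo +powerpc-stage1 build -Zbuild-std=core,alloc --target powerpc-unknown-linux-gnuspe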
Option 3
Now, after further reading, as option 3 I may be able to "rebuild" the tier 1 powerpc-unknown-linux-gnu target by adding the proper target features, as in #117347. But I'm guessing I'd have to go through the same process as option 2?
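For what it's worth, rather than rebuilding the compiler right away, I wonder if the same idea could first be tested with a custom target spec JSON (sketch only; which cpu/features values are actually correct for e500v2/SPE is exactly what I'm unsure about):
rustc +nightly -Zunstable-options --print target-spec-json --target powerpc-unknown-linux-gnu > powerpc-e500.json
# edit "cpu" / "features" in powerpc-e500.json (e.g. "cpu": "e500"), then:
cargo +nightly build -Zbuild-std=core,alloc --target ./powerpc-e500.json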