Hi Lup,

Again, thanks for the amazing work on CI! I'm planning to provide a PC for
the build farm too.

I've read your articles about it, but it isn't clear how I should
proceed after uploading the gists to my GitHub account. Should I run the
scraping myself with https://github.com/lupyuen/ingest-nuttx-builds, or
do you take over as soon as I provide the gists?

Am I missing something in the articles?

Best regards,

On Tue, Oct 29, 2024 at 3:43 PM Alan C. Assis <acas...@gmail.com>
wrote:

> Alin, maybe we need to add READONLY to the ldscript:
>
>
> https://stackoverflow.com/questions/73429929/gnu-linker-elf-has-a-load-segment-with-rwx-permissions-embedded-arm-project
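A minimal sketch of what that fix could look like, per the Stack Overflow thread: the READONLY output-section keyword (binutils 2.36 and later) keeps a flash-resident section out of a writable LOAD segment. The section and memory-region names below are placeholders, not NuttX's actual ldscript:

```
SECTIONS
{
  /* Hypothetical fragment: mark this flash-resident section READONLY so
     the linker does not fold it into a writable (RWX) LOAD segment.
     Requires binutils >= 2.36. */
  .preinit_array (READONLY) :
  {
    KEEP(*(.preinit_array))
  } > flash
}
```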
>
> BR,
>
> Alan
>
> On Mon, Oct 28, 2024 at 5:42 AM Alin Jerpelea <jerpe...@gmail.com> wrote:
>
> > Hi all,
> >
> > Is anyone aware of this warning?
> > riscv-none-elf-ld: warning: /nuttx/nuttx has a LOAD segment with RWX
> > permissions
> >
> > Best regards
> > Alin
> >
> > On Mon, Oct 28, 2024 at 9:31 AM Alin Jerpelea <jerpe...@gmail.com>
> wrote:
> >
> > > Hi Lup,
> > >
> > > I think we should all push the logs as-is to
> > > https://gist.github.com/nuttxpr, in separate folders named after the
> > > build target (e.g. arm-01), with the logs renamed to
> > > platform_buildtime.log; or use platform/board/config folders, with
> > > the logs renamed to platform_board_config_buildtime.log.
> > >
> > > This should simplify the scripting and display.
> > >
> > > What do you think?
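A quick sketch of that naming scheme as a shell helper (the paths and names are illustrative, and the echo stands in for the real log produced by run-ci.sh):

```shell
# File each log per the proposed scheme:
#   <target-group>/<platform>_<board>_<config>_<buildtime>.log
group=arm-01
platform=stm32
board=nucleo-f334r8
config=nsh
buildtime=$(date -u +%Y%m%d-%H%M%S)

mkdir -p "$group"
echo "sample build output" > ci-output.log    # stand-in for the real log
mv ci-output.log "$group/${platform}_${board}_${config}_${buildtime}.log"
```

Each folder could then be pushed as one Gist, e.g. with `gh gist create <folder>/*.log`.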
> > >
> > > On Mon, Oct 28, 2024 at 9:25 AM Lee, Lup Yuen <lu...@appkaki.com>
> wrote:
> > >
> > >> << the results from my test are available on
> > >> https://gist.github.com/jerpelea >>
> > >>
> > >> That's awesome Alin, thanks! :-)
> > >>
> > >> << I think we should push all results to a Git repo, sorted by
> > >> date and platform/board, then create a simple heatmap (green/red)
> > >> from the latest build >>
> > >>
> > >> Yep lemme figure out if open-source Grafana can do this (with some
> > >> scripting): https://grafana.com/oss/grafana/
> > >>
> > >> Lup
> > >>
> > >> On Mon, Oct 28, 2024 at 4:22 PM Alin Jerpelea <jerpe...@gmail.com>
> > wrote:
> > >>
> > >> > Hi all,
> > >> > the results from my test are available on
> > >> https://gist.github.com/jerpelea
> > >> >
> > >> > I think we should push all results to a Git repo, sorted by date
> > >> > and platform/board, then create a simple heatmap (green/red)
> > >> > from the latest build.
> > >> > @lup, what do you think?
> > >> >
> > >> >
> > >> > Best regards
> > >> > Alin
> > >> >
> > >> > On Mon, Oct 28, 2024 at 9:15 AM Alin Jerpelea <jerpe...@gmail.com>
> > >> wrote:
> > >> >
> > >> > > Hi Lup,
> > >> > >
> > >> > > please add to the guide
> > >> > > "gh auth login" so that users can upload the results
> > >> > >
> > >> > > Best regards
> > >> > > Alin
> > >> > >
> > >> > >
> > >> > > On Mon, Oct 28, 2024 at 9:12 AM Lee, Lup Yuen <lu...@appkaki.com>
> > >> wrote:
> > >> > >
> > >> > >> << please add to the guide "apt install gh " on host os >>
> > >> > >>
> > >> > >> Yep thanks Alin! I have updated the article:
> > >> > >>
> > >> > >>
> > >> >
> > >>
> >
> https://lupyuen.codeberg.page/articles/ci2.html#build-nuttx-for-all-target-groups
> > >> > >>
> > >> > >> ## Download the scripts
> > >> > >> git clone https://github.com/lupyuen/nuttx-release
> > >> > >> cd nuttx-release
> > >> > >>
> > >> > >> ## Login to GitHub in Headless Mode
> > >> > >> sudo apt install gh
> > >> > >> sudo gh auth login
> > >> > >>
> > >> > >> ## Run the Build Job forever: arm-01 ... arm-14
> > >> > >> sudo ./run-ci.sh
> > >> > >>
> > >> > >>
> > >> > >> Lup
> > >> > >>
> > >> > >> On Mon, Oct 28, 2024 at 4:05 PM Alin Jerpelea <
> jerpe...@gmail.com>
> > >> > wrote:
> > >> > >>
> > >> > >> > Hi Lup
> > >> > >> >
> > >> > >> > please add to the guide
> > >> > >> > "apt install gh "
> > >> > >> > on host os
> > >> > >> >
> > >> > >> > Best regards
> > >> > >> > Alin
> > >> > >> >
> > >> > >> > On Mon, Oct 28, 2024 at 8:50 AM Lee, Lup Yuen <
> lu...@appkaki.com
> > >
> > >> > >> wrote:
> > >> > >> >
> > >> > >> > > Thanks Alin, I think the fix is here:
> > >> > >> > > https://github.com/apache/nuttx/pull/14527
> > >> > >> > >
> > >> > >> > > Lup
> > >> > >> > >
> > >> > >> > > On Mon, Oct 28, 2024 at 3:43 PM Alin Jerpelea <
> > >> jerpe...@gmail.com>
> > >> > >> > wrote:
> > >> > >> > >
> > >> > >> > > > Cmake in present:
> > >> stm32f334-disco/nsh,CONFIG_ARM_TOOLCHAIN_CLANG
> > >> > >> > > > Configuration/Tool:
> > >> stm32f334-disco/nsh,CONFIG_ARM_TOOLCHAIN_CLANG
> > >> > >> > > > 2024-10-28 07:41:50
> > >> > >> > > >
> > >> > >> > > >
> > >> > >> > >
> > >> > >> >
> > >> > >>
> > >> >
> > >>
> >
> ------------------------------------------------------------------------------------
> > >> > >> > > >   Cleaning...
> > >> > >> > > >   Configuring...
> > >> > >> > > > CMake Warning at cmake/nuttx_kconfig.cmake:171 (message):
> > >> > >> > > >   Kconfig Configuration Error: warning:
> > >> STM32_HAVE_HRTIM1_PLLCLK
> > >> > >> > (defined
> > >> > >> > > > at
> > >> > >> > > >   arch/arm/src/stm32/Kconfig:8109) has direct dependencies
> > >> > >> STM32_HRTIM
> > >> > >> > &&
> > >> > >> > > >   ARCH_CHIP_STM32 && ARCH_ARM with value n, but is
> currently
> > >> being
> > >> > >> > > > y-selected
> > >> > >> > > >   by the following symbols:
> > >> > >> > > >
> > >> > >> > > >    - STM32_STM32F33XX (defined at
> > >> > arch/arm/src/stm32/Kconfig:1533),
> > >> > >> > with
> > >> > >> > > > value y, direct dependencies ARCH_CHIP_STM32 && ARCH_ARM
> > >> (value:
> > >> > y),
> > >> > >> > and
> > >> > >> > > > select condition ARCH_CHIP_STM32 && ARCH_ARM (value: y)
> > >> > >> > > >
> > >> > >> > > > Call Stack (most recent call first):
> > >> > >> > > >   CMakeLists.txt:322 (nuttx_olddefconfig)
> > >> > >> > > >
> > >> > >> > > >
> > >> > >> > > >   Select HOST_LINUX=y
> > >> > >> > > > CMake Warning at cmake/nuttx_kconfig.cmake:192 (message):
> > >> > >> > > >   Kconfig Configuration Error: warning:
> > >> STM32_HAVE_HRTIM1_PLLCLK
> > >> > >> > (defined
> > >> > >> > > > at
> > >> > >> > > >   arch/arm/src/stm32/Kconfig:8109) has direct dependencies
> > >> > >> STM32_HRTIM
> > >> > >> > &&
> > >> > >> > > >   ARCH_CHIP_STM32 && ARCH_ARM with value n, but is
> currently
> > >> being
> > >> > >> > > > y-selected
> > >> > >> > > >   by the following symbols:
> > >> > >> > > >
> > >> > >> > > >    - STM32_STM32F33XX (defined at
> > >> > arch/arm/src/stm32/Kconfig:1533),
> > >> > >> > with
> > >> > >> > > > value y, direct dependencies ARCH_CHIP_STM32 && ARCH_ARM
> > >> (value:
> > >> > y),
> > >> > >> > and
> > >> > >> > > > select condition ARCH_CHIP_STM32 && ARCH_ARM (value: y)
> > >> > >> > > >
> > >> > >> > > > Call Stack (most recent call first):
> > >> > >> > > >   cmake/nuttx_sethost.cmake:107 (nuttx_setconfig)
> > >> > >> > > >   CMakeLists.txt:333 (nuttx_sethost)
> > >> > >> > > >
> > >> > >> > > >
> > >> > >> > > >   Disabling CONFIG_ARM_TOOLCHAIN_BUILDROOT
> > >> > >> > > >   Enabling CONFIG_ARM_TOOLCHAIN_CLANG
> > >> > >> > > >   Building NuttX...
> > >> > >> > > >
> > >> > >> > > > On Mon, Oct 28, 2024 at 8:18 AM Alin Jerpelea <
> > >> jerpe...@gmail.com
> > >> > >
> > >> > >> > > wrote:
> > >> > >> > > >
> > >> > >> > > > > Hi Lup,
> > >> > >> > > > > I found another one
> > >> > >> > > > >
> > >> > >> > > > > Cmake in present:
> > >> nucleo-f334r8/adc,CONFIG_ARM_TOOLCHAIN_CLANG
> > >> > >> > > > > Configuration/Tool:
> > >> nucleo-f334r8/adc,CONFIG_ARM_TOOLCHAIN_CLANG
> > >> > >> > > > > 2024-10-28 07:17:15
> > >> > >> > > > >
> > >> > >> > > > >
> > >> > >> > > >
> > >> > >> > >
> > >> > >> >
> > >> > >>
> > >> >
> > >>
> >
> ------------------------------------------------------------------------------------
> > >> > >> > > > >   Cleaning...
> > >> > >> > > > >   Configuring...
> > >> > >> > > > > CMake Warning at cmake/nuttx_kconfig.cmake:171 (message):
> > >> > >> > > > >   Kconfig Configuration Error: warning:
> > >> STM32_HAVE_HRTIM1_PLLCLK
> > >> > >> > > (defined
> > >> > >> > > > > at
> > >> > >> > > > >   arch/arm/src/stm32/Kconfig:8109) has direct
> dependencies
> > >> > >> > STM32_HRTIM
> > >> > >> > > &&
> > >> > >> > > > >   ARCH_CHIP_STM32 && ARCH_ARM with value n, but is
> > currently
> > >> > being
> > >> > >> > > > > y-selected
> > >> > >> > > > >   by the following symbols:
> > >> > >> > > > >
> > >> > >> > > > >    - STM32_STM32F33XX (defined at
> > >> > >> arch/arm/src/stm32/Kconfig:1533),
> > >> > >> > > with
> > >> > >> > > > > value y, direct dependencies ARCH_CHIP_STM32 && ARCH_ARM
> > >> (value:
> > >> > >> y),
> > >> > >> > > and
> > >> > >> > > > > select condition ARCH_CHIP_STM32 && ARCH_ARM (value: y)
> > >> > >> > > > >
> > >> > >> > > > > Call Stack (most recent call first):
> > >> > >> > > > >   CMakeLists.txt:322 (nuttx_olddefconfig)
> > >> > >> > > > >
> > >> > >> > > > >
> > >> > >> > > > >   Select HOST_LINUX=y
> > >> > >> > > > > CMake Warning at cmake/nuttx_kconfig.cmake:192 (message):
> > >> > >> > > > >   Kconfig Configuration Error: warning:
> > >> STM32_HAVE_HRTIM1_PLLCLK
> > >> > >> > > (defined
> > >> > >> > > > > at
> > >> > >> > > > >   arch/arm/src/stm32/Kconfig:8109) has direct
> dependencies
> > >> > >> > STM32_HRTIM
> > >> > >> > > &&
> > >> > >> > > > >   ARCH_CHIP_STM32 && ARCH_ARM with value n, but is
> > currently
> > >> > being
> > >> > >> > > > > y-selected
> > >> > >> > > > >   by the following symbols:
> > >> > >> > > > >
> > >> > >> > > > >    - STM32_STM32F33XX (defined at
> > >> > >> arch/arm/src/stm32/Kconfig:1533),
> > >> > >> > > with
> > >> > >> > > > > value y, direct dependencies ARCH_CHIP_STM32 && ARCH_ARM
> > >> (value:
> > >> > >> y),
> > >> > >> > > and
> > >> > >> > > > > select condition ARCH_CHIP_STM32 && ARCH_ARM (value: y)
> > >> > >> > > > >
> > >> > >> > > > > Call Stack (most recent call first):
> > >> > >> > > > >   cmake/nuttx_sethost.cmake:107 (nuttx_setconfig)
> > >> > >> > > > >   CMakeLists.txt:333 (nuttx_sethost)
> > >> > >> > > > >
> > >> > >> > > > >
> > >> > >> > > > >   Disabling CONFIG_ARM_TOOLCHAIN_GNU_EABI
> > >> > >> > > > >   Enabling CONFIG_ARM_TOOLCHAIN_CLANG
> > >> > >> > > > >   Building NuttX...
> > >> > >> > > > >
> > >> > >> > > > > Thanks
> > >> > >> > > > > Alin
> > >> > >> > > > >
> > >> > >> > > > > On Mon, Oct 28, 2024 at 4:40 AM Lee, Lup Yuen <
> > >> > lu...@appkaki.com>
> > >> > >> > > wrote:
> > >> > >> > > > >
> > >> > >> > > > >> << needed on host machine (please update the article)
> > >> > >> > > > >> apt install gcc-arm-none-eabi binutils-arm-none-eabi
> > >> genromfs
> > >> > >>
> > >> > >> > > > >>
> > >> > >> > > > >> Hi Alin: This is super strange. genromfs isn't
> > >> > >> > > > >> installed on my Host Machine:
> > >> > >> > > > >>
> > >> > >> > > > >> ## genromfs isn't installed on my Host Machine
> > >> > >> > > > >> $ genromfs -h
> > >> > >> > > > >> Command 'genromfs' not found
> > >> > >> > > > >>
> > >> > >> > > > >> ## genromfs works fine inside Docker
> > >> > >> > > > >> $ sudo docker run -it \
> > >> > >> > > > >>   ghcr.io/apache/nuttx/apache-nuttx-ci-linux:latest \
> > >> > >> > > > >>   /bin/bash -c "genromfs -h"
> > >> > >> > > > >> genromfs 0.5.2
> > >> > >> > > > >>
> > >> > >> > > > >> Is anyone else having problems building NuttX with
> > >> > >> > > > >> our Docker Image? Please lemme know thanks!
> > >> > >> > > > >>
> > >> > >> > > > >> << /usr/bin/bash: line 1: arm-nuttx-eabi-gcc: command
> not
> > >> found
> > >> > >> >>
> > >> > >> > > > >>
> > >> > >> > > > >> This is a harmless message, we're tracking the issue
> here:
> > >> > >> > > > >> https://github.com/apache/nuttx/issues/14374
> > >> > >> > > > >>
> > >> > >> > > > >> Lup
> > >> > >> > > > >>
> > >> > >> > > > >> On Mon, Oct 28, 2024 at 11:23 AM Alin Jerpelea <
> > >> > >> jerpe...@gmail.com>
> > >> > >> > > > >> wrote:
> > >> > >> > > > >>
> > >> > >> > > > >> > Hi Lup,
> > >> > >> > > > >> >
> > >> > >> > > > >> > needed on host machine (please update the article)
> > >> > >> > > > >> >
> > >> > >> > > > >> > apt install gcc-arm-none-eabi binutils-arm-none-eabi
> > >> genromfs
> > >> > >> > > > >> >
> > >> > >> > > > >> > The error is still unidentified:
> > >> > >> > > > >> >
> > >> > >> > > > >> > Configuration/Tool: c5471evm/nettest,CONFIG_ARM_TOOLCHAIN_GNU_EABI
> > >> > >> > > > >> > 2024-10-28 04:20:28
> > >> > >> > > > >> >
> > >> > >> > > > >> >
> > >> > >> > > > >>
> > >> > >> > > >
> > >> > >> > >
> > >> > >> >
> > >> > >>
> > >> >
> > >>
> >
> ------------------------------------------------------------------------------------
> > >> > >> > > > >> >   Cleaning...
> > >> > >> > > > >> >   Configuring...
> > >> > >> > > > >> >   Disabling CONFIG_ARM_TOOLCHAIN_BUILDROOT
> > >> > >> > > > >> >   Enabling CONFIG_ARM_TOOLCHAIN_GNU_EABI
> > >> > >> > > > >> >   Building NuttX...
> > >> > >> > > > >> >   Normalize c5471evm/nettest
> > >> > >> > > > >> > /usr/bin/bash: line 1: arm-nuttx-eabi-gcc: command not
> > >> found
> > >> > >> > > > >> > /usr/bin/bash: line 1: arm-nuttx-eabi-gcc: command not
> > >> found
> > >> > >> > > > >> >
> > >> > >> > > > >> > Best Regards
> > >> > >> > > > >> >
> > >> > >> > > > >> > Alin
> > >> > >> > > > >> >
> > >> > >> > > > >> >
> > >> > >> > > > >> > On Mon, Oct 28, 2024 at 3:25 AM Lee, Lup Yuen <
> > >> > >> lu...@appkaki.com>
> > >> > >> > > > >> wrote:
> > >> > >> > > > >> >
> > >> > >> > > > >> > > << /usr/bin/bash: line 1: genromfs: command not
> found
> > >>
> > >> > >> > > > >> > >
> > >> > >> > > > >> > > Hi Alin: That's very odd, genromfs is inside the
> > >> > >> > > > >> > > Docker Image so it shouldn't fail (unless we're
> > >> > >> > > > >> > > running outside Docker?). Here's how we check
> > >> > >> > > > >> > > genromfs:
> > >> > >> > > > >> > >
> > >> > >> > > > >> > > ## This will show "genromfs 0.5.2"
> > >> > >> > > > >> > > sudo docker run -it \
> > >> > >> > > > >> > >   ghcr.io/apache/nuttx/apache-nuttx-ci-linux:latest
> \
> > >> > >> > > > >> > >   /bin/bash -c "genromfs -h"
> > >> > >> > > > >> > >
> > >> > >> > > > >> > > Also spresense:elf builds OK on my Ubuntu PC:
> > >> > >> > > > >> > >
> > >> > >> > > > >> > >
> > >> > >> > > > >> >
> > >> > >> > > > >>
> > >> > >> > > >
> > >> > >> > >
> > >> > >> >
> > >> > >>
> > >> >
> > >>
> >
> https://gist.github.com/nuttxpr/8a203426383b84626c8a5bd06168bf9b#file-ci-arm-01-log-L359
> > >> > >> > > > >> > >
> > >> > >> > > > >> > > Could you try this (from my article) and lemme
> > >> > >> > > > >> > > know if it works? Thanks!
> > >> > >> > > > >> > >
> > >> > >> > > > >> > > ## Compile Target Group arm-01, including
> > spresense/elf
> > >> > >> > > > >> > > job=arm-01
> > >> > >> > > > >> > > sudo docker run -it \
> > >> > >> > > > >> > >   ghcr.io/apache/nuttx/apache-nuttx-ci-linux:latest
> \
> > >> > >> > > > >> > >   /bin/bash -c "
> > >> > >> > > > >> > >   cd ;
> > >> > >> > > > >> > >   pwd ;
> > >> > >> > > > >> > >   git clone https://github.com/apache/nuttx ;
> > >> > >> > > > >> > >   git clone https://github.com/apache/nuttx-apps
> > apps ;
> > >> > >> > > > >> > >   pushd nuttx ; echo NuttX Source:
> > >> > >> > > > >> > > https://github.com/apache/nuttx/tree/\$(git
> rev-parse
> > >> > HEAD)
> > >> > >> ;
> > >> > >> > > popd
> > >> > >> > > > ;
> > >> > >> > > > >> > >   pushd apps  ; echo NuttX Apps:
> > >> > >> > > > >> > > https://github.com/apache/nuttx-apps/tree/\$(git
> > >> rev-parse
> > >> > >> > HEAD)
> > >> > >> > > ;
> > >> > >> > > > >> popd
> > >> > >> > > > >> > ;
> > >> > >> > > > >> > >   sleep 10 ;
> > >> > >> > > > >> > >   cd nuttx/tools/ci ;
> > >> > >> > > > >> > >   (./cibuild.sh -c -A -N -R testlist/$job.dat ||
> echo
> > >> > '*****
> > >> > >> > BUILD
> > >> > >> > > > >> > FAILED')
> > >> > >> > > > >> > > ;
> > >> > >> > > > >> > > "
> > >> > >> > > > >> > >
> > >> > >> > > > >> > > Lup
> > >> > >> > > > >> > >
> > >> > >> > > > >> > > On Mon, Oct 28, 2024 at 10:01 AM Alin Jerpelea <
> > >> > >> > > jerpe...@gmail.com>
> > >> > >> > > > >> > wrote:
> > >> > >> > > > >> > >
> > >> > >> > > > >> > > > Hi Lup
> > >> > >> > > > >> > > > I started the test and found a few issues:
> > >> > >> > > > >> > > >
> > >> > >> > > > >> > > > Configuration/Tool:
> > >> > >> > spresense/elf,CONFIG_ARM_TOOLCHAIN_GNU_EABI
> > >> > >> > > > >> > > > 2024-10-28 02:49:20
> > >> > >> > > > >> > > >
> > >> > >> > > > >> > > >
> > >> > >> > > > >> > >
> > >> > >> > > > >> >
> > >> > >> > > > >>
> > >> > >> > > >
> > >> > >> > >
> > >> > >> >
> > >> > >>
> > >> >
> > >>
> >
> ------------------------------------------------------------------------------------
> > >> > >> > > > >> > > >   Cleaning...
> > >> > >> > > > >> > > >   Configuring...
> > >> > >> > > > >> > > >   Disabling CONFIG_ARM_TOOLCHAIN_GNU_EABI
> > >> > >> > > > >> > > >   Enabling CONFIG_ARM_TOOLCHAIN_GNU_EABI
> > >> > >> > > > >> > > >   Building NuttX...
> > >> > >> > > > >> > > > /usr/bin/bash: line 1: genromfs: command not found
> > >> > >> > > > >> > > > make[3]: *** [Makefile:81:
> > >> > >> > > > >> > > >
> > >> > /awork/nuttx/NuttX/farm/apps/examples/elf/tests/romfs.img]
> > >> > >> > Error
> > >> > >> > > > 127
> > >> > >> > > > >> > > > make[3]: Target 'all' not remade because of
> errors.
> > >> > >> > > > >> > > > make[2]: *** [Makefile:59: build] Error 2
> > >> > >> > > > >> > > > make[2]: Target 'all' not remade because of
> errors.
> > >> > >> > > > >> > > > make[1]: *** [Makefile:52:
> > >> > >> > > > >> > /awork/nuttx/NuttX/farm/apps/examples/elf_all]
> > >> > >> > > > >> > > > Error 2
> > >> > >> > > > >> > > > make[1]: Target 'all' not remade because of
> errors.
> > >> > >> > > > >> > > > make: *** [tools/LibTargets.mk:232:
> > >> > >> > > > >> > > /awork/nuttx/NuttX/farm/apps/libapps.a]
> > >> > >> > > > >> > > > Error 2
> > >> > >> > > > >> > > > make: Target 'all' not remade because of errors.
> > >> > >> > > > >> > > > /awork/nuttx/NuttX/farm/nuttx/tools/testbuild.sh:
> > line
> > >> > 385:
> > >> > >> > > > >> > > >
> > >> /awork/nuttx/NuttX/farm/nuttx/../nuttx/nuttx.manifest: No
> > >> > >> such
> > >> > >> > > > file
> > >> > >> > > > >> or
> > >> > >> > > > >> > > > directory
> > >> > >> > > > >> > > >   Normalize spresense/elf
> > >> > >> > > > >> > > >
> > >> > >> > > > >> > > > Configuration/Tool:
> > >> > >> > > > >> spresense/posix_spawn,CONFIG_ARM_TOOLCHAIN_GNU_EABI
> > >> > >> > > > >> > > > 2024-10-28 02:51:17
> > >> > >> > > > >> > > >
> > >> > >> > > > >> > > >
> > >> > >> > > > >> > >
> > >> > >> > > > >> >
> > >> > >> > > > >>
> > >> > >> > > >
> > >> > >> > >
> > >> > >> >
> > >> > >>
> > >> >
> > >>
> >
> ------------------------------------------------------------------------------------
> > >> > >> > > > >> > > >   Cleaning...
> > >> > >> > > > >> > > >   Configuring...
> > >> > >> > > > >> > > >   Disabling CONFIG_ARM_TOOLCHAIN_GNU_EABI
> > >> > >> > > > >> > > >   Enabling CONFIG_ARM_TOOLCHAIN_GNU_EABI
> > >> > >> > > > >> > > >   Building NuttX...
> > >> > >> > > > >> > > > /usr/bin/bash: line 1: genromfs: command not found
> > >> > >> > > > >> > > > make[3]: *** [Makefile:50:
> > >> > >> > > > >> > > >
> > >> > >> > > > >>
> > >> > >> > >
> > >> > >>
> > >>
> /awork/nuttx/NuttX/farm/apps/examples/posix_spawn/filesystem/romfs.img]
> > >> > >> > > > >> > > > Error 127
> > >> > >> > > > >> > > > make[3]: Target 'all' not remade because of
> errors.
> > >> > >> > > > >> > > > make[2]: *** [Makefile:47: build] Error 2
> > >> > >> > > > >> > > > make[2]: Target 'all' not remade because of
> errors.
> > >> > >> > > > >> > > > make[1]: *** [Makefile:52:
> > >> > >> > > > >> > > >
> > /awork/nuttx/NuttX/farm/apps/examples/posix_spawn_all]
> > >> > >> Error 2
> > >> > >> > > > >> > > > make[1]: Target 'all' not remade because of
> errors.
> > >> > >> > > > >> > > > make: *** [tools/LibTargets.mk:232:
> > >> > >> > > > >> > > /awork/nuttx/NuttX/farm/apps/libapps.a]
> > >> > >> > > > >> > > > Error 2
> > >> > >> > > > >> > > > make: Target 'all' not remade because of errors.
> > >> > >> > > > >> > > > /awork/nuttx/NuttX/farm/nuttx/tools/testbuild.sh:
> > line
> > >> > 385:
> > >> > >> > > > >> > > >
> > >> /awork/nuttx/NuttX/farm/nuttx/../nuttx/nuttx.manifest: No
> > >> > >> such
> > >> > >> > > > file
> > >> > >> > > > >> or
> > >> > >> > > > >> > > > directory
> > >> > >> > > > >> > > >   Normalize spresense/posix_spawn
> > >> > >> > > > >> > > >
> > >> > >> > > > >> > > > Can you please take a look?
> > >> > >> > > > >> > > >
> > >> > >> > > > >> > > > Best regards
> > >> > >> > > > >> > > >
> > >> > >> > > > >> > > > On Sun, Oct 27, 2024 at 11:47 PM Lee, Lup Yuen <
> > >> > >> > > lu...@appkaki.com
> > >> > >> > > > >
> > >> > >> > > > >> > > wrote:
> > >> > >> > > > >> > > >
> > >> > >> > > > >> > > > > << 1) Regarding the script that uploads CI
> > >> > >> > > > >> > > > > results to github gists: will this work for
> > >> > >> > > > >> > > > > anyone who runs the docker image? If not,
> > >> > >> > > > >> > > > > what should be done with the results? >>
> > >> > >> > > > >> > > > >
> > >> > >> > > > >> > > > > Thanks Nathan! I'm using GitHub Gists as a
> simple
> > >> way
> > >> > to
> > >> > >> > push
> > >> > >> > > > our
> > >> > >> > > > >> > Build
> > >> > >> > > > >> > > > > Logs to the cloud for further processing and
> > >> alerting.
> > >> > >> > (There
> > >> > >> > > > >> might
> > >> > >> > > > >> > be
> > >> > >> > > > >> > > a
> > >> > >> > > > >> > > > > better way)
> > >> > >> > > > >> > > > >
> > >> > >> > > > > I'm running my Build Server logged in as the `nuttxpr`
> > >> > >> > > > > GitHub Account (via `gh auth login`), so all Gists
> > >> > >> > > > > will be published under the `nuttxpr` account. Someone
> > >> > >> > > > > who runs the Docker Image will probably create a new
> > >> > >> > > > > GitHub Account to publish the Gists:
> > >> > >> > > > > https://gist.github.com/nuttxpr
> > >> > >> > > > >> > > > >
> > >> > >> > > > > What we need next: A script that will (1) Consolidate
> > >> > >> > > > > the Gists across Multiple GitHub Accounts (2) Scan the
> > >> > >> > > > > Build Logs for Errors and Warnings (3) Alert somebody.
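Step (2) could start as something like this sketch; the error patterns are guesses tuned to the logs quoted in this thread, and fetching each account's Gists for step (1) could use `gh api users/<account>/gists`:

```shell
# Hypothetical error/warning scan over a build log. The heredoc is a
# stand-in for a downloaded Gist; adjust the patterns against real logs.
cat > sample.log <<'EOF'
  Building NuttX...
/usr/bin/bash: line 1: genromfs: command not found
make[2]: *** [Makefile:59: build] Error 2
  Normalize spresense/elf
EOF

# Keep only lines that look like errors or warnings
grep -E 'Error [0-9]|warning:|command not found' sample.log > findings.txt
cat findings.txt
```

A non-empty findings.txt for any target would then trigger the alert in step (3).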
> > >> > >> > > > >> > > > >
> > >> > >> > > > >> > > > > << 2) Is there a way to detect (like a GPIO
> rising
> > >> or
> > >> > >> > falling
> > >> > >> > > > >> edge,
> > >> > >> > > > >> > for
> > >> > >> > > > >> > > > > lack
> > >> > >> > > > >> > > > > of a better description) that a build that
> > >> previously
> > >> > >> > > succeeded
> > >> > >> > > > is
> > >> > >> > > > >> > > > failing,
> > >> > >> > > > >> > > > > or a build that was previously failing succeeds
> > >> again,
> > >> > to
> > >> > >> > > notify
> > >> > >> > > > >> only
> > >> > >> > > > >> > > > about
> > >> > >> > > > >> > > > > targets that change status? >>
> > >> > >> > > > >> > > > >
> > >> > >> > > > > We have a problem: There doesn't seem to be an easy
> > >> > >> > > > > way to scan our Build Logs for Errors and Warnings:
> > >> > >> > > > >> > > > >
> > >> > >> > > > >> > >
> > >> > >> > > > >>
> > >> > >> > > >
> > >> > >> >
> > >> > >>
> > >> >
> > >>
> > https://lupyuen.codeberg.page/articles/ci2.html#find-errors-and-warnings
> > >> > >> > > > >> > > > >
> > >> > >> > > > > We should implement this Log Scanning in the script
> > >> > >> > > > > that I mentioned earlier. Then we can detect Failed
> > >> > >> > > > > Builds and alert somebody. Maybe through open-source
> > >> > >> > > > > Prometheus + Grafana:
> > >> > >> > > > > https://lupyuen.github.io/articles/prometheus
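Nathan's "rising/falling edge" question could be answered with a small diff over per-target status between runs. This is only a sketch with made-up status files; in practice they would come from the log-scanning script:

```shell
# Hypothetical per-run status files, one "<target> PASS|FAIL" per line
cat > prev.txt <<'EOF'
c5471evm/nettest FAIL
spresense/elf PASS
EOF
cat > curr.txt <<'EOF'
c5471evm/nettest PASS
nucleo-f334r8/adc PASS
spresense/elf FAIL
EOF

# Extract sorted target names per status
grep ' FAIL$' prev.txt | sed 's/ FAIL$//' | sort > prev_fail.txt
grep ' FAIL$' curr.txt | sed 's/ FAIL$//' | sort > curr_fail.txt
grep ' PASS$' curr.txt | sed 's/ PASS$//' | sort > curr_pass.txt

# Rising edge: failing now, wasn't failing before
comm -13 prev_fail.txt curr_fail.txt > newly_broken.txt
# Falling edge: passing now, was failing before
comm -12 prev_fail.txt curr_pass.txt > newly_fixed.txt

cat newly_broken.txt
cat newly_fixed.txt
```

Only the targets in newly_broken.txt and newly_fixed.txt would need an alert, so unchanged failures don't spam anybody.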
> > >> > >> > > > >> > > > >
> > >> > >> > > > >> > > > > << 3) Regarding M1 macs not being able to run
> the
> > CI
> > >> > >> > builds, I
> > >> > >> > > > >> > suggest
> > >> > >> > > > >> > > to
> > >> > >> > > > >> > > > > state that as a call to action, e.g., help
> wanted
> > to
> > >> > make
> > >> > >> > the
> > >> > >> > > > >> build
> > >> > >> > > > >> > > > succeed
> > >> > >> > > > >> > > > > on M1 macs, please see such-and-such issue on
> > >> github...
> > >> > >> >>
> > >> > >> > > > >> > > > >
> > >> > >> > > > >> > > > > Yep I'll post an Issue at our NuttX Repo. I'll
> > >> explain
> > >> > >> what
> > >> > >> > > > >> happens
> > >> > >> > > > >> > > when
> > >> > >> > > > >> > > > I
> > >> > >> > > > >> > > > > run the CI Build on my M2 Mac. Thanks!
> > >> > >> > > > >> > > > >
> > >> > >> > > > >> > > > > Lup
> > >> > >> > > > >> > > > >
> > >> > >> > > > >> > > > > On Mon, Oct 28, 2024 at 12:15 AM Nathan Hartman
> <
> > >> > >> > > > >> > > > hartman.nat...@gmail.com>
> > >> > >> > > > >> > > > > wrote:
> > >> > >> > > > >> > > > >
> > >> > >> > > > >> > > > > > Nice article Lup! Thank you. A few questions:
> > >> > >> > > > >> > > > > >
> > >> > >> > > > >> > > > > > 1) Regarding the script that uploads CI
> results
> > to
> > >> > >> github
> > >> > >> > > > gists:
> > >> > >> > > > >> > will
> > >> > >> > > > >> > > > > this
> > >> > >> > > > >> > > > > > work for anyone who runs the docker image? If
> > not,
> > >> > what
> > >> > >> > > should
> > >> > >> > > > >> be
> > >> > >> > > > >> > > done
> > >> > >> > > > >> > > > > with
> > >> > >> > > > >> > > > > > the results?
> > >> > >> > > > >> > > > > >
> > >> > >> > > > >> > > > > > 2) Is there a way to detect (like a GPIO
> rising
> > or
> > >> > >> falling
> > >> > >> > > > edge,
> > >> > >> > > > >> > for
> > >> > >> > > > >> > > > lack
> > >> > >> > > > >> > > > > > of a better description) that a build that
> > >> previously
> > >> > >> > > > succeeded
> > >> > >> > > > >> is
> > >> > >> > > > >> > > > > failing,
> > >> > >> > > > >> > > > > > or a build that was previously failing
> succeeds
> > >> > again,
> > >> > >> to
> > >> > >> > > > notify
> > >> > >> > > > >> > only
> > >> > >> > > > >> > > > > about
> > >> > >> > > > >> > > > > > targets that change status?
> > >> > >> > > > >> > > > > >
> > >> > >> > > > >> > > > > > 3) Regarding M1 macs not being able to run the
> > CI
> > >> > >> builds,
> > >> > >> > I
> > >> > >> > > > >> suggest
> > >> > >> > > > >> > > to
> > >> > >> > > > >> > > > > > state that as a call to action, e.g., help
> > wanted
> > >> to
> > >> > >> make
> > >> > >> > > the
> > >> > >> > > > >> build
> > >> > >> > > > >> > > > > succeed
> > >> > >> > > > >> > > > > > on M1 macs, please see such-and-such issue on
> > >> > github...
> > >> > >> > > > >> > > > > >
> > >> > >> > > > >> > > > > > Otherwise looks good and thanks again!
> > >> > >> > > > >> > > > > >
> > >> > >> > > > >> > > > > > Cheers,
> > >> > >> > > > >> > > > > > Nathan
> > >> > >> > > > >> > > > > >
> > >> > >> > > > >> > > > > > On Sat, Oct 26, 2024 at 6:08 PM Lee, Lup Yuen
> <
> > >> > >> > > > >> lu...@appkaki.com>
> > >> > >> > > > >> > > > wrote:
> > >> > >> > > > >> > > > > >
> > >> > >> > > > >> > > > > > > Refurbished Ubuntu PCs have become quite
> > >> > affordable.
> > >> > >> Can
> > >> > >> > > we
> > >> > >> > > > >> turn
> > >> > >> > > > >> > > them
> > >> > >> > > > >> > > > > > into
> > >> > >> > > > >> > > > > > > a (Low-Cost) Build Farm for NuttX?
> > >> > >> > > > >> > > > > > >
> > >> > >> > > > >> > > > > > > In this article we…
> > >> > >> > > > >> > > > > > > (1) Compile NuttX for a group of Arm32
> Boards
> > >> > >> > > > >> > > > > > > (2) Then scale up and compile NuttX for All
> > >> Arm32
> > >> > >> Boards
> > >> > >> > > > >> > > > > > > (3) Thanks to the Docker Image provided by
> > NuttX
> > >> > >> > > > >> > > > > > > (4) Why do this? Because GitHub Actions
> taught
> > >> us a
> > >> > >> > > Painful
> > >> > >> > > > >> > Lesson:
> > >> > >> > > > >> > > > > > > Freebies Won’t Last Forever!
> > >> > >> > > > >> > > > > > >
> > >> > >> > > > >> > > > > > > Check out the article:
> > >> > >> > > > >> > > > https://lupyuen.codeberg.page/articles/ci2.html
> > >> > >> > > > >> > > > > > >
> > >> > >> > > > >> > > > > > > Lup
> > >> > >> > > > >> > > > > > >
> > >> > >> > > > >> > > > > >
> > >> > >> > > > >> > > > >
> > >> > >> > > > >> > > >
> > >> > >> > > > >> > >
> > >> > >> > > > >> >
> > >> > >> > > > >>
> > >> > >> > > > >
> > >> > >> > > >
> > >> > >> > >
> > >> > >> >
> > >> > >>
> > >> > >
> > >> >
> > >>
> > >
> >
>
