AGL: "bitbake agl-demo-platform" hangs at task 16 - Yocto

I am an AGL and Poky newbie, and have followed the steps in https://wiki.automotivelinux.org/agl-distro/source-code
(I am running the following in a Docker container.)
$source meta-agl/scripts/aglsetup.sh -m qemux86-64 agl-demo agl-netboot
------------ aglsetup.sh: Starting
Configuration files already exist:
- /home/work/agl/build/conf/local.conf
- /home/work/agl/build/conf/bblayers.conf
Skipping configuration files generation.
Use option -f|--force to overwrite existing configuration.
Generating setup manifest: /home/work/agl/build/aglsetup.manifest ... OK
Generating setup file: /home/work/agl/build/agl-init-build-env ... OK
------------ aglsetup.sh: Done
Common targets are:
- meta-agl: (core system)
- agl-profile-core:
agl-image-boot
agl-image-minimal
agl-image-minimal-qa
- agl-profile-graphical:
agl-image-weston
- agl-profile-graphical-qt5:
agl-image-graphical-qt5
agl-image-graphical-qt5-crosssdk
- agl-profile-graphical-html5:
agl-demo-platform-html5
- meta-agl-demo: (demo with UI)
agl-image-ivi (base for ivi targets)
agl-image-ivi-qa
agl-image-ivi-crosssdk
agl-demo-platform (* default demo target)
agl-demo-platform-qa
agl-demo-platform-crosssdk
$bitbake agl-demo-platform
The build hangs here:
Initialising tasks: 100% |############################################################################################| Time: 0:00:05
Sstate summary: Wanted 2729 Found 0 Missed 2729 Current 0 (0% match, 0% complete)
NOTE: Executing SetScene Tasks
NOTE: Executing RunQueue Tasks
No currently running tasks (16 of 7400) 0% ||
In order to debug it, I ran
$bitbake -DDD agl-demo-platform
...
DEBUG: Full skip list {'/home/work/agl/meta-agl/meta-netboot/recipes-core/images/initramfs-netboot-image.bb:do_packagedata', '/home/work/agl/meta-agl/meta-netboot/recipes-core/images/initramfs-netboot-image.bb:do_install', '/home/work/agl/meta-agl-demo/recipes-platform/images/agl-demo-platform.bb:do_package', '/home/work/agl/meta-agl-demo/recipes-platform/images/agl-demo-platform.bb:do_compile', '/home/work/agl/meta-agl-demo/recipes-platform/images/agl-demo-platform.bb:do_install', '/home/work/agl/meta-agl-demo/recipes-platform/images/agl-demo-platform.bb:do_packagedata', '/home/work/agl/meta-agl-demo/recipes-platform/images/agl-demo-platform.bb:do_configure', '/home/work/agl/meta-agl/meta-netboot/recipes-core/images/initramfs-netboot-image.bb:do_configure', '/home/work/agl/meta-agl/meta-netboot/recipes-core/images/initramfs-netboot-image.bb:do_compile', '/home/work/agl/meta-agl/meta-netboot/recipes-core/images/initramfs-netboot-image.bb:do_package'}
DEBUG: Using runqueue scheduler 'speed'
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/quilt-native/0.65-r0.do_fetch.e8a4c952a66942653e36f289eaf68ca5 not available
NOTE: Running task 1 of 7400 (/home/work/agl/external/poky/meta/recipes-devtools/quilt/quilt-native_0.65.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/texinfo-dummy-native/1.0-r0.do_fetch.6af0fac94be624020d4ded1391838faa not available
NOTE: Running task 2 of 7400 (/home/work/agl/external/poky/meta/recipes-extended/texinfo-dummy-native/texinfo-dummy-native.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/gnu-config-native/20180713+gitAUTOINC+30d53fc428-r0.do_fetch.66a4b9fc46062c0ab4c3d6bf6838$8ef not available
NOTE: Running task 3 of 7400 (virtual:native:/home/work/agl/external/poky/meta/recipes-devtools/gnu-config/gnu-config_git.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/m4-native/1.4.18-r0.do_fetch.6762cc3ab39f2cedf73b612115bd959d not available
NOTE: Running task 4 of 7400 (/home/work/agl/external/poky/meta/recipes-devtools/m4/m4-native_1.4.18.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/autoconf-native/2.69-r11.do_fetch.25fa26d4261bb5d4666677301aa59479 not available
NOTE: Running task 5 of 7400 (virtual:native:/home/work/agl/external/poky/meta/recipes-devtools/autoconf/autoconf_2.69.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/automake-native/1.16.1-r0.do_fetch.0fd4964b1b460fad47bd3cfb55e06e3f not available
NOTE: Running task 6 of 7400 (virtual:native:/home/work/agl/external/poky/meta/recipes-devtools/automake/automake_1.16.1.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/libtool-native/2.4.6-r0.do_fetch.fb99da9a9824dd7b876403694f7b783a not available
NOTE: Running task 7 of 7400 (/home/work/agl/external/poky/meta/recipes-devtools/libtool/libtool-native_2.4.6.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/gettext-minimal-native/0.19.8.1-r0.do_fetch.d984cddf39092f50c5874c27f42c9627 not available
NOTE: Running task 8 of 7400 (/home/work/agl/external/poky/meta/recipes-core/gettext/gettext-minimal-native_0.19.8.1.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/xz-native/5.2.4-r0.do_fetch.eb624201d02d0135b086909af9a87977 not available
NOTE: Running task 9 of 7400 (virtual:native:/home/work/agl/external/poky/meta/recipes-extended/xz/xz_5.2.4.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/gmp-native/6.1.2-r0.do_fetch.d4d7e5eb8e67d572386a46cc21e57f8e not available
NOTE: Running task 10 of 7400 (virtual:native:/home/work/agl/external/poky/meta/recipes-support/gmp/gmp_6.1.2.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/flex-native/2.6.0-r0.do_fetch.588daad6e54df2fe977b08ef749ef523 not available
NOTE: Running task 11 of 7400 (virtual:native:/home/work/agl/external/poky/meta/recipes-devtools/flex/flex_2.6.0.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/zlib-native/1.2.11-r0.do_fetch.1fa21ab74fd7fedd15f87baac65b9dab not available
NOTE: Running task 12 of 7400 (virtual:native:/home/work/agl/external/poky/meta/recipes-core/zlib/zlib_1.2.11.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/autoconf-archive-native/2018.03.13-r0.do_fetch.e880edd4650611bf6f65e254102ba230 not available
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/autoconf-archive-native/2018.03.13-r0.do_fetch.e880edd4650611bf6f65e254102ba230 not available
NOTE: Running task 13 of 7400 (virtual:native:/home/work/agl/external/poky/meta/recipes-devtools/autoconf-archive/autoconf-archive_2018.03.13.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/mpfr-native/4.0.1-r0.do_fetch.34c76de4a18ded6152d2ff68820420c9 not available
NOTE: Running task 14 of 7400 (virtual:native:/home/work/agl/external/poky/meta/recipes-support/mpfr/mpfr_4.0.1.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/bison-native/3.0.4-r0.do_fetch.53556f21491498d19bb9e3b24cf725b2 not available
NOTE: Running task 15 of 7400 (virtual:native:/home/work/agl/external/poky/meta/recipes-devtools/bison/bison_3.0.4.bb:do_fetch)
DEBUG: Stampfile /home/work/agl/build/tmp/stamps/x86_64-linux/binutils-cross-x86_64/2.31.1-r0.do_fetch.14df04f9e0c741b374c8987222b85026 not available
NOTE: Running task 16 of 7400 (/home/work/agl/external/poky/meta/recipes-devtools/binutils/binutils-cross_2.31.bb:do_fetch)
While it hangs, ps -ef shows the following processes:
admin 3977 1430 0 10:48 pts/3 00:00:02 python3 /home/work/agl/external/poky/bitbake/bin/bitbake agl-demo-platform
admin 3996 1 7 10:48 ? 00:00:28 python3 /home/work/agl/external/poky/bitbake/bin/bitbake agl-demo-platform
admin 4108 3996 0 10:48 ? 00:00:00 python3 /home/work/agl/external/poky/bitbake/bin/bitbake-worker decafbad
It looks like 16(?) do_fetch tasks have been started. I have waited for an hour, but bitbake does not move forward.
My container does not have strace installed. Could someone please help me debug this?
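One strace-free way to see what a stuck worker is doing is to read /proc directly. This is only a sketch: 4108 below stands in for the bitbake-worker PID from the ps -ef output above, and the block as written inspects its own shell ($$) so it can run anywhere.

```shell
# Sketch: inspect a hung process without strace, via /proc.
# Replace $$ with the bitbake-worker PID from ps -ef (4108 above).
PID=$$
grep '^State' /proc/$PID/status   # "S (sleeping)" usually means blocked on I/O
cat /proc/$PID/wchan; echo        # kernel function the process is sleeping in
ls -l /proc/$PID/fd               # open network sockets would hint at a stalled fetch
```

If the open fds show sockets that never progress, it is worth checking that the container actually has DNS/proxy access, since all of the stuck tasks are do_fetch.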
All the git repositories under the agl directory are on branch icefish except the following three; I am not sure whether it matters, but I am documenting it:
external/meta-iot-cloud
* (no branch)
external/meta-python2
* (no branch)
bsp/meta-arm
* (no branch)
There are no run.do_fetch logs in $T:
admin#623c5e680b76:/home/work/agl/build$ bitbake -e|grep ^T=
T="/home/work/agl/build/tmp/work/corei7-64-agl-linux/defaultpkgname/1.0-r0/temp"
/home/work/agl$ ls -l build/tmp/work/corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/*
lrwxrwxrwx 1 admin admin 30 Jun 28 19:42 build/tmp/work/corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/run.oecore_update_bblayers -> run.oecore_update_bblayers.369
-rw-r--r-- 1 admin admin 4565 Jun 28 19:42 build/tmp/work/corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/run.oecore_update_bblayers.369
-rw-rw-r-- 1 admin admin 4565 Jun 28 18:02 build/tmp/work/corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/run.oecore_update_bblayers.560
-rw-r--r-- 1 admin admin 4565 Jun 28 17:50 build/tmp/work/corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/run.oecore_update_bblayers.715
-rw-r--r-- 1 admin admin 4565 Jun 28 17:16 build/tmp/work/corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/run.oecore_update_bblayers.769
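(Note to anyone reproducing this: the $T printed by a bare `bitbake -e` belongs to defaultpkgname; each recipe gets its own temp directory, so the fetch log for a stuck task lives under that recipe's own T. A sketch, using quilt-native from task 1 above:)

```shell
bitbake -e quilt-native | grep '^T='
# then look for log.do_fetch* / run.do_fetch* under the directory it prints
```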
EDIT
There is no quilt directory in the work directory:
$ pwd
/home/work/agl/build/tmp/work
$ find .
.
./corei7-64-agl-linux
./corei7-64-agl-linux/defaultpkgname
./corei7-64-agl-linux/defaultpkgname/1.0-r0
./corei7-64-agl-linux/defaultpkgname/1.0-r0/temp
./corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/run.oecore_update_bblayers.560
./corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/run.oecore_update_bblayers.633
./corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/run.oecore_update_bblayers.369
./corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/run.oecore_update_bblayers.715
./corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/run.oecore_update_bblayers.769
./corei7-64-agl-linux/defaultpkgname/1.0-r0/temp/run.oecore_update_bblayers
EDIT
I could make the build start by basing my container on crops/poky-container. My container did not have the following:
- the user usersetup and sudoers.usersetup
- execution of /usr/bin/distro-entry.sh, which in turn runs /opt/poky/3.1/environment-setup-x86_64-pokysdk-linux
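For reference, the working setup can be reproduced roughly like this, following the crops/poky README (the host path is my layout; adjust to yours):

```shell
docker run --rm -it -v /home/work/agl:/workdir crops/poky --workdir=/workdir
```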

Related

Google Jib Core: Long time taken when a large subdirectory is used as a recursive layer

Environment:
Jib version: 0.22
Build tool: Gradle
OS: Linux
Description of the issue:
We are adding a large sub-directory (~234 MB) as a layer.
Jib seems to rebuild this layer locally even though it is already cached, and this operation takes around 25 seconds.
2022-09-29 00:28:25,133 sn:] - Jib Builder Log [DEBUG]: Building layer built sha256:a5ca6eb638e9710574d074247f77b640c0b1d86942c5e0fb1deaf3452325a7cb
This process takes about 25 seconds, and then Jib reports that the layer already exists on the target registry:
2022-09-29 00:28:25,456 sn:] - Jib Builder Log [INFO]: Skipping push; BLOB already exists on target registry : digest: sha256:a5ca6eb638e9710574d074247f77b640c0b1d86942c5e0fb1deaf3452325a7cb, size: 234630578
bash-4.2$ find /tmp/jib_image_cache/ -name a5ca6eb638e9710574d074247f77b640c0b1d86942c5e0fb1deaf3452325a7cb -exec ls -lrt {} \;
total 229132
-rw-r----- 1 tomcat tomcat 234630578 Sep 29 00:28 7161988df87ff3006f5803117549fb6d14109073edfcde7b2d0d549dc706ad94
bash-4.2$ find /tmp/jib_image_cache/ -name a5ca6eb638e9710574d074247f77b640c0b1d86942c5e0fb1deaf3452325a7cb -exec ls -lrtd {} \;
drwx------ 3 tomcat tomcat 90 Sep 29 06:56 /tmp/jib_image_cache/layers/a5ca6eb638e9710574d074247f77b640c0b1d86942c5e0fb1deaf3452325a7cb
Expected behavior:
Is there any way to speed up the process? There are 680 files in 38 sub-directories under the main directory.
Is it a good idea to add each sub-directory or file as its own layer?

bitbake fails with permission denied errors

When I try to build any recipe, I get this error consistently.
I have tried to build recipes from different layers and different sources, and they all fail the same way, which tells me there is some fundamental setup issue.
What am I missing?
I am building from an external source code path (externalsrc), not from Git or tarballs.
Yocto version: Honister.
Initialising tasks: 100% |###################################################################################################################################################################| Time: 0:00:00
Sstate summary: Wanted 9 Local 0 Network 0 Missed 9 Current 133 (0% match, 93% complete)
NOTE: Executing Tasks
ERROR: test-1.0-r0 do_deploy_source_date_epoch: PermissionError(13, 'Permission denied')
ERROR: Logfile of failure stored in: /home/honister/build/tmp/work/cortexa72-cortexa53-poky-linux/test/1.0-r0/temp/log.do_deploy_source_date_epoch.674904
Log data follows:
| DEBUG: Executing python function sstate_task_prefunc
| DEBUG: Python function sstate_task_prefunc finished
| DEBUG: Executing python function create_source_date_epoch_stamp
| DEBUG: No tarball or git repo found to determine SOURCE_DATE_EPOCH
| DEBUG: Using SOURCE_DATE_EPOCH_FALLBACK
| DEBUG: SOURCE_DATE_EPOCH: 1302044400
| DEBUG: Python function create_source_date_epoch_stamp finished
ERROR: test-1.0-r0 do_populate_lic: PermissionError(13, 'Permission denied')
ERROR: Task (/home/honister/yocto/meta-test/recipes-apps/test.bb:do_deploy_source_date_epoch) failed with exit code '1'
ERROR: Logfile of failure stored in: /home/honister/build/tmp/work/cortexa72-cortexa53-poky-linux/test/1.0-r0/temp/log.do_populate_lic.674905
Log data follows:
| DEBUG: Executing python function sstate_task_prefunc
| DEBUG: Python function sstate_task_prefunc finished
ERROR: Task (/home/honister/yocto/meta-test/recipes-apps/test.bb:do_populate_lic) failed with exit code '1'
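Not an answer as such, but since every recipe fails inside the sstate pre-functions, one thing worth ruling out is files in the build tree that are not owned by the building user (e.g. left behind by an earlier run as root or from inside a container). A sketch, with the build path taken from the logs above and the sstate-cache location assumed to be the default under the build directory:

```shell
find /home/honister/build/tmp /home/honister/build/sstate-cache \
    ! -user "$(id -un)" | head
```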

Getting an error while baking a Yocto Project image

I'm trying to build a core-image-minimal image for a Raspberry Pi 3 board on Ubuntu 20.04 in VirtualBox (~2 GB RAM, 100 GB storage allotted), but I'm facing an issue while baking the image. When I removed "tar.xz ext3" from IMAGE_FSTYPES = "tar.xz ext3 rpi-sdimg" in local.conf, the image baked successfully without any error, but I2C and UART were not supported. When I reverted that change, it throws the errors below.
ERROR:
core-image-minimal-1.0-r0 do_image_tar: Execution of '/home/pranav/poky/build/tmp/work/raspberrypi3-poky-linux-gnueabi/core-image-minimal/1.0-r0/temp/run.do_image_tar.2549' failed with exit code 1
Logfile of failure stored in: /home/pranav/poky/build/tmp/work/raspberrypi3-poky-linux-gnueabi/core-image-minimal/1.0-r0/temp/log.do_image_tar.2549
Please let me know what the solution is.
Complete error :
pranav@Pranav:~$ cd poky
pranav@Pranav:~/poky$ source oe-init-build-env
### Shell environment set up for builds. ###
You can now run 'bitbake <target>'
Common targets are:
core-image-minimal
core-image-sato
meta-toolchain
meta-ide-support
You can also run generated qemu images with a command like 'runqemu qemux86'
Other commonly useful commands are:
- 'devtool' and 'recipetool' handle common recipe tasks
- 'bitbake-layers' handles common layer tasks
- 'oe-pkgdata-util' handles common target package tasks
pranav@Pranav:~/poky/build$ bitbake core-image-minimal
Loading cache: 100% |###################################################################################################| Time: 0:00:02
Loaded 3295 entries from dependency cache.
NOTE: Resolving any missing task queue dependencies
Build Configuration:
BB_VERSION = "1.46.0"
BUILD_SYS = "x86_64-linux"
NATIVELSBSTRING = "universal"
TARGET_SYS = "arm-poky-linux-gnueabi"
MACHINE = "raspberrypi3"
DISTRO = "poky"
DISTRO_VERSION = "3.1.14"
TUNE_FEATURES = "arm vfp cortexa7 neon vfpv4 thumb callconvention-hard"
TARGET_FPU = "hard"
meta
meta-poky
meta-yocto-bsp = "dunfell:3d5dd4dd8d66650615a01cd210ff101daa60c0df"
meta-raspberrypi = "dunfell:934064a01903b2ba9a82be93b3f0efdb4543a0e8"
meta-oe
meta-multimedia
meta-networking
meta-python = "dunfell:ec978232732edbdd875ac367b5a9c04b881f2e19"
Initialising tasks: 100% |##############################################################################################| Time: 0:00:10
Sstate summary: Wanted 2 Found 0 Missed 2 Current 1135 (0% match, 99% complete)
NOTE: Executing Tasks
ERROR: core-image-minimal-1.0-r0 do_image_tar: Execution of '/home/pranav/poky/build/tmp/work/raspberrypi3-poky-linux-gnueabi/core-image-minimal/1.0-r0/temp/run.do_image_tar.2549' failed with exit code 1
ERROR: Logfile of failure stored in: /home/pranav/poky/build/tmp/work/raspberrypi3-poky-linux-gnueabi/core-image-minimal/1.0-r0/temp/log.do_image_tar.2549
Log data follows:
| DEBUG: Executing python function set_image_size
| DEBUG: 8013.200000 = 6164 * 1.300000
| DEBUG: 8192.000000 = max(8013.200000, 8192)[8192.000000] + 0
| DEBUG: 8192.000000 = int(8192.000000)
| DEBUG: 8192 = aligned(8192)
| DEBUG: returning 8192
| DEBUG: Python function set_image_size finished
| DEBUG: Executing shell function do_image_tar
| xz: Memory usage limit is too low for the given filter setup.
| xz: 1,250 MiB of memory is required. The limit is 954 MiB.
| WARNING: exit code 1 from a shell command.
| ERROR: Execution of '/home/pranav/poky/build/tmp/work/raspberrypi3-poky-linux-gnueabi/core-image-minimal/1.0-r0/temp/run.do_image_tar.2549' failed with exit code 1
ERROR: Task (/home/pranav/poky/meta/recipes-core/images/core-image-minimal.bb:do_image_tar) failed with exit code '1'
NOTE: Tasks Summary: Attempted 3053 tasks of which 3051 didn't need to be rerun and 1 failed.
Summary: 1 task failed:
/home/pranav/poky/meta/recipes-core/images/core-image-minimal.bb:do_image_tar
Summary: There was 1 ERROR message shown, returning a non-zero exit code.
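The log shows the actual failure: xz inside do_image_tar needs 1,250 MiB but is limited to 954 MiB (a limit derived from the VM's ~2 GB of RAM). Giving the VirtualBox VM more memory is one fix; alternatively the limit can be raised in local.conf. A sketch, not a definitive fix: XZ_MEMLIMIT and XZ_THREADS are the variable names in recent Poky releases (XZ_MEMLIMIT defaults to 50% of RAM), so verify they exist in your release before relying on them.

```
# local.conf
XZ_MEMLIMIT = "75%"
# or trade speed for memory by reducing compression threads:
XZ_THREADS = "1"
```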

"npm install" in my Ionic app gives an error that has almost no reference on Stack Overflow or GitHub. I have no clue what to do

npm WARN tarball tarball data for igniteui-cli@4.2.4
(sha512-jVspYOn9TmzxIbfkpw6TEVbGJeZXLWHWJfQcpd9qcRXpvrnwMqBnBrP7x/CX01NiXlpYey1xxa5PMn3Eu+OYDw==)
seems to be corrupted. Trying one more time.
npm WARN tarball tarball data for igniteui-cli@4.2.4
(sha512-jVspYOn9TmzxIbfkpw6TEVbGJeZXLWHWJfQcpd9qcRXpvrnwMqBnBrP7x/CX01NiXlpYey1xxa5PMn3Eu+OYDw==)
seems to be corrupted. Trying one more time.
I keep getting this warning as if it were in a loop.
Here is the debug log:
0 info it worked if it ends with ok
1 verbose cli [
1 verbose cli   'C:\Program Files\nodejs\node.exe',
1 verbose cli   'C:\Users\Rakesh\AppData\Roaming\npm\node_modules\npm\bin\npm-cli.js',
1 verbose cli   'start'
1 verbose cli ]
2 info using npm@6.14.2
3 info using node@v12.16.1
4 verbose run-script [ 'prestart', 'start', 'poststart' ]
5 info lifecycle privet@0.0.1~prestart: privet@0.0.1
6 info lifecycle privet@0.0.1~start: privet@0.0.1
7 verbose lifecycle privet@0.0.1~start: unsafe-perm in lifecycle true
8 verbose lifecycle privet@0.0.1~start: PATH: C:\Users\Rakesh\AppData\Roaming\npm\node_modules\npm\node_modules\npm-lifecycle\node-gyp-bin;I:\Pri-Vet\PriVet_App\node_modules\.bin;C:\Program Files (x86)\Common Files\Oracle\Java\javapath;C:\ProgramData\Oracle\Java\javapath;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0;C:\Program Files\Git\cmd;C:\Program Files\Git\bin;C:\xampp\php;C:\Gradle\gradle-4.6\bin;C:\composer;C:\Program Files\dotnet;C:\Program Files\Microsoft SQL Server\130\Tools\Binn;C:\Program Files\Java\jdk1.8.0_40\bin;C:\Program Files\Microsoft VS Code\bin;C:\Program Files\MongoDB\Server\4.0\bin;;C:\WINDOWS\System32\OpenSSH;C:\Program Files\nodejs;C:\Users\Rakesh\.windows-build-tools\python27;C:\Program Files\nodejs\node_modules\npm\node_modules\npm-lifecycle\node-gyp-bin;C:\Users\Rakesh\AppData\Roaming\npm\node_modules\windows-build-tools\node_modules\.bin;C:\Users\Rakesh\AppData\Roaming\npm\node_modules\.bin;C:\Program Files (x86)\Common Files\Oracle\Java\javapath;C:\ProgramData\Oracle\Java\javapath;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0;C:\Program Files\Git\cmd;C:\Program Files\Git\bin;C:\xampp\php;C:\Gradle\gradle-4.6\bin;C:\composer;C:\Program Files\dotnet;C:\Program Files\Microsoft SQL Server\130\Tools\Binn;C:\Program Files\Java\jdk1.8.0_40\bin;C:\Program Files\Microsoft VS Code\bin;C:\Program Files\nodejs;C:\Program Files\MongoDB\Server\4.0\bin;C:\Users\Rakesh\AppData\Roaming\Composer\vendor\bin;C:\Users\Rakesh\AppData\Local\Microsoft\WindowsApps;C:\Users\Rakesh\.dotnet\tools;C:\Users\Rakesh\AppData\Local\Microsoft\WindowsApps;C:\Users\Rakesh\AppData\Roaming\npm;C:\Users\Rakesh\App;C:\Users\Rakesh\AppData\Local\Programs\Microsoft VS Code\bin
9 verbose lifecycle privet@0.0.1~start: CWD: I:\Pri-Vet\PriVet_App
10 silly lifecycle privet@0.0.1~start: Args: [ '/d /s /c', 'ionic serve' ]
11 silly lifecycle privet@0.0.1~start: Returned: code: 1  signal: null
12 info lifecycle privet@0.0.1~start: Failed to exec start script
13 verbose stack Error: privet@0.0.1 start: `ionic serve`
13 verbose stack Exit status 1
13 verbose stack     at EventEmitter.<anonymous> (C:\Users\Rakesh\AppData\Roaming\npm\node_modules\npm\node_modules\npm-lifecycle\index.js:332:16)
13 verbose stack     at EventEmitter.emit (events.js:311:20)
13 verbose stack     at ChildProcess.<anonymous> (C:\Users\Rakesh\AppData\Roaming\npm\node_modules\npm\node_modules\npm-lifecycle\lib\spawn.js:55:14)
13 verbose stack     at ChildProcess.emit (events.js:311:20)
13 verbose stack     at maybeClose (internal/child_process.js:1021:16)
13 verbose stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:286:5)
14 verbose pkgid privet@0.0.1
15 verbose cwd I:\Pri-Vet\PriVet_App
16 verbose Windows_NT 10.0.17134
17 verbose argv "C:\Program Files\nodejs\node.exe" "C:\Users\Rakesh\AppData\Roaming\npm\node_modules\npm\bin\npm-cli.js" "start"
18 verbose node v12.16.1
19 verbose npm v6.14.2
20 error code ELIFECYCLE
21 error errno 1
22 error privet@0.0.1 start: `ionic serve`
22 error Exit status 1
23 error Failed at the privet@0.0.1 start script.
23 error This is probably not a problem with npm. There is likely additional logging output above.
24 verbose exit [ 1, true ]

Trying to create a sample Linux image with the Yocto Project, but getting a build error

I tried to create a Linux image by following the Yocto Project Mega Manual, but I got an error at the image-building step.
I'm using Ubuntu 18.04.1 LTS.
Error:
aju@aju-HP-15-Notebook-PC:~/poky/build$ bitbake core-image-sato
WARNING: Host distribution "Ubuntu-18.04" has not been validated with this version of the build system; you may possibly experience
unexpected failures. It is recommended that you use a tested
distribution.
Parsing recipes: 100% |#########################################| Time: 00:00:49
Parsing of 899 .bb files complete (0 cached, 899 parsed). 1330 targets, 38 skipped, 0 masked, 0 errors.
NOTE: Resolving any missing task queue dependencies
Build Configuration:
BB_VERSION        = "1.28.0"
BUILD_SYS         = "x86_64-linux"
NATIVELSBSTRING   = "Ubuntu-18.04"
TARGET_SYS        = "i586-poky-linux"
MACHINE           = "qemux86"
DISTRO            = "poky"
DISTRO_VERSION    = "2.0.3"
TUNE_FEATURES     = "m32 i586"
TARGET_FPU        = ""
meta
meta-yocto
meta-yocto-bsp    = "jethro:331275422b2c3f326f605c23ae89eedb4e222eb5"
NOTE: Preparing RunQueue
NOTE: Executing SetScene Tasks
NOTE: Executing RunQueue Tasks
ERROR: oe_runmake failed
ERROR: Function failed: do_compile (log file is located at /home/aju/poky/build/tmp/work/x86_64-linux/automake-native/1.15-r0/temp/log.do_compile.301)
ERROR: Logfile of failure stored in: /home/aju/poky/build/tmp/work/x86_64-linux/automake-native/1.15-r0/temp/log.do_compile.301
Log data follows:
| DEBUG: Executing shell function do_compile
| NOTE: make -j 4
| : && /bin/mkdir -p doc && { PATH='/home/aju/poky/build/tmp/work/x86_64-linux/automake-native/1.15-r0/build/t/wrap:'$PATH && export PATH; } && /usr/bin/perl /home/aju/poky/build/tmp/work/x86_64-linux/automake-native/1.15-r0/automake-1.15/doc/help2man --output=doc/automake-1.15.1 automake-1.15
| help2man: can't get `--help' info from automake-1.15
| Try `--no-discard-stderr' if option outputs to stderr
| Makefile:3687: recipe for target 'doc/automake-1.15.1' failed
| make: *** [doc/automake-1.15.1] Error 255
| WARNING: exit code 1 from a shell command.
| ERROR: oe_runmake failed
| ERROR: Function failed: do_compile (log file is located at /home/aju/poky/build/tmp/work/x86_64-linux/automake-native/1.15-r0/temp/log.do_compile.301)
ERROR: Task 403 (virtual:native:/home/aju/poky/meta/recipes-devtools/automake/automake_1.15.bb, do_compile) failed with exit code '1'
NOTE: Tasks Summary: Attempted 73 tasks of which 53 didn't need to be rerun and 1 failed.
Waiting for 0 running tasks to finish:
Summary: 1 task failed:
  virtual:native:/home/aju/poky/meta/recipes-devtools/automake/automake_1.15.bb, do_compile
Summary: There was 1 WARNING message shown.
Summary: There were 2 ERROR messages shown, returning a non-zero exit code.
Is this a problem with the version I am using, or is it something else?
Why are you using such an old release of Yocto? 2.0.x was first released in 2015 and isn't supported on modern distributions. If you need to use 2.0.x then you can pick a patch from a recent release to fix autoconf, but I really do recommend using 2.5 (or 2.6, due to release any day now) instead.
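Switching the poky checkout to a supported release as suggested is just a branch change; roughly like this (a sketch, assuming 2.5's branch name sumo and an origin remote pointing at the upstream poky repository):

```shell
cd ~/poky
git fetch origin
git checkout -b sumo origin/sumo
bitbake core-image-sato
```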