public inbox for isar-users@googlegroups.com
From: Anton Mikanovich <amikan@ilbers.de>
To: isar-users@googlegroups.com
Cc: Anton Mikanovich <amikan@ilbers.de>
Subject: [PATCH v8 14/20] meta: align with OE-core libraries update
Date: Wed, 25 Jan 2023 21:23:31 +0200
Message-ID: <20230125192337.86869-15-amikan@ilbers.de>
In-Reply-To: <20230125192337.86869-1-amikan@ilbers.de>

Update the imported meta/lib/oe libraries to match openembedded-core
v4.0.5, commit fbdf93f43ff4b876487e1f26752598ec8abcb46e.
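
For illustration, a minimal sketch of some of the reworked helpers as
called from a bitbake/Isar Python context (e.g. a devshell or a python
task); the key ID, signature path and directories are placeholders:

    import oe.path
    import oe.utils

    # True only when every tested path lives under the given parent
    oe.path.is_path_parent('/build/tmp', '/build/tmp/work/foo')

    # CPU count from scheduler affinity, now clamped to an upper bound
    n_threads = oe.utils.cpu_count(at_least=1, at_most=64)

    # verify() now takes an optional space-separated list of accepted
    # key IDs and only returns True when a GOODSIG matches one of them:
    # import oe.gpg_sign
    # signer = oe.gpg_sign.get_signer(d, 'local')
    # ok = signer.verify('repo/Release.gpg', valid_sigs='ABCDEF0123456789')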

Signed-off-by: Anton Mikanovich <amikan@ilbers.de>
---
 meta/lib/oe/gpg_sign.py  |  35 +++++++++--
 meta/lib/oe/maketype.py  |   7 +--
 meta/lib/oe/patch.py     |  52 +++++++++++-----
 meta/lib/oe/path.py      |  79 ++++++++++++++++++++++++
 meta/lib/oe/sstatesig.py | 129 ++++++++++++++++++++++++++++-----------
 meta/lib/oe/terminal.py  |  33 +++++++---
 meta/lib/oe/utils.py     |  67 ++++++++++++--------
 7 files changed, 304 insertions(+), 98 deletions(-)

diff --git a/meta/lib/oe/gpg_sign.py b/meta/lib/oe/gpg_sign.py
index 492f096..6e35f3b 100644
--- a/meta/lib/oe/gpg_sign.py
+++ b/meta/lib/oe/gpg_sign.py
@@ -1,4 +1,6 @@
 #
+# Imported from openembedded-core
+#
 # SPDX-License-Identifier: GPL-2.0-only
 #
 
@@ -58,7 +60,7 @@ class LocalSigner(object):
         for i in range(0, len(files), sign_chunk):
             subprocess.check_output(shlex.split(cmd + ' '.join(files[i:i+sign_chunk])), stderr=subprocess.STDOUT)
 
-    def detach_sign(self, input_file, keyid, passphrase_file, passphrase=None, armor=True):
+    def detach_sign(self, input_file, keyid, passphrase_file, passphrase=None, armor=True, output_suffix=None, use_sha256=False):
         """Create a detached signature of a file"""
 
         if passphrase_file and passphrase:
@@ -71,6 +73,10 @@ class LocalSigner(object):
             cmd += ['--homedir', self.gpg_path]
         if armor:
             cmd += ['--armor']
+        if output_suffix:
+            cmd += ['-o', input_file + "." + output_suffix]
+        if use_sha256:
+            cmd += ['--digest-algo', "SHA256"]
 
         #gpg > 2.1 supports password pipes only through the loopback interface
         #gpg < 2.1 errors out if given unknown parameters
@@ -109,16 +115,33 @@ class LocalSigner(object):
             bb.fatal("Could not get gpg version: %s" % e)
 
 
-    def verify(self, sig_file):
+    def verify(self, sig_file, valid_sigs = ''):
         """Verify signature"""
-        cmd = self.gpg_cmd + ["--verify", "--no-permission-warning"]
+        cmd = self.gpg_cmd + ["--verify", "--no-permission-warning", "--status-fd", "1"]
         if self.gpg_path:
             cmd += ["--homedir", self.gpg_path]
 
         cmd += [sig_file]
-        status = subprocess.call(cmd)
-        ret = False if status else True
-        return ret
+        status = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+        # Valid if any key matches if unspecified
+        if not valid_sigs:
+            ret = False if status.returncode else True
+            return ret
+
+        import re
+        goodsigs = []
+        sigre = re.compile(r'^\[GNUPG:\] GOODSIG (\S+)\s(.*)$')
+        for l in status.stdout.decode("utf-8").splitlines():
+            s = sigre.match(l)
+            if s:
+                goodsigs += [s.group(1)]
+
+        for sig in valid_sigs.split():
+            if sig in goodsigs:
+                return True
+        if len(goodsigs):
+            bb.warn('No accepted signatures found. Good signatures found: %s.' % ' '.join(goodsigs))
+        return False
 
 
 def get_signer(d, backend):
diff --git a/meta/lib/oe/maketype.py b/meta/lib/oe/maketype.py
index 969a22b..a9a1dd7 100644
--- a/meta/lib/oe/maketype.py
+++ b/meta/lib/oe/maketype.py
@@ -12,12 +12,7 @@ the arguments of the type's factory for details.
 
 import inspect
 import oe.types as types
-try:
-    # Python 3.7+
-    from collections.abc import Callable
-except ImportError:
-    # Python < 3.7
-    from collections import Callable
+from collections.abc import Callable
 
 available_types = {}
 
diff --git a/meta/lib/oe/patch.py b/meta/lib/oe/patch.py
index 48c7141..f6cd934 100644
--- a/meta/lib/oe/patch.py
+++ b/meta/lib/oe/patch.py
@@ -6,6 +6,7 @@
 
 import oe.path
 import oe.types
+import subprocess
 
 class NotFoundError(bb.BBHandledException):
     def __init__(self, path):
@@ -27,7 +28,6 @@ class CmdError(bb.BBHandledException):
 
 def runcmd(args, dir = None):
     import pipes
-    import subprocess
 
     if dir:
         olddir = os.path.abspath(os.curdir)
@@ -40,20 +40,25 @@ def runcmd(args, dir = None):
         args = [ pipes.quote(str(arg)) for arg in args ]
         cmd = " ".join(args)
         # print("cmd: %s" % cmd)
-        (exitstatus, output) = subprocess.getstatusoutput(cmd)
+        proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+        stdout, stderr = proc.communicate()
+        stdout = stdout.decode('utf-8')
+        stderr = stderr.decode('utf-8')
+        exitstatus = proc.returncode
         if exitstatus != 0:
-            raise CmdError(cmd, exitstatus >> 8, output)
-        if " fuzz " in output:
+            raise CmdError(cmd, exitstatus >> 8, "stdout: %s\nstderr: %s" % (stdout, stderr))
+        if " fuzz " in stdout and "Hunk " in stdout:
             # Drop patch fuzz info with header and footer to log file so
             # insane.bbclass can handle to throw error/warning
-            bb.note("--- Patch fuzz start ---\n%s\n--- Patch fuzz end ---" % format(output))
+            bb.note("--- Patch fuzz start ---\n%s\n--- Patch fuzz end ---" % format(stdout))
 
-        return output
+        return stdout
 
     finally:
         if dir:
             os.chdir(olddir)
 
+
 class PatchError(Exception):
     def __init__(self, msg):
         self.msg = msg
@@ -296,6 +301,24 @@ class GitApplyTree(PatchTree):
         PatchTree.__init__(self, dir, d)
         self.commituser = d.getVar('PATCH_GIT_USER_NAME')
         self.commitemail = d.getVar('PATCH_GIT_USER_EMAIL')
+        if not self._isInitialized(d):
+            self._initRepo()
+
+    def _isInitialized(self, d):
+        cmd = "git rev-parse --show-toplevel"
+        try:
+            output = runcmd(cmd.split(), self.dir).strip()
+        except CmdError as err:
+            ## runcmd returned non-zero which most likely means 128
+            ## Not a git directory
+            return False
+        ## Make sure repo is in builddir to not break top-level git repos, or under workdir
+        return os.path.samefile(output, self.dir) or oe.path.is_path_parent(d.getVar('WORKDIR'), output)
+
+    def _initRepo(self):
+        runcmd("git init".split(), self.dir)
+        runcmd("git add .".split(), self.dir)
+        runcmd("git commit -a --allow-empty -m bitbake_patching_started".split(), self.dir)
 
     @staticmethod
     def extractPatchHeader(patchfile):
@@ -418,7 +441,7 @@ class GitApplyTree(PatchTree):
                     date = newdate
                 if not subject:
                     subject = newsubject
-        if subject and outlines and not outlines[0].strip() == subject:
+        if subject and not (outlines and outlines[0].strip() == subject):
             outlines.insert(0, '%s\n\n' % subject.strip())
 
         # Write out commit message to a file
@@ -441,7 +464,6 @@ class GitApplyTree(PatchTree):
     def extractPatches(tree, startcommit, outdir, paths=None):
         import tempfile
         import shutil
-        import re
         tempdir = tempfile.mkdtemp(prefix='oepatch')
         try:
             shellcmd = ["git", "format-patch", "--no-signature", "--no-numbered", startcommit, "-o", tempdir]
@@ -457,13 +479,10 @@ class GitApplyTree(PatchTree):
                         try:
                             with open(srcfile, 'r', encoding=encoding) as f:
                                 for line in f:
-                                    checkline = line
-                                    if checkline.startswith('Subject: '):
-                                        checkline = re.sub(r'\[.+?\]\s*', '', checkline[9:])
-                                    if checkline.startswith(GitApplyTree.patch_line_prefix):
+                                    if line.startswith(GitApplyTree.patch_line_prefix):
                                         outfile = line.split()[-1].strip()
                                         continue
-                                    if checkline.startswith(GitApplyTree.ignore_commit_prefix):
+                                    if line.startswith(GitApplyTree.ignore_commit_prefix):
                                         continue
                                     patchlines.append(line)
                         except UnicodeDecodeError:
@@ -510,8 +529,7 @@ class GitApplyTree(PatchTree):
         with open(commithook, 'w') as f:
             # NOTE: the formatting here is significant; if you change it you'll also need to
             # change other places which read it back
-            f.write('echo >> $1\n')
-            f.write('echo "%s: $PATCHFILE" >> $1\n' % GitApplyTree.patch_line_prefix)
+            f.write('echo "\n%s: $PATCHFILE" >> $1' % GitApplyTree.patch_line_prefix)
         os.chmod(commithook, 0o755)
         shutil.copy2(commithook, applyhook)
         try:
@@ -519,7 +537,7 @@ class GitApplyTree(PatchTree):
             try:
                 shellcmd = [patchfilevar, "git", "--work-tree=%s" % reporoot]
                 self.gitCommandUserOptions(shellcmd, self.commituser, self.commitemail)
-                shellcmd += ["am", "-3", "--keep-cr", "-p%s" % patch['strippath']]
+                shellcmd += ["am", "-3", "--keep-cr", "--no-scissors", "-p%s" % patch['strippath']]
                 return _applypatchhelper(shellcmd, patch, force, reverse, run)
             except CmdError:
                 # Need to abort the git am, or we'll still be within it at the end
@@ -582,6 +600,8 @@ class QuiltTree(PatchSet):
 
     def Clean(self):
         try:
+            # make sure that patches/series file exists before quilt pop to keep quilt-0.67 happy
+            open(os.path.join(self.dir, "patches","series"), 'a').close()
             self._runcmd(["pop", "-a", "-f"])
             oe.path.remove(os.path.join(self.dir, "patches","series"))
         except Exception:
diff --git a/meta/lib/oe/path.py b/meta/lib/oe/path.py
index 3506e2c..348feeb 100644
--- a/meta/lib/oe/path.py
+++ b/meta/lib/oe/path.py
@@ -1,4 +1,6 @@
 #
+# Imported from openembedded-core
+#
 # SPDX-License-Identifier: GPL-2.0-only
 #
 
@@ -264,3 +266,80 @@ def realpath(file, root, use_physdir = True, loop_cnt = 100, assume_dir = False)
         raise
 
     return file
+
+def is_path_parent(possible_parent, *paths):
+    """
+    Return True if a path is the parent of another, False otherwise.
+    Multiple paths to test can be specified in which case all
+    specified test paths must be under the parent in order to
+    return True.
+    """
+    def abs_path_trailing(pth):
+        pth_abs = os.path.abspath(pth)
+        if not pth_abs.endswith(os.sep):
+            pth_abs += os.sep
+        return pth_abs
+
+    possible_parent_abs = abs_path_trailing(possible_parent)
+    if not paths:
+        return False
+    for path in paths:
+        path_abs = abs_path_trailing(path)
+        if not path_abs.startswith(possible_parent_abs):
+            return False
+    return True
+
+def which_wild(pathname, path=None, mode=os.F_OK, *, reverse=False, candidates=False):
+    """Search a search path for pathname, supporting wildcards.
+
+    Return all paths in the specific search path matching the wildcard pattern
+    in pathname, returning only the first encountered for each file. If
+    candidates is True, information on all potential candidate paths are
+    included.
+    """
+    paths = (path or os.environ.get('PATH', os.defpath)).split(':')
+    if reverse:
+        paths.reverse()
+
+    seen, files = set(), []
+    for index, element in enumerate(paths):
+        if not os.path.isabs(element):
+            element = os.path.abspath(element)
+
+        candidate = os.path.join(element, pathname)
+        globbed = glob.glob(candidate)
+        if globbed:
+            for found_path in sorted(globbed):
+                if not os.access(found_path, mode):
+                    continue
+                rel = os.path.relpath(found_path, element)
+                if rel not in seen:
+                    seen.add(rel)
+                    if candidates:
+                        files.append((found_path, [os.path.join(p, rel) for p in paths[:index+1]]))
+                    else:
+                        files.append(found_path)
+
+    return files
+
+def canonicalize(paths, sep=','):
+    """Given a string with paths (separated by commas by default), expand
+    each path using os.path.realpath() and return the resulting paths as a
+    string (separated using the same separator a the original string).
+    """
+    # Ignore paths containing "$" as they are assumed to be unexpanded bitbake
+    # variables. Normally they would be ignored, e.g., when passing the paths
+    # through the shell they would expand to empty strings. However, when they
+    # are passed through os.path.realpath(), it will cause them to be prefixed
+    # with the absolute path to the current directory and thus not be empty
+    # anymore.
+    #
+    # Also maintain trailing slashes, as the paths may actually be used as
+    # prefixes in sting compares later on, where the slashes then are important.
+    canonical_paths = []
+    for path in (paths or '').split(sep):
+        if '$' not in path:
+            trailing_slash = path.endswith('/') and '/' or ''
+            canonical_paths.append(os.path.realpath(path) + trailing_slash)
+
+    return sep.join(canonical_paths)
diff --git a/meta/lib/oe/sstatesig.py b/meta/lib/oe/sstatesig.py
index 71a74fb..acd47a0 100644
--- a/meta/lib/oe/sstatesig.py
+++ b/meta/lib/oe/sstatesig.py
@@ -1,4 +1,6 @@
 #
+# Imported from openembedded-core
+#
 # SPDX-License-Identifier: GPL-2.0-only
 #
 import bb.siggen
@@ -24,10 +26,19 @@ def sstate_rundepfilter(siggen, fn, recipename, task, dep, depname, dataCaches):
         return "/allarch.bbclass" in inherits
     def isImage(mc, fn):
         return "/image.bbclass" in " ".join(dataCaches[mc].inherits[fn])
+    def isSPDXTask(task):
+        return task in ("do_create_spdx", "do_create_runtime_spdx")
 
     depmc, _, deptaskname, depmcfn = bb.runqueue.split_tid_mcfn(dep)
     mc, _ = bb.runqueue.split_mc(fn)
 
+    # Keep all dependencies between SPDX tasks in the signature. SPDX documents
+    # are linked together by hashes, which means if a dependent document changes,
+    # all downstream documents must be re-written (even if they are "safe"
+    # dependencies).
+    if isSPDXTask(task) and isSPDXTask(deptaskname):
+        return True
+
     # (Almost) always include our own inter-task dependencies (unless it comes
     # from a mcdepends). The exception is the special
     # do_kernel_configme->do_unpack_and_patch dependency from archiver.bbclass.
@@ -108,7 +119,6 @@ class SignatureGeneratorOEBasicHashMixIn(object):
         self.unlockedrecipes = (data.getVar("SIGGEN_UNLOCKED_RECIPES") or
                                 "").split()
         self.unlockedrecipes = { k: "" for k in self.unlockedrecipes }
-        self.buildarch = data.getVar('BUILD_ARCH')
         self._internal = False
         pass
 
@@ -147,13 +157,6 @@ class SignatureGeneratorOEBasicHashMixIn(object):
         self.dump_lockedsigs(sigfile)
         return super(bb.siggen.SignatureGeneratorBasicHash, self).dump_sigs(dataCache, options)
 
-    def prep_taskhash(self, tid, deps, dataCaches):
-        super().prep_taskhash(tid, deps, dataCaches)
-        if hasattr(self, "extramethod"):
-            (mc, _, _, fn) = bb.runqueue.split_tid_mcfn(tid)
-            inherits = " ".join(dataCaches[mc].inherits[fn])
-            if inherits.find("/native.bbclass") != -1 or inherits.find("/cross.bbclass") != -1:
-                self.extramethod[tid] = ":" + self.buildarch
 
     def get_taskhash(self, tid, deps, dataCaches):
         if tid in self.lockedhashes:
@@ -246,15 +249,26 @@ class SignatureGeneratorOEBasicHashMixIn(object):
                         continue
                     f.write("    " + self.lockedpnmap[fn] + ":" + task + ":" + self.get_unihash(tid) + " \\\n")
                 f.write('    "\n')
-            f.write('SIGGEN_LOCKEDSIGS_TYPES_%s = "%s"' % (self.machine, " ".join(l)))
+            f.write('SIGGEN_LOCKEDSIGS_TYPES:%s = "%s"' % (self.machine, " ".join(l)))
+
+    def dump_siglist(self, sigfile, path_prefix_strip=None):
+        def strip_fn(fn):
+            nonlocal path_prefix_strip
+            if not path_prefix_strip:
+                return fn
+
+            fn_exp = fn.split(":")
+            if fn_exp[-1].startswith(path_prefix_strip):
+                fn_exp[-1] = fn_exp[-1][len(path_prefix_strip):]
+
+            return ":".join(fn_exp)
 
-    def dump_siglist(self, sigfile):
         with open(sigfile, "w") as f:
             tasks = []
             for taskitem in self.taskhash:
                 (fn, task) = taskitem.rsplit(":", 1)
                 pn = self.lockedpnmap[fn]
-                tasks.append((pn, task, fn, self.taskhash[taskitem]))
+                tasks.append((pn, task, strip_fn(fn), self.taskhash[taskitem]))
             for (pn, task, fn, taskhash) in sorted(tasks):
                 f.write('%s:%s %s %s\n' % (pn, task, fn, taskhash))
 
@@ -379,13 +393,13 @@ def find_siginfo(pn, taskname, taskhashlist, d):
             localdata.setVar('PV', '*')
             localdata.setVar('PR', '*')
             localdata.setVar('BB_TASKHASH', hashval)
+            localdata.setVar('SSTATE_CURRTASK', taskname[3:])
             swspec = localdata.getVar('SSTATE_SWSPEC')
             if taskname in ['do_fetch', 'do_unpack', 'do_patch', 'do_populate_lic', 'do_preconfigure'] and swspec:
                 localdata.setVar('SSTATE_PKGSPEC', '${SSTATE_SWSPEC}')
             elif pn.endswith('-native') or "-cross-" in pn or "-crosssdk-" in pn:
                 localdata.setVar('SSTATE_EXTRAPATH', "${NATIVELSBSTRING}/")
-            sstatename = taskname[3:]
-            filespec = '%s_%s.*.siginfo' % (localdata.getVar('SSTATE_PKG'), sstatename)
+            filespec = '%s.siginfo' % localdata.getVar('SSTATE_PKG')
 
             matchedfiles = glob.glob(filespec)
             for fullpath in matchedfiles:
@@ -440,7 +454,7 @@ def find_sstate_manifest(taskdata, taskdata2, taskname, d, multilibcache):
     elif "-cross-canadian" in taskdata:
         pkgarchs = ["${SDK_ARCH}_${SDK_ARCH}-${SDKPKGSUFFIX}"]
     elif "-cross-" in taskdata:
-        pkgarchs = ["${BUILD_ARCH}_${TARGET_ARCH}"]
+        pkgarchs = ["${BUILD_ARCH}"]
     elif "-crosssdk" in taskdata:
         pkgarchs = ["${BUILD_ARCH}_${SDK_ARCH}_${SDK_OS}"]
     else:
@@ -453,7 +467,7 @@ def find_sstate_manifest(taskdata, taskdata2, taskname, d, multilibcache):
         manifest = d2.expand("${SSTATE_MANIFESTS}/manifest-%s-%s.%s" % (pkgarch, taskdata, taskname))
         if os.path.exists(manifest):
             return manifest, d2
-    bb.error("Manifest %s not found in %s (variant '%s')?" % (manifest, d2.expand(" ".join(pkgarchs)), variant))
+    bb.fatal("Manifest %s not found in %s (variant '%s')?" % (manifest, d2.expand(" ".join(pkgarchs)), variant))
     return None, d2
 
 def OEOuthashBasic(path, sigfile, task, d):
@@ -467,6 +481,8 @@ def OEOuthashBasic(path, sigfile, task, d):
     import stat
     import pwd
     import grp
+    import re
+    import fnmatch
 
     def update_hash(s):
         s = s.encode('utf-8')
@@ -476,20 +492,37 @@ def OEOuthashBasic(path, sigfile, task, d):
 
     h = hashlib.sha256()
     prev_dir = os.getcwd()
+    corebase = d.getVar("COREBASE")
+    tmpdir = d.getVar("TMPDIR")
     include_owners = os.environ.get('PSEUDO_DISABLED') == '0'
     if "package_write_" in task or task == "package_qa":
         include_owners = False
     include_timestamps = False
+    include_root = True
     if task == "package":
-        include_timestamps = d.getVar('BUILD_REPRODUCIBLE_BINARIES') == '1'
-    extra_content = d.getVar('HASHEQUIV_HASH_VERSION')
+        include_timestamps = True
+        include_root = False
+    hash_version = d.getVar('HASHEQUIV_HASH_VERSION')
+    extra_sigdata = d.getVar("HASHEQUIV_EXTRA_SIGDATA")
+
+    filemaps = {}
+    for m in (d.getVar('SSTATE_HASHEQUIV_FILEMAP') or '').split():
+        entry = m.split(":")
+        if len(entry) != 3 or entry[0] != task:
+            continue
+        filemaps.setdefault(entry[1], [])
+        filemaps[entry[1]].append(entry[2])
 
     try:
         os.chdir(path)
+        basepath = os.path.normpath(path)
 
         update_hash("OEOuthashBasic\n")
-        if extra_content:
-            update_hash(extra_content + "\n")
+        if hash_version:
+            update_hash(hash_version + "\n")
+
+        if extra_sigdata:
+            update_hash(extra_sigdata + "\n")
 
         # It is only currently useful to get equivalent hashes for things that
         # can be restored from sstate. Since the sstate object is named using
@@ -534,21 +567,22 @@ def OEOuthashBasic(path, sigfile, task, d):
                 else:
                     add_perm(stat.S_IXUSR, 'x')
 
-                add_perm(stat.S_IRGRP, 'r')
-                add_perm(stat.S_IWGRP, 'w')
-                if stat.S_ISGID & s.st_mode:
-                    add_perm(stat.S_IXGRP, 's', 'S')
-                else:
-                    add_perm(stat.S_IXGRP, 'x')
+                if include_owners:
+                    # Group/other permissions are only relevant in pseudo context
+                    add_perm(stat.S_IRGRP, 'r')
+                    add_perm(stat.S_IWGRP, 'w')
+                    if stat.S_ISGID & s.st_mode:
+                        add_perm(stat.S_IXGRP, 's', 'S')
+                    else:
+                        add_perm(stat.S_IXGRP, 'x')
 
-                add_perm(stat.S_IROTH, 'r')
-                add_perm(stat.S_IWOTH, 'w')
-                if stat.S_ISVTX & s.st_mode:
-                    update_hash('t')
-                else:
-                    add_perm(stat.S_IXOTH, 'x')
+                    add_perm(stat.S_IROTH, 'r')
+                    add_perm(stat.S_IWOTH, 'w')
+                    if stat.S_ISVTX & s.st_mode:
+                        update_hash('t')
+                    else:
+                        add_perm(stat.S_IXOTH, 'x')
 
-                if include_owners:
                     try:
                         update_hash(" %10s" % pwd.getpwuid(s.st_uid).pw_name)
                         update_hash(" %10s" % grp.getgrgid(s.st_gid).gr_name)
@@ -567,8 +601,13 @@ def OEOuthashBasic(path, sigfile, task, d):
                 else:
                     update_hash(" " * 9)
 
+                filterfile = False
+                for entry in filemaps:
+                    if fnmatch.fnmatch(path, entry):
+                        filterfile = True
+
                 update_hash(" ")
-                if stat.S_ISREG(s.st_mode):
+                if stat.S_ISREG(s.st_mode) and not filterfile:
                     update_hash("%10d" % s.st_size)
                 else:
                     update_hash(" " * 10)
@@ -577,9 +616,24 @@ def OEOuthashBasic(path, sigfile, task, d):
                 fh = hashlib.sha256()
                 if stat.S_ISREG(s.st_mode):
                     # Hash file contents
-                    with open(path, 'rb') as d:
-                        for chunk in iter(lambda: d.read(4096), b""):
+                    if filterfile:
+                        # Need to ignore paths in crossscripts and postinst-useradd files.
+                        with open(path, 'rb') as d:
+                            chunk = d.read()
+                            chunk = chunk.replace(bytes(basepath, encoding='utf8'), b'')
+                            for entry in filemaps:
+                                if not fnmatch.fnmatch(path, entry):
+                                    continue
+                                for r in filemaps[entry]:
+                                    if r.startswith("regex-"):
+                                        chunk = re.sub(bytes(r[6:], encoding='utf8'), b'', chunk)
+                                    else:
+                                        chunk = chunk.replace(bytes(r, encoding='utf8'), b'')
                             fh.update(chunk)
+                    else:
+                        with open(path, 'rb') as d:
+                            for chunk in iter(lambda: d.read(4096), b""):
+                                fh.update(chunk)
                     update_hash(fh.hexdigest())
                 else:
                     update_hash(" " * len(fh.hexdigest()))
@@ -592,7 +646,8 @@ def OEOuthashBasic(path, sigfile, task, d):
                 update_hash("\n")
 
             # Process this directory and all its child files
-            process(root)
+            if include_root or root != ".":
+                process(root)
             for f in files:
                 if f == 'fixmepath':
                     continue
@@ -601,3 +656,5 @@ def OEOuthashBasic(path, sigfile, task, d):
         os.chdir(prev_dir)
 
     return h.hexdigest()
+
+
diff --git a/meta/lib/oe/terminal.py b/meta/lib/oe/terminal.py
index 8705804..8a3d84d 100644
--- a/meta/lib/oe/terminal.py
+++ b/meta/lib/oe/terminal.py
@@ -7,7 +7,6 @@ import logging
 import oe.classutils
 import shlex
 from bb.process import Popen, ExecutionError
-from distutils.version import LooseVersion
 
 logger = logging.getLogger('BitBake.OE.Terminal')
 
@@ -33,9 +32,10 @@ class Registry(oe.classutils.ClassRegistry):
 
 class Terminal(Popen, metaclass=Registry):
     def __init__(self, sh_cmd, title=None, env=None, d=None):
+        from subprocess import STDOUT
         fmt_sh_cmd = self.format_command(sh_cmd, title)
         try:
-            Popen.__init__(self, fmt_sh_cmd, env=env)
+            Popen.__init__(self, fmt_sh_cmd, env=env, stderr=STDOUT)
         except OSError as exc:
             import errno
             if exc.errno == errno.ENOENT:
@@ -88,10 +88,10 @@ class Konsole(XTerminal):
     def __init__(self, sh_cmd, title=None, env=None, d=None):
         # Check version
         vernum = check_terminal_version("konsole")
-        if vernum and LooseVersion(vernum) < '2.0.0':
+        if vernum and bb.utils.vercmp_string_op(vernum, "2.0.0", "<"):
             # Konsole from KDE 3.x
             self.command = 'konsole -T "{title}" -e {command}'
-        elif vernum and LooseVersion(vernum) < '16.08.1':
+        elif vernum and bb.utils.vercmp_string_op(vernum, "16.08.1", "<"):
             # Konsole pre 16.08.01 Has nofork
             self.command = 'konsole --nofork --workdir . -p tabtitle="{title}" -e {command}'
         XTerminal.__init__(self, sh_cmd, title, env, d)
@@ -165,7 +165,12 @@ class Tmux(Terminal):
         # devshells, if it's already there, add a new window to it.
         window_name = 'devshell-%i' % os.getpid()
 
-        self.command = 'tmux new -c "{{cwd}}" -d -s {0} -n {0} "{{command}}"'.format(window_name)
+        self.command = 'tmux new -c "{{cwd}}" -d -s {0} -n {0} "{{command}}"'
+        if not check_tmux_version('1.9'):
+            # `tmux new-session -c` was added in 1.9;
+            # older versions fail with that flag
+            self.command = 'tmux new -d -s {0} -n {0} "{{command}}"'
+        self.command = self.command.format(window_name)
         Terminal.__init__(self, sh_cmd, title, env, d)
 
         attach_cmd = 'tmux att -t {0}'.format(window_name)
@@ -187,7 +192,7 @@ class Custom(Terminal):
             Terminal.__init__(self, sh_cmd, title, env, d)
             logger.warning('Custom terminal was started.')
         else:
-            logger.debug(1, 'No custom terminal (OE_TERMINAL_CUSTOMCMD) set')
+            logger.debug('No custom terminal (OE_TERMINAL_CUSTOMCMD) set')
             raise UnsupportedTerminal('OE_TERMINAL_CUSTOMCMD not set')
 
 
@@ -209,13 +214,16 @@ def spawn_preferred(sh_cmd, title=None, env=None, d=None):
             spawn(terminal.name, sh_cmd, title, env, d)
             break
         except UnsupportedTerminal:
-            continue
+            pass
+        except:
+            bb.warn("Terminal %s is supported but did not start" % (terminal.name))
+    # when we've run out of options
     else:
         raise NoSupportedTerminals(get_cmd_list())
 
 def spawn(name, sh_cmd, title=None, env=None, d=None):
     """Spawn the specified terminal, by name"""
-    logger.debug(1, 'Attempting to spawn terminal "%s"', name)
+    logger.debug('Attempting to spawn terminal "%s"', name)
     try:
         terminal = Registry.registry[name]
     except KeyError:
@@ -252,13 +260,18 @@ def spawn(name, sh_cmd, title=None, env=None, d=None):
         except OSError:
            return
 
+def check_tmux_version(desired):
+    vernum = check_terminal_version("tmux")
+    if vernum and bb.utils.vercmp_string_op(vernum, desired, "<"):
+        return False
+    return vernum
+
 def check_tmux_pane_size(tmux):
     import subprocess as sub
     # On older tmux versions (<1.9), return false. The reason
     # is that there is no easy way to get the height of the active panel
     # on current window without nested formats (available from version 1.9)
-    vernum = check_terminal_version("tmux")
-    if vernum and LooseVersion(vernum) < '1.9':
+    if not check_tmux_version('1.9'):
         return False
     try:
         p = sub.Popen('%s list-panes -F "#{?pane_active,#{pane_height},}"' % tmux,
diff --git a/meta/lib/oe/utils.py b/meta/lib/oe/utils.py
index a84039f..9455aad 100644
--- a/meta/lib/oe/utils.py
+++ b/meta/lib/oe/utils.py
@@ -1,4 +1,6 @@
 #
+# Imported from openembedded-core
+#
 # SPDX-License-Identifier: GPL-2.0-only
 #
 
@@ -221,12 +223,12 @@ def packages_filter_out_system(d):
     PN-dbg PN-doc PN-locale-eb-gb removed.
     """
     pn = d.getVar('PN')
-    blacklist = [pn + suffix for suffix in ('', '-dbg', '-dev', '-doc', '-locale', '-staticdev', '-src')]
+    pkgfilter = [pn + suffix for suffix in ('', '-dbg', '-dev', '-doc', '-locale', '-staticdev', '-src')]
     localepkg = pn + "-locale-"
     pkgs = []
 
     for pkg in d.getVar('PACKAGES').split():
-        if pkg not in blacklist and localepkg not in pkg:
+        if pkg not in pkgfilter and localepkg not in pkg:
             pkgs.append(pkg)
     return pkgs
 
@@ -248,9 +250,9 @@ def trim_version(version, num_parts=2):
     trimmed = ".".join(parts[:num_parts])
     return trimmed
 
-def cpu_count(at_least=1):
+def cpu_count(at_least=1, at_most=64):
     cpus = len(os.sched_getaffinity(0))
-    return max(cpus, at_least)
+    return max(min(cpus, at_most), at_least)
 
 def execute_pre_post_process(d, cmds):
     if cmds is None:
@@ -344,7 +346,29 @@ def squashspaces(string):
     import re
     return re.sub(r"\s+", " ", string).strip()
 
-def format_pkg_list(pkg_dict, ret_format=None):
+def rprovides_map(pkgdata_dir, pkg_dict):
+    # Map file -> pkg provider
+    rprov_map = {}
+
+    for pkg in pkg_dict:
+        path_to_pkgfile = os.path.join(pkgdata_dir, 'runtime-reverse', pkg)
+        if not os.path.isfile(path_to_pkgfile):
+            continue
+        with open(path_to_pkgfile) as f:
+            for line in f:
+                if line.startswith('RPROVIDES') or line.startswith('FILERPROVIDES'):
+                    # List all components provided by pkg.
+                    # Exclude version strings, i.e. those starting with (
+                    provides = [x for x in line.split()[1:] if not x.startswith('(')]
+                    for prov in provides:
+                        if prov in rprov_map:
+                            rprov_map[prov].append(pkg)
+                        else:
+                            rprov_map[prov] = [pkg]
+
+    return rprov_map
+
+def format_pkg_list(pkg_dict, ret_format=None, pkgdata_dir=None):
     output = []
 
     if ret_format == "arch":
@@ -357,9 +381,15 @@ def format_pkg_list(pkg_dict, ret_format=None):
         for pkg in sorted(pkg_dict):
             output.append("%s %s %s" % (pkg, pkg_dict[pkg]["arch"], pkg_dict[pkg]["ver"]))
     elif ret_format == "deps":
+        rprov_map = rprovides_map(pkgdata_dir, pkg_dict)
         for pkg in sorted(pkg_dict):
             for dep in pkg_dict[pkg]["deps"]:
-                output.append("%s|%s" % (pkg, dep))
+                if dep in rprov_map:
+                    # There could be multiple providers within the image
+                    for pkg_provider in rprov_map[dep]:
+                        output.append("%s|%s * %s [RPROVIDES]" % (pkg, pkg_provider, dep))
+                else:
+                    output.append("%s|%s" % (pkg, dep))
     else:
         for pkg in sorted(pkg_dict):
             output.append(pkg)
@@ -455,8 +485,8 @@ from threading import Thread
 
 class ThreadedWorker(Thread):
     """Thread executing tasks from a given tasks queue"""
-    def __init__(self, tasks, worker_init, worker_end):
-        Thread.__init__(self)
+    def __init__(self, tasks, worker_init, worker_end, name=None):
+        Thread.__init__(self, name=name)
         self.tasks = tasks
         self.daemon = True
 
@@ -480,19 +510,19 @@ class ThreadedWorker(Thread):
             try:
                 func(self, *args, **kargs)
             except Exception as e:
-                print(e)
+                # Eat all exceptions
+                bb.mainlogger.debug("Worker task raised %s" % e, exc_info=e)
             finally:
                 self.tasks.task_done()
 
 class ThreadedPool:
     """Pool of threads consuming tasks from a queue"""
-    def __init__(self, num_workers, num_tasks, worker_init=None,
-            worker_end=None):
+    def __init__(self, num_workers, num_tasks, worker_init=None, worker_end=None, name="ThreadedPool-"):
         self.tasks = Queue(num_tasks)
         self.workers = []
 
-        for _ in range(num_workers):
-            worker = ThreadedWorker(self.tasks, worker_init, worker_end)
+        for i in range(num_workers):
+            worker = ThreadedWorker(self.tasks, worker_init, worker_end, name=name + str(i))
             self.workers.append(worker)
 
     def start(self):
@@ -509,17 +539,6 @@ class ThreadedPool:
         for worker in self.workers:
             worker.join()
 
-def write_ld_so_conf(d):
-    # Some utils like prelink may not have the correct target library paths
-    # so write an ld.so.conf to help them
-    ldsoconf = d.expand("${STAGING_DIR_TARGET}${sysconfdir}/ld.so.conf")
-    if os.path.exists(ldsoconf):
-        bb.utils.remove(ldsoconf)
-    bb.utils.mkdirhier(os.path.dirname(ldsoconf))
-    with open(ldsoconf, "w") as f:
-        f.write(d.getVar("base_libdir") + '\n')
-        f.write(d.getVar("libdir") + '\n')
-
 class ImageQAFailed(Exception):
     def __init__(self, description, name=None, logfile=None):
         self.description = description
-- 
2.34.1


Thread overview: 29+ messages
2023-01-25 19:23 [PATCH v8 00/20] Migrate to Bitbake 2.0 Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 01/20] meta: change deprecated parse calls Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 02/20] scripts/contrib: add override conversion script Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 03/20] scripts/contrib: configure override conversion script Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 04/20] meta-isar: set default branch names Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 05/20] meta: remove non recommended syntax Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 06/20] bitbake: update to Bitbake 2.0.5 Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 07/20] meta: update bitbake variables Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 08/20] bitbake.conf: align hash vars with openembedded Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 09/20] meta: mark network and sudo tasks Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 10/20] meta: update overrides syntax Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 11/20] sstate: update bbclass Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 12/20] bitbake.conf: declare default XZ and ZSTD options Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 13/20] Revert "devshell: Use different termination test to avoid warnings" Anton Mikanovich
2023-01-25 19:23 ` Anton Mikanovich [this message]
2023-01-25 19:23 ` [PATCH v8 15/20] Revert "Revert "devshell: Use different termination test to avoid warnings"" Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 16/20] CI: adapt tests to syntax change Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 17/20] isar-sstate: adapt sstate maintenance script Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 18/20] doc: require zstd tool Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 19/20] RECIPE-API-CHANGELOG: add tips after bitbake version update Anton Mikanovich
2023-01-25 19:23 ` [PATCH v8 20/20] docs: update override syntax Anton Mikanovich
2023-01-25 23:43 ` [PATCH v8 00/20] Migrate to Bitbake 2.0 Roberto A. Foglietta
2023-01-26  7:29   ` Anton Mikanovich
2023-01-26 13:23     ` Roberto A. Foglietta
2023-01-26 19:59       ` Henning Schild
2023-01-27  4:09         ` Roberto A. Foglietta
2023-01-31 11:26 ` Uladzimir Bely
2023-02-01  6:17 ` Uladzimir Bely
2023-02-02  9:02   ` Florian Bezdeka
