Commit ca13e678ae

* Fix unit test parametrize order on Python 3.5. (cherry picked from commit 53b230ca74)
* Fix ansible-test unit test execution. (#45772)
  * Fix ansible-test units requirements install.
  * Run unit tests as unprivileged user under Docker.
  (cherry picked from commit 379a7f4f5a)
* Run unit tests in parallel. (#45812) (cherry picked from commit abe8e4c9e8)
* Minor fixes for unit test delegation. (cherry picked from commit be199cfe90)
* Add support for opening a shell on a remote Windows host. (#43919)
  * added arg completion and fix sanity check
  * remove unneeded arg
  (cherry picked from commit 6ca4ea0c1f)
* Block network access for unit tests in docker. (cherry picked from commit 99cac99cbc)
* Make ansible-test available in the bin directory. (#45876) (cherry picked from commit f3d1f9544b)
* Support comments in ansible-test flat files. (cherry picked from commit 5a3000af19)
* Fix incorrect use of subprocess.CalledProcessError. (#45890) (cherry picked from commit 24dd87bd0a)
* Improve ansible-test match error handling. (cherry picked from commit 2056c981ae)
* Improve error handling for docs-build test. (cherry picked from commit 2148999048)
* Bug fixes and cleanup for ansible-test. (#45991)
  * Remove unused imports.
  * Clean up ConfigParser usage in ansible-test.
  * Fix bare except statements in ansible-test.
  * Miscellaneous cleanup from PyCharm inspections.
  * Enable pylint no-self-use for ansible-test.
  * Remove obsolete pylint ignores for Python 3.7.
  * Fix shellcheck issues under newer shellcheck.
  * Use newer path for ansible-test.
  * Fix issues in code-smell tests.
  (cherry picked from commit ac492476e5)
* Fix integration test library search path. This prevents tests from loading modules outside the source tree, which could result in testing the wrong module if a system-wide install is present or custom modules exist. (cherry picked from commit d603cd41fe)
* Update default container to version 1.2.0. (cherry picked from commit d478a4c3f6) (cherry picked from commit 21c4eb8db5)
* Fix ansible-test docker python version handling. This removes the old name-based version detection behavior and uses versions defined in the docker completion file instead, as the new containers do not follow the old naming scheme. (cherry picked from commit 54937ba784)
* Reduce noise in docs-build test failures. (cherry picked from commit 4085d01617)
* Fix ansible-test encoding issues for exceptions. (cherry picked from commit 0d7a156319)
* Fix ansible-test multi-group smoke test handling. (#46363)
  * Fix ansible-test smoke tests across groups.
  * Fix ansible-test list arg defaults.
  * Fix ansible-test require and exclude delegation.
  * Fix detection of Windows-specific changes.
  * Add minimal Windows testing for Python 3.7.
  (cherry picked from commit e53390b3b1)
* Use default-test-container version 1.3.0. (cherry picked from commit 6d9be66418)
* Add file exists check in integration-aliases test. (cherry picked from commit 33a8be9109)
* Improve ansible-test environment checking between tests. (#46459)
  * Add unified diff output to environment validation. This makes it easier to see where the environment changed.
  * Compare Python interpreters by version to pip shebangs. This helps expose cases where pip executables use a different Python interpreter than is expected.
  * Query pip.__version__ instead of using `pip --version`. This is a much faster way to query the pip version and more closely matches how we invoke pip within ansible-test.
  * Remove redundant environment scan between tests. This reuses the environment scan from the end of the previous test as the basis for comparison during the next test.
  (cherry picked from commit 0dc7f38787)
* Add symlinks sanity test. (#46467)
  * Replace legacy test symlinks with actual content.
  * Remove dir symlink from template_jinja2_latest.
  * Update import test to use generated library dir.
  * Fix copy test symlink setup.
  (cherry picked from commit e2b6047514)
* Fix parametrize warning in unit tests. (cherry picked from commit 1a28898a00)
* Update MANIFEST.in. (#46502)
  - Remove unnecessary prune.
  - Include files needed by tests.
  - Exclude botmeta sanity test.
  These changes permit sanity tests to pass on sdist output.
  (cherry picked from commit cbb49f66ec)
* Fix unit tests which modify the source tree. (#45763)
  * Fix CNOS unit test log usage.
  * Use temp dir for Galaxy unit tests.
  * Write to temp files in interfaces_file unit test.
  * Fix log placement in netapp_e_ldap unit test.
  (cherry picked from commit 0686450cae)
* Fix ansible-test custom docker image traceback. (cherry picked from commit 712ad9ed64)
* ansible-test: Create public key when creating Windows targets. (#43760)
  * Changed to always set SSH key for Windows hosts.
  (cherry picked from commit adc0efe10c)
* Fix and re-enable sts_assume_role integration tests. (#46026)
  * Fix the STS assume role error message assertion when the role to assume does not exist.
  (cherry picked from commit 18dc928e28)
* Fix ACI unit test on Python 3.7.0. The previous logic was only needed for pre-release versions of 3.7. (cherry picked from commit c0bf9815c9)
* Remove placeboify from unit tests that are not calling AWS (i.e. creating a recording). (#45754) (cherry picked from commit 2167ce6cb6)
* Update sanity test ignore entries.

"""Test target identification, iteration and inclusion/exclusion."""
|
|
|
|
from __future__ import absolute_import, print_function
|
|
|
|
import collections
|
|
import os
|
|
import re
|
|
import errno
|
|
import itertools
|
|
import abc
|
|
import sys
|
|
|
|
from lib.util import (
|
|
ApplicationError,
|
|
read_lines_without_comments,
|
|
)
|
|
|
|
MODULE_EXTENSIONS = '.py', '.ps1'
|
|
|
|
|
|
def find_target_completion(target_func, prefix):
    """
    :type target_func: () -> collections.Iterable[CompletionTarget]
    :type prefix: unicode
    :rtype: list[str]
    """
    try:
        targets = target_func()

        if sys.version_info[0] == 2:
            prefix = prefix.encode()

        short = os.environ.get('COMP_TYPE') == '63'  # double tab completion from bash
        matches = walk_completion_targets(targets, prefix, short)
        return matches
    except Exception as ex:  # pylint: disable=locally-disabled, broad-except
        return [u'%s' % ex]


def walk_completion_targets(targets, prefix, short=False):
    """
    :type targets: collections.Iterable[CompletionTarget]
    :type prefix: str
    :type short: bool
    :rtype: tuple[str]
    """
    aliases = set(alias for target in targets for alias in target.aliases)

    if prefix.endswith('/') and prefix in aliases:
        aliases.remove(prefix)

    matches = [alias for alias in aliases if alias.startswith(prefix) and '/' not in alias[len(prefix):-1]]

    if short:
        offset = len(os.path.dirname(prefix))
        if offset:
            offset += 1
        relative_matches = [match[offset:] for match in matches if len(match) > offset]
        if len(relative_matches) > 1:
            matches = relative_matches

    return tuple(sorted(matches))


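# Illustrative example for walk_completion_targets() above (hypothetical aliases, not taken from
# the repository): given targets whose aliases include 'windows/' and 'windows/win_ping',
#
#     walk_completion_targets(targets, 'windows/')
#
# returns only the entries one path level below the prefix, e.g. ('windows/win_ping',), which is
# what find_target_completion() uses to drive bash tab completion of target names.

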
def walk_internal_targets(targets, includes=None, excludes=None, requires=None):
    """
    :type targets: collections.Iterable[T <= CompletionTarget]
    :type includes: list[str]
    :type excludes: list[str]
    :type requires: list[str]
    :rtype: tuple[T <= CompletionTarget]
    """
    targets = tuple(targets)

    include_targets = sorted(filter_targets(targets, includes, errors=True, directories=False), key=lambda t: t.name)

    if requires:
        require_targets = set(filter_targets(targets, requires, errors=True, directories=False))
        include_targets = [target for target in include_targets if target in require_targets]

    if excludes:
        list(filter_targets(targets, excludes, errors=True, include=False, directories=False))

    internal_targets = set(filter_targets(include_targets, excludes, errors=False, include=False, directories=False))
    return tuple(sorted(internal_targets, key=lambda t: t.name))


def walk_external_targets(targets, includes=None, excludes=None, requires=None):
    """
    :type targets: collections.Iterable[CompletionTarget]
    :type includes: list[str]
    :type excludes: list[str]
    :type requires: list[str]
    :rtype: tuple[CompletionTarget], tuple[CompletionTarget]
    """
    targets = tuple(targets)

    if requires:
        include_targets = list(filter_targets(targets, includes, errors=True, directories=False))
        require_targets = set(filter_targets(targets, requires, errors=True, directories=False))
        includes = [target.name for target in include_targets if target in require_targets]

        if includes:
            include_targets = sorted(filter_targets(targets, includes, errors=True), key=lambda t: t.name)
        else:
            include_targets = []
    else:
        include_targets = sorted(filter_targets(targets, includes, errors=True), key=lambda t: t.name)

    if excludes:
        exclude_targets = sorted(filter_targets(targets, excludes, errors=True), key=lambda t: t.name)
    else:
        exclude_targets = []

    previous = None
    include = []
    for target in include_targets:
        if isinstance(previous, DirectoryTarget) and isinstance(target, DirectoryTarget) \
                and previous.name == target.name:
            previous.modules = tuple(set(previous.modules) | set(target.modules))
        else:
            include.append(target)
            previous = target

    previous = None
    exclude = []
    for target in exclude_targets:
        if isinstance(previous, DirectoryTarget) and isinstance(target, DirectoryTarget) \
                and previous.name == target.name:
            previous.modules = tuple(set(previous.modules) | set(target.modules))
        else:
            exclude.append(target)
            previous = target

    return tuple(include), tuple(exclude)


def filter_targets(targets, patterns, include=True, directories=True, errors=True):
    """
    :type targets: collections.Iterable[CompletionTarget]
    :type patterns: list[str]
    :type include: bool
    :type directories: bool
    :type errors: bool
    :rtype: collections.Iterable[CompletionTarget]
    """
    unmatched = set(patterns or ())
    compiled_patterns = dict((p, re.compile('^%s$' % p)) for p in patterns) if patterns else None

    for target in targets:
        matched_directories = set()
        match = False

        if patterns:
            for alias in target.aliases:
                for pattern in patterns:
                    if compiled_patterns[pattern].match(alias):
                        match = True

                        try:
                            unmatched.remove(pattern)
                        except KeyError:
                            pass

                        if alias.endswith('/'):
                            if target.base_path and len(target.base_path) > len(alias):
                                matched_directories.add(target.base_path)
                            else:
                                matched_directories.add(alias)
        elif include:
            match = True
            if not target.base_path:
                matched_directories.add('.')
            for alias in target.aliases:
                if alias.endswith('/'):
                    if target.base_path and len(target.base_path) > len(alias):
                        matched_directories.add(target.base_path)
                    else:
                        matched_directories.add(alias)

        if match != include:
            continue

        if directories and matched_directories:
            yield DirectoryTarget(sorted(matched_directories, key=len)[0], target.modules)
        else:
            yield target

    if errors:
        if unmatched:
            raise TargetPatternsNotMatched(unmatched)


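# Illustrative example for filter_targets() above (hypothetical targets and patterns): each
# pattern is anchored ('^...$') and matched against every alias of every target, so
#
#     list(filter_targets(targets, ['posix/ci/.*'], directories=False))
#
# yields every target with an alias such as 'posix/ci/group1/', while a pattern that matches
# nothing raises TargetPatternsNotMatched because errors defaults to True.

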
def walk_module_targets():
    """
    :rtype: collections.Iterable[TestTarget]
    """
    path = 'lib/ansible/modules'

    for target in walk_test_targets(path, path + '/', extensions=MODULE_EXTENSIONS):
        if not target.module:
            continue

        yield target


def walk_units_targets():
    """
    :rtype: collections.Iterable[TestTarget]
    """
    return walk_test_targets(path='test/units', module_path='test/units/modules/', extensions=('.py',), prefix='test_')


def walk_compile_targets():
    """
    :rtype: collections.Iterable[TestTarget]
    """
    return walk_test_targets(module_path='lib/ansible/modules/', extensions=('.py',), extra_dirs=('bin',))


def walk_sanity_targets():
    """
    :rtype: collections.Iterable[TestTarget]
    """
    return walk_test_targets(module_path='lib/ansible/modules/')


def walk_posix_integration_targets(include_hidden=False):
    """
    :type include_hidden: bool
    :rtype: collections.Iterable[IntegrationTarget]
    """
    for target in walk_integration_targets():
        if 'posix/' in target.aliases or (include_hidden and 'hidden/posix/' in target.aliases):
            yield target


def walk_network_integration_targets(include_hidden=False):
    """
    :type include_hidden: bool
    :rtype: collections.Iterable[IntegrationTarget]
    """
    for target in walk_integration_targets():
        if 'network/' in target.aliases or (include_hidden and 'hidden/network/' in target.aliases):
            yield target


def walk_windows_integration_targets(include_hidden=False):
    """
    :type include_hidden: bool
    :rtype: collections.Iterable[IntegrationTarget]
    """
    for target in walk_integration_targets():
        if 'windows/' in target.aliases or (include_hidden and 'hidden/windows/' in target.aliases):
            yield target


def walk_integration_targets():
    """
    :rtype: collections.Iterable[IntegrationTarget]
    """
    path = 'test/integration/targets'
    modules = frozenset(t.module for t in walk_module_targets())
    paths = sorted(os.path.join(path, p) for p in os.listdir(path))
    prefixes = load_integration_prefixes()

    for path in paths:
        if os.path.isdir(path):
            yield IntegrationTarget(path, modules, prefixes)


def load_integration_prefixes():
    """
    :rtype: dict[str, str]
    """
    path = 'test/integration'
    names = sorted(f for f in os.listdir(path) if os.path.splitext(f)[0] == 'target-prefixes')
    prefixes = {}

    for name in names:
        prefix = os.path.splitext(name)[1][1:]
        with open(os.path.join(path, name), 'r') as prefix_fd:
            prefixes.update(dict((k, prefix) for k in prefix_fd.read().splitlines()))

    return prefixes


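# Illustrative example for load_integration_prefixes() above (file contents assumed): a file
# named test/integration/target-prefixes.network containing the lines 'ios' and 'vyos' produces
# {'ios': 'network', 'vyos': 'network'}, which IntegrationTarget below uses to place a target
# named 'ios_config' into the 'network/ios' group.

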
def walk_test_targets(path=None, module_path=None, extensions=None, prefix=None, extra_dirs=None):
    """
    :type path: str | None
    :type module_path: str | None
    :type extensions: tuple[str] | None
    :type prefix: str | None
    :type extra_dirs: tuple[str] | None
    :rtype: collections.Iterable[TestTarget]
    """
    for root, _, file_names in os.walk(path or '.', topdown=False):
        if root.endswith('/__pycache__'):
            continue

        if '/.tox/' in root:
            continue

        if path is None:
            root = root[2:]

        if root.startswith('.') and root != '.github':
            continue

        for file_name in file_names:
            name, ext = os.path.splitext(os.path.basename(file_name))

            if name.startswith('.'):
                continue

            if extensions and ext not in extensions:
                continue

            if prefix and not name.startswith(prefix):
                continue

            file_path = os.path.join(root, file_name)

            if os.path.islink(file_path):
                continue

            yield TestTarget(file_path, module_path, prefix, path)

    if extra_dirs:
        for extra_dir in extra_dirs:
            file_names = os.listdir(extra_dir)

            for file_name in file_names:
                file_path = os.path.join(extra_dir, file_name)

                if os.path.isfile(file_path) and not os.path.islink(file_path):
                    yield TestTarget(file_path, module_path, prefix, path)


def analyze_integration_target_dependencies(integration_targets):
    """
    :type integration_targets: list[IntegrationTarget]
    :rtype: dict[str, set[str]]
    """
    hidden_role_target_names = set(t.name for t in integration_targets if t.type == 'role' and 'hidden/' in t.aliases)
    normal_role_targets = [t for t in integration_targets if t.type == 'role' and 'hidden/' not in t.aliases]
    dependencies = collections.defaultdict(set)

    # handle setup dependencies
    for target in integration_targets:
        for setup_target_name in target.setup_always + target.setup_once:
            dependencies[setup_target_name].add(target.name)

    # intentionally primitive analysis of role meta to avoid a dependency on pyyaml
    for role_target in normal_role_targets:
        meta_dir = os.path.join(role_target.path, 'meta')

        if not os.path.isdir(meta_dir):
            continue

        meta_paths = sorted([os.path.join(meta_dir, name) for name in os.listdir(meta_dir)])

        for meta_path in meta_paths:
            if os.path.exists(meta_path):
                with open(meta_path, 'r') as meta_fd:
                    meta_lines = meta_fd.read().splitlines()

                for meta_line in meta_lines:
                    if re.search(r'^ *#.*$', meta_line):
                        continue

                    if not meta_line.strip():
                        continue

                    for hidden_target_name in hidden_role_target_names:
                        if hidden_target_name in meta_line:
                            dependencies[hidden_target_name].add(role_target.name)

    return dependencies


class CompletionTarget(object):
    """Command-line argument completion target base class."""
    __metaclass__ = abc.ABCMeta

    def __init__(self):
        self.name = None
        self.path = None
        self.base_path = None
        self.modules = tuple()
        self.aliases = tuple()

    def __eq__(self, other):
        if isinstance(other, CompletionTarget):
            return self.__repr__() == other.__repr__()

        return False

    def __ne__(self, other):
        return not self.__eq__(other)

    def __lt__(self, other):
        return self.name.__lt__(other.name)

    def __gt__(self, other):
        return self.name.__gt__(other.name)

    def __hash__(self):
        return hash(self.__repr__())

    def __repr__(self):
        if self.modules:
            return '%s (%s)' % (self.name, ', '.join(self.modules))

        return self.name


class DirectoryTarget(CompletionTarget):
    """Directory target."""
    def __init__(self, path, modules):
        """
        :type path: str
        :type modules: tuple[str]
        """
        super(DirectoryTarget, self).__init__()

        self.name = path
        self.path = path
        self.modules = modules


class TestTarget(CompletionTarget):
    """Generic test target."""
    def __init__(self, path, module_path, module_prefix, base_path):
        """
        :type path: str
        :type module_path: str | None
        :type module_prefix: str | None
        :type base_path: str
        """
        super(TestTarget, self).__init__()

        self.name = path
        self.path = path
        self.base_path = base_path + '/' if base_path else None

        name, ext = os.path.splitext(os.path.basename(self.path))

        if module_path and path.startswith(module_path) and name != '__init__' and ext in MODULE_EXTENSIONS:
            self.module = name[len(module_prefix or ''):].lstrip('_')
            self.modules = self.module,
        else:
            self.module = None
            self.modules = tuple()

        aliases = [self.path, self.module]
        parts = self.path.split('/')

        for i in range(1, len(parts)):
            alias = '%s/' % '/'.join(parts[:i])
            aliases.append(alias)

        aliases = [a for a in aliases if a]

        self.aliases = tuple(sorted(aliases))


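# Illustrative example for TestTarget above (hypothetical path): a unit test file at
# 'test/units/modules/system/test_ping.py' walked with module_path='test/units/modules/' and
# prefix='test_' gets module name 'ping' and aliases including the full path plus the directory
# prefixes 'test/', 'test/units/', 'test/units/modules/' and 'test/units/modules/system/',
# which is what makes directory prefixes usable as command-line target patterns.

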
class IntegrationTarget(CompletionTarget):
    """Integration test target."""
    non_posix = frozenset((
        'network',
        'windows',
    ))

    categories = frozenset(non_posix | frozenset((
        'posix',
        'module',
        'needs',
        'skip',
    )))

    def __init__(self, path, modules, prefixes):
        """
        :type path: str
        :type modules: frozenset[str]
        :type prefixes: dict[str, str]
        """
        super(IntegrationTarget, self).__init__()

        self.name = os.path.basename(path)
        self.path = path

        # script_path and type

        contents = sorted(os.listdir(path))

        runme_files = tuple(c for c in contents if os.path.splitext(c)[0] == 'runme')
        test_files = tuple(c for c in contents if os.path.splitext(c)[0] == 'test')

        self.script_path = None

        if runme_files:
            self.type = 'script'
            self.script_path = os.path.join(path, runme_files[0])
        elif test_files:
            self.type = 'special'
        elif os.path.isdir(os.path.join(path, 'tasks')) or os.path.isdir(os.path.join(path, 'defaults')):
            self.type = 'role'
        else:
            self.type = 'unknown'

        # static_aliases

        try:
            aliases_path = os.path.join(path, 'aliases')
            static_aliases = tuple(read_lines_without_comments(aliases_path, remove_blank_lines=True))
        except IOError as ex:
            if ex.errno != errno.ENOENT:
                raise
            static_aliases = tuple()

        # modules

        if self.name in modules:
            module_name = self.name
        elif self.name.startswith('win_') and self.name[4:] in modules:
            module_name = self.name[4:]
        else:
            module_name = None

        self.modules = tuple(sorted(a for a in static_aliases + tuple([module_name]) if a in modules))

        # groups

        groups = [self.type]
        groups += [a for a in static_aliases if a not in modules]
        groups += ['module/%s' % m for m in self.modules]

        if not self.modules:
            groups.append('non_module')

        if 'destructive' not in groups:
            groups.append('non_destructive')

        if '_' in self.name:
            prefix = self.name[:self.name.find('_')]
        else:
            prefix = None

        if prefix in prefixes:
            group = prefixes[prefix]

            if group != prefix:
                group = '%s/%s' % (group, prefix)

            groups.append(group)

        if self.name.startswith('win_'):
            groups.append('windows')

        if self.name.startswith('connection_'):
            groups.append('connection')

        if self.name.startswith('setup_') or self.name.startswith('prepare_'):
            groups.append('hidden')

        if self.type not in ('script', 'role'):
            groups.append('hidden')

        for group in itertools.islice(groups, 0, len(groups)):
            if '/' in group:
                parts = group.split('/')
                for i in range(1, len(parts)):
                    groups.append('/'.join(parts[:i]))

        if not any(g in self.non_posix for g in groups):
            groups.append('posix')

        # aliases

        aliases = [self.name] + \
                  ['%s/' % g for g in groups] + \
                  ['%s/%s' % (g, self.name) for g in groups if g not in self.categories]

        if 'hidden/' in aliases:
            aliases = ['hidden/'] + ['hidden/%s' % a for a in aliases if not a.startswith('hidden/')]

        self.aliases = tuple(sorted(set(aliases)))

        # configuration

        self.setup_once = tuple(sorted(set(g.split('/')[2] for g in groups if g.startswith('setup/once/'))))
        self.setup_always = tuple(sorted(set(g.split('/')[2] for g in groups if g.startswith('setup/always/'))))


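# Illustrative example for IntegrationTarget above (aliases file contents assumed): a role
# target at test/integration/targets/win_ping whose 'aliases' file contains
# 'shippable/windows/group1' ends up with type 'role', modules ('win_ping',) when a win_ping
# module exists, and groups including 'windows', 'module/win_ping', 'shippable/windows' and
# 'non_destructive', so it can be selected with patterns like 'windows/' or
# 'shippable/windows/group1/'.

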
class TargetPatternsNotMatched(ApplicationError):
    """One or more targets were not matched when a match was required."""
    def __init__(self, patterns):
        """
        :type patterns: set[str]
        """
        self.patterns = sorted(patterns)

        if len(patterns) > 1:
            message = 'Target patterns not matched:\n%s' % '\n'.join(self.patterns)
        else:
            message = 'Target pattern not matched: %s' % self.patterns[0]

        super(TargetPatternsNotMatched, self).__init__(message)