diff --git a/.pylintrc b/.pylintrc11
similarity index 99%
rename from .pylintrc
rename to .pylintrc11
index e31fe3d2..3a0c2f06 100644
--- a/.pylintrc
+++ b/.pylintrc11
@@ -45,7 +45,7 @@ disable=attribute-defined-outside-init,arguments-differ,
     too-many-locals,too-few-public-methods,too-many-instance-attributes,
     too-many-arguments,too-many-branches,too-many-public-methods, too-many-return-statements,
     multiple-statements,abstract-method,F0401,no-member,non-parent-init-called,
-    maybe-no-member,abstract-class-little-used
+    maybe-no-member,abstract-class-little-used,bad-option-value
 
 # F0401: http://www.logilab.org/ticket/9386
 
diff --git a/.pylintrc19 b/.pylintrc19
new file mode 100644
index 00000000..2ba4d6ce
--- /dev/null
+++ b/.pylintrc19
@@ -0,0 +1,285 @@
+[MASTER]
+
+# Specify a configuration file.
+#rcfile=
+
+# Python code to execute, usually for sys.path manipulation such as
+# pygtk.require().
+#init-hook=
+
+# Profiled execution.
+profile=no
+
+# Add files or directories to the blacklist. They should be base names, not
+# paths.
+ignore=CVS
+
+# Pickle collected data for later comparisons.
+persistent=no
+
+# List of plugins (as comma separated values of python modules names) to load,
+# usually to register additional checkers.
+load-plugins=
+
+
+[MESSAGES CONTROL]
+
+# Enable the message, report, category or checker with the given id(s). You can
+# either give multiple identifier separated by comma (,) or put this option
+# multiple time. See also the "--disable" option for examples.
+#enable=
+
+# Disable the message, report, category or checker with the given id(s). You
+# can either give multiple identifiers separated by comma (,) or put this
+# option multiple times (only on the command line, not in the configuration
+# file where it should appear only once).You can also use "--disable=all" to
+# disable everything first and then reenable specific checks. For example, if
+# you want to run only the similarities checker, you can use "--disable=all
+# --enable=similarities". If you want to run only the classes checker, but have
+# no Warning level messages displayed, use"--disable=all --enable=classes
+# --disable=W"
+disable=attribute-defined-outside-init,arguments-differ,
+    bare-except,global-statement,protected-access,redefined-outer-name,
+    unused-argument,star-args,pointless-string-statement,old-style-class,
+    too-many-lines,missing-docstring,no-init,no-self-use,too-many-statements,
+    too-many-locals,too-few-public-methods,too-many-instance-attributes,
+    too-many-arguments,too-many-branches,too-many-public-methods, too-many-return-statements,
+    multiple-statements,abstract-method,F0401,no-member,non-parent-init-called,
+    maybe-no-member,abstract-class-little-used,bad-option-value,bad-continuation,
+    wrong-import-position,len-as-condition,wrong-import-order,ungrouped-imports,
+    multiple-imports,no-else-return,cell-var-from-loop,inconsistent-return-statements,
+    too-many-nested-blocks,consider-using-enumerate,
+    invalid-unary-operand-type,not-callable,unsubscriptable-object
+
+# F0401: http://www.logilab.org/ticket/9386
+
+[REPORTS]
+
+# Set the output format. Available formats are text, parseable, colorized, msvs
+# (visual studio) and html. You can also give a reporter class, eg
+# mypackage.mymodule.MyReporterClass.
+output-format=text
+
+# Put messages in a separate file for each module / package specified on the
+# command line instead of printing them on stdout. Reports (if any) will be
+# written in a file name "pylint_global.[txt|html]".
+files-output=no
+
+# Tells whether to display a full report or only the messages
+reports=no
+
+# Python expression which should return a note less than 10 (10 is the highest
+# note). You have access to the variables errors warning, statement which
+# respectively contain the number of errors / warnings messages and the total
+# number of statements analyzed. This is used by the global evaluation report
+# (RP0004).
+evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
+
+# Add a comment according to your evaluation note. This is used by the global
+# evaluation report (RP0004).
+comment=no
+
+# Template used to display messages. This is a python new-style format string
+# used to format the massage information. See doc for all details
+#msg-template=
+
+
+[BASIC]
+
+# Required attributes for module, separated by a comma
+required-attributes=
+
+# List of builtins function names that should not be used, separated by a comma
+bad-functions=filter,apply,input
+
+# Regular expression which should only match correct module names
+module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$
+
+# Regular expression which should only match correct module level names
+const-rgx=[a-zA-Z0-9_]{2,30}$
+
+# Regular expression which should only match correct class names
+class-rgx=[A-Z_][a-zA-Z0-9]+$
+
+# Regular expression which should only match correct function names
+function-rgx=[a-z_][a-zA-Z0-9_]{1,40}$
+
+# Regular expression which should only match correct method names
+method-rgx=[a-z_][a-zA-Z0-9_]{2,40}$
+
+# Regular expression which should only match correct instance attribute names
+attr-rgx=[a-z_][a-zA-Z0-9_]{1,30}$
+
+# Regular expression which should only match correct argument names
+argument-rgx=[a-z_][a-zA-Z0-9_]{0,30}$
+
+# Regular expression which should only match correct variable names
+variable-rgx=[a-z_][a-zA-Z0-9_]{0,30}$
+
+# Regular expression which should only match correct attribute names in class
+# bodies
+class-attribute-rgx=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$
+
+# Regular expression which should only match correct list comprehension /
+# generator expression variable names
+inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$
+
+# Good variable names which should always be accepted, separated by a comma
+good-names=i,j,k,ex,Run,_
+
+# Bad variable names which should always be refused, separated by a comma
+bad-names=foo,bar,baz,toto,tutu,tata
+
+# Regular expression which should only match function or class names that do
+# not require a docstring.
+no-docstring-rgx=.*
+
+# Minimum line length for functions/classes that require docstrings, shorter
+# ones are exempt.
+docstring-min-length=-1
+
+
+[FORMAT]
+
+# Maximum number of characters on a single line.
+max-line-length=300
+
+# Regexp for a line that is allowed to be longer than the limit.
+ignore-long-lines=^\s*(# )?<?https?://\S+>?$
+
+# Maximum number of lines in a module
+max-module-lines=1000
+
+# String used as indentation unit. This is usually "    " (4 spaces) or "\t" (1
+# tab).
+indent-string='    '
+
+
+[MISCELLANEOUS]
+
+# List of note tags to take in consideration, separated by a comma.
+notes=FIXME
+
+
+[SIMILARITIES]
+
+# Minimum lines number of a similarity.
+min-similarity-lines=4
+
+# Ignore comments when computing similarities.
+ignore-comments=yes
+
+# Ignore docstrings when computing similarities.
+ignore-docstrings=yes
+
+# Ignore imports when computing similarities.
+ignore-imports=no
+
+
+[TYPECHECK]
+
+# Tells whether missing members accessed in mixin class should be ignored. A
+# mixin class is detected if its name ends with "mixin" (case insensitive).
+ignore-mixin-members=yes
+
+# List of classes names for which member attributes should not be checked
+# (useful for classes with attributes dynamically set).
+ignored-classes=SQLObject
+
+# When zope mode is activated, add a predefined set of Zope acquired attributes
+# to generated-members.
+zope=no
+
+# List of members which are set dynamically and missed by pylint inference
+# system, and so shouldn't trigger E0201 when accessed. Python regular
+# expressions are accepted.
+generated-members=REQUEST,acl_users,aq_parent
+
+
+[VARIABLES]
+
+# Tells whether we should check for unused import in __init__ files.
+init-import=no
+
+# A regular expression matching the beginning of the name of dummy variables
+# (i.e. not used).
+dummy-variables-rgx=_$|dummy
+
+# List of additional names supposed to be defined in builtins. Remember that
+# you should avoid to define new builtins when possible.
+additional-builtins=
+
+
+[CLASSES]
+
+# List of interface methods to ignore, separated by a comma. This is used for
+# instance to not check methods defines in Zope's Interface base class.
+ignore-iface-methods=isImplementedBy,deferred,extends,names,namesAndDescriptions,queryDescriptionFor,getBases,getDescriptionFor,getDoc,getName,getTaggedValue,getTaggedValueTags,isEqualOrExtendedBy,setTaggedValue,isImplementedByInstancesOf,adaptWith,is_implemented_by
+
+# List of method names used to declare (i.e. assign) instance attributes.
+defining-attr-methods=__init__,__new__,setUp
+
+# List of valid names for the first argument in a class method.
+valid-classmethod-first-arg=cls
+
+# List of valid names for the first argument in a metaclass class method.
+valid-metaclass-classmethod-first-arg=mcs
+
+
+[DESIGN]
+
+# Maximum number of arguments for function / method
+max-args=5
+
+# Argument names that match this expression will be ignored. Default to name
+# with leading underscore
+ignored-argument-names=_.*
+
+# Maximum number of locals for function / method body
+max-locals=15
+
+# Maximum number of return / yield for function / method body
+max-returns=6
+
+# Maximum number of branch for function / method body
+max-branches=12
+
+# Maximum number of statements in function / method body
+max-statements=50
+
+# Maximum number of parents for a class (see R0901).
+max-parents=7
+
+# Maximum number of attributes for a class (see R0902).
+max-attributes=7
+
+# Minimum number of public methods for a class (see R0903).
+min-public-methods=2
+
+# Maximum number of public methods for a class (see R0904).
+max-public-methods=20
+
+
+[IMPORTS]
+
+# Deprecated modules which should not be used, separated by a comma
+deprecated-modules=regsub,TERMIOS,Bastion,rexec
+
+# Create a graph of every (i.e. internal and external) dependencies in the
+# given file (report RP0402 must not be disabled)
+import-graph=
+
+# Create a graph of external dependencies in the given file (report RP0402 must
+# not be disabled)
+ext-import-graph=
+
+# Create a graph of internal dependencies in the given file (report RP0402 must
+# not be disabled)
+int-import-graph=
+
+
+[EXCEPTIONS]
+
+# Exceptions that will emit a warning when being caught. Defaults to
+# "Exception"
+overgeneral-exceptions=Exception
diff --git a/.travis.yml b/.travis.yml
index daa38f2b..88fb87eb 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -6,8 +6,7 @@ addons:
     packages:
     - oracle-java8-installer
 install:
-  - pip install astroid==1.1.0
-  - pip install pylint==1.1.0
+  - pip install pylint==1.9.3
   - pylint --version
   - |
     export ECLIPSE_TAR=$TRAVIS_BUILD_DIR/../eclipse.tar.gz
diff --git a/mx.py b/mx.py
index c6156fe0..cc91e8e9 100755
--- a/mx.py
+++ b/mx.py
@@ -420,7 +420,7 @@ def origin(self):
             for t in tokenize.generate_tokens(fp.readline):
                 _, tval, (srow, _), _, _ = t
                 if candidate is None:
-                    if tval == '"' + self.name + '"' or tval == "'" + self.name + "'":
+                    if tval in ('"' + self.name + '"', "'" + self.name + "'"):
                         candidate = srow
                 else:
                     if tval == ':':
@@ -681,7 +681,7 @@ def _resolveDepsHelper(self, deps, fatalIfMissing=True):
         by the strings. The 'deps' list is updated in place.
         """
         if deps:
-            assert all((isinstance(d, str) or isinstance(d, Dependency) for d in deps))
+            assert all((isinstance(d, (str, Dependency)) for d in deps))
             if isinstance(deps[0], str):
                 assert all((isinstance(d, str) for d in deps))
                 resolvedDeps = []
@@ -767,7 +767,7 @@ def classpath_repr(self, resolve=True):
         nyi('classpath_repr', self)
 
     def isJar(self):
-        cp_repr = self.classpath_repr()
+        cp_repr = self.classpath_repr() #pylint: disable=assignment-from-no-return
         if cp_repr:
             return cp_repr.endswith('.jar') or cp_repr.endswith('.JAR') or '.jar_' in cp_repr
         return True
@@ -1660,7 +1660,7 @@ def overlay_check(arcname):
             a.__closing__()
 
         # accumulate services
-        services_versions = sorted([v for v in services.keys() if isinstance(v, int)])
+        services_versions = sorted([v for v in services if isinstance(v, int)])
         if services_versions:
             acummulated_services = {n: set(p) for n, p in services.items() if isinstance(n, basestring)}
             for v in services_versions:
@@ -1791,7 +1791,7 @@ def getBuildTask(self, args):
         return JARArchiveTask(args, self)
 
     def exists(self):
-        return exists(self.path) and not self.sourcesPath or exists(self.sourcesPath)
+        return exists(self.path) and not self.sourcesPath or exists(self.sourcesPath) #pylint: disable=consider-using-ternary
 
     def remoteExtension(self):
         return 'jar'
@@ -1865,7 +1865,7 @@ def __opened__(self, arc, srcArc, services):
             'META-INF/CompilerHints': None,
         }
 
-    def __add__(self, arcname, contents):
+    def __add__(self, arcname, contents): #pylint: disable=unexpected-special-method-signature
         if arcname in self.meta_files.keys():
             if self.meta_files[arcname] is None:
                 self.meta_files[arcname] = contents
@@ -2376,7 +2376,7 @@ def _install_source_files(files, include=None, excludes=None, optional=False):
                 if_stripped = source.get('if_stripped')
                 if if_stripped is not None and d.isJARDistribution():
                     if if_stripped not in ('include', 'exclude'):
-                        abort("Could not understand `if_stripped` value ''. Valid values are 'include' and 'exclude'".format(if_stripped), context=self)
+                        abort("Could not understand `if_stripped` value '{}'. Valid values are 'include' and 'exclude'".format(if_stripped), context=self)
                     if (if_stripped == 'exclude' and d.is_stripped()) or (if_stripped == 'include' and not d.is_stripped()):
                         return
                 if source.get('path') is None:
@@ -2413,8 +2413,8 @@ def _install_source_files(files, include=None, excludes=None, optional=False):
                         " - or '{type}:{dependency}/path/to/file/in/archive' as a source (i.e., extracting /path/to/file/in/archive from '{dependency}' to '{dest}')".format(
                             dest=destination,
                             dependency=d.name,
-                            type=source_type,
-                            context=self))
+                            type=source_type),
+                        context=self)
                     unarchiver_dest_directory = dirname(unarchiver_dest_directory)
                 ensure_dir_exists(unarchiver_dest_directory)
                 ext = get_file_extension(source_archive_file)
@@ -4033,7 +4033,7 @@ def __init__(self, jdk, jvmArgs, mainClass, toolJar, buildArgs=None):
         preexec_fn, creationflags = _get_new_progress_group_args()
         if _opts.verbose:
             log(' '.join(map(pipes.quote, args)))
-        p = subprocess.Popen(args, preexec_fn=preexec_fn, creationflags=creationflags, stdout=subprocess.PIPE)
+        p = subprocess.Popen(args, preexec_fn=preexec_fn, creationflags=creationflags, stdout=subprocess.PIPE) #pylint: disable=subprocess-popen-preexec-fn
 
         # scan stdout to capture the port number
         pout = []
@@ -4556,11 +4556,11 @@ def download_file_with_sha1(name, path, urls, sha1, sha1path, resolve, mustExist
     sha1Check = sha1 and sha1 != 'NOCHECK'
     canSymlink = canSymlink and not (get_os() == 'windows' or get_os() == 'cygwin')
 
-    if len(urls) is 0 and not sha1Check:
+    if len(urls) == 0 and not sha1Check:
         return path
 
     if not _check_file_with_sha1(path, sha1, sha1path, mustExist=resolve and mustExist):
-        if len(urls) is 0:
+        if len(urls) == 0:
             abort('SHA1 of {} ({}) does not match expected value ({})'.format(path, sha1OfFile(path), sha1))
 
         if is_cache_path(path):
@@ -5390,8 +5390,8 @@ def is_release_from_tags(self, vcdir, prefix):
         :return: True if release
         :rtype: bool
         """
-        _release_version = self.release_version_from_tags(vcdir=vcdir, prefix=prefix)
-        return True if _release_version and re.match(r'^[0-9]+[0-9.]+$', _release_version) else False
+        _release_version = self.release_version_from_tags(vcdir=vcdir, prefix=prefix) #pylint: disable=assignment-from-no-return
+        return _release_version and re.match(r'^[0-9]+[0-9.]+$', _release_version)
 
     def release_version_from_tags(self, vcdir, prefix, snapshotSuffix='dev', abortOnError=True):
         """
@@ -6416,7 +6416,7 @@ def _fetch(self, vcdir, repository=None, refspec=None, abortOnError=True, prune=
     def _log_changes(self, vcdir, path=None, incoming=True, abortOnError=True):
         out = OutputCapture()
         cmd = ['git', 'log', '{0}origin/master{1}'.format(
-            '..', '' if incoming else '', '..')]
+            '..', '' if incoming else '..')]
         if path:
             cmd.extend(['--', path])
         rc = self.run(cmd, nonZeroIsFatal=False, cwd=vcdir, out=out)
@@ -8006,7 +8006,7 @@ def verify_imports(self, suites, args):
             args = []
         results = []
         # Ensure that all suites in the same repo import the same version of other suites
-        dirs = set([s.vc_dir for s in suites if s.dir != s.vc_dir])
+        dirs = {s.vc_dir for s in suites if s.dir != s.vc_dir}
         for vc_dir in dirs:
             imports = {}
             for suite_dir in [_is_suite_dir(join(vc_dir, x)) for x in os.listdir(vc_dir) if _is_suite_dir(join(vc_dir, x))]:
@@ -11104,7 +11104,7 @@ def waitOn(p):
         if get_os() == 'windows':
             # on windows use a poll loop, otherwise signal does not get handled
             retcode = None
-            while retcode == None:
+            while retcode is None:
                 retcode = p.poll()
                 time.sleep(0.05)
         else:
@@ -11290,7 +11290,7 @@ def redirect(stream, f):
         stream.close()
     stdout = out if not callable(out) else subprocess.PIPE
     stderr = err if not callable(err) else subprocess.PIPE
-    p = subprocess.Popen(args, cwd=cwd, stdout=stdout, stderr=stderr, preexec_fn=preexec_fn, creationflags=creationflags, env=env, **kwargs)
+    p = subprocess.Popen(args, cwd=cwd, stdout=stdout, stderr=stderr, preexec_fn=preexec_fn, creationflags=creationflags, env=env, **kwargs) #pylint: disable=subprocess-popen-preexec-fn
     sub = _addSubprocess(p, args)
     joiners = []
     if callable(out):
@@ -11981,7 +11981,7 @@ def get_env(key, default=None):
     return value
 
 def logv(msg=None):
-    if vars(_opts).get('verbose') == None:
+    if vars(_opts).get('verbose') is None:
         def _deferrable():
             logv(msg)
         _opts_parsed_deferrables.append(_deferrable)
@@ -11991,7 +11991,7 @@ def _deferrable():
     log(msg)
 
 def logvv(msg=None):
-    if vars(_opts).get('very_verbose') == None:
+    if vars(_opts).get('very_verbose') is None:
         def _deferrable():
             logvv(msg)
         _opts_parsed_deferrables.append(_deferrable)
@@ -12978,6 +12978,16 @@ def _processorjars_suite(s):
     build(['--dependencies', ",".join(names)])
     return [ap.path for ap in apDists]
 
+pylint_ver_map = {
+    (1, 1): {
+        'rcfile': '.pylintrc11',
+        'additional_options': []
+    },
+    (1, 9): {
+        'rcfile': '.pylintrc19',
+        'additional_options': ['--score=n']
+    }
+}
 
 @no_suite_loading
 def pylint(args):
@@ -12988,11 +12998,7 @@ def pylint(args):
     parser.add_argument('--walk', action='store_true', help='use tree walk find .py files')
     parser.add_argument('--all', action='store_true', help='check all files, not just files in the mx.* directory.')
     args = parser.parse_args(args)
-
-    rcfile = join(dirname(__file__), '.pylintrc')
-    if not exists(rcfile):
-        log_error('pylint configuration file does not exist: ' + rcfile)
-        return -1
+    ver = (-1, -1)
 
     try:
         output = subprocess.check_output(['pylint', '--version'], stderr=subprocess.STDOUT)
@@ -13001,13 +13007,22 @@ def pylint(args):
             log_error('could not determine pylint version from ' + output)
             return -1
         major, minor, micro = (int(m.group(1)), int(m.group(2)), int(m.group(3)))
-        if major != 1 or minor != 1:
-            log_error('require pylint version = 1.1.x (got {0}.{1}.{2})'.format(major, minor, micro))
+        log("Detected pylint version: {0}.{1}.{2}".format(major, minor, micro))
+        ver = (major, minor)
+        if ver not in pylint_ver_map:
+            log_error('pylint version must be one of {3} (got {0}.{1}.{2})'.format(major, minor, micro, pylint_ver_map.keys()))
             return -1
     except BaseException as e:
         log_error('pylint is not available: ' + str(e))
         return -1
 
+    rcfile = join(dirname(__file__), pylint_ver_map[ver]['rcfile'])
+    if not exists(rcfile):
+        log_error('pylint configuration file does not exist: ' + rcfile)
+        return -1
+
+    additional_options = pylint_ver_map[ver]['additional_options']
+
     def findfiles_by_walk(pyfiles):
         for suite in suites(True, includeBinary=False):
             if args.primary and not suite.primary:
@@ -13061,7 +13076,7 @@ def findfiles_by_vc(pyfiles):
 
     for pyfile in pyfiles:
         log('Running pylint on ' + pyfile + '...')
-        run(['pylint', '--reports=n', '--rcfile=' + rcfile, pyfile], env=env)
+        run(['pylint', '--reports=n', '--rcfile=' + rcfile, pyfile] + additional_options, env=env)
 
     return 0
 
@@ -13079,7 +13094,7 @@ def __exit__(self, exc_type, exc_value, traceback):
 
 
 class TempDirCwd(TempDir):
-    def __init__(self, parent_dir=None):
+    def __init__(self, parent_dir=None): #pylint: disable=useless-super-delegation
         super(TempDirCwd, self).__init__(parent_dir)
 
     def __enter__(self):
@@ -13369,7 +13384,7 @@ def canonicalizeprojects(args):
             if not pkg.startswith(p.name):
                 p.abort('package in {0} does not have prefix matching project name: {1}'.format(p, pkg))
 
-        ignoredDeps = set([d for d in p.deps if d.isJavaProject()])
+        ignoredDeps = {d for d in p.deps if d.isJavaProject()}
         for pkg in p.imported_java_packages():
             for dep in p.deps:
                 if not dep.isJavaProject():
@@ -13466,7 +13481,7 @@ def newest(paths):
     def isOlderThan(self, arg):
         if not self.timestamp:
             return True
-        if isinstance(arg, types.IntType) or isinstance(arg, types.LongType) or isinstance(arg, types.FloatType):
+        if isinstance(arg, (types.IntType, types.LongType, types.FloatType)):
            return self.timestamp < arg
         if isinstance(arg, TimeStampFile):
             if arg.timestamp is None:
@@ -13485,7 +13500,7 @@ def isOlderThan(self, arg):
     def isNewerThan(self, arg):
         if not self.timestamp:
             return False
-        if isinstance(arg, types.IntType) or isinstance(arg, types.LongType) or isinstance(arg, types.FloatType):
+        if isinstance(arg, (types.IntType, types.LongType, types.FloatType)):
            return self.timestamp > arg
         if isinstance(arg, TimeStampFile):
             if arg.timestamp is None:
@@ -13701,7 +13716,7 @@ def on_error(func, _path, exc_info):
                 os.unlink(_path)
         else:
             def on_error(*args):
-                raise
+                raise #pylint: disable=misplaced-bare-raise
     if isdir(path):
         shutil.rmtree(path, onerror=on_error)
     else:
@@ -14619,7 +14634,7 @@ def _eclipseinit_suite(s, buildProcessorJars=True, refreshOnly=False, logToConso
             dist.dir = projectDir
             builders = _genEclipseBuilder(out, dist, 'Create' + dist.name + 'Dist', '-v archive @' + dist.name,
                                           relevantResources=relevantResources,
-                                          logToFile=True, refresh=True, async=False,
+                                          logToFile=True, refresh=True, isAsync=False,
                                           logToConsole=logToConsole, appendToLogFile=False,
                                           refreshFile='/{0}/{1}'.format(dist.name, basename(dist.path)))
             files = files + builders
@@ -14704,7 +14719,7 @@ def _zip_files(files, baseDir, zipPath):
 IRESOURCE_FILE = 1
 IRESOURCE_FOLDER = 2
 
-def _genEclipseBuilder(dotProjectDoc, p, name, mxCommand, refresh=True, refreshFile=None, relevantResources=None, async=False, logToConsole=False, logToFile=False, appendToLogFile=True, xmlIndent='\t', xmlStandalone=None):
+def _genEclipseBuilder(dotProjectDoc, p, name, mxCommand, refresh=True, refreshFile=None, relevantResources=None, isAsync=False, logToConsole=False, logToFile=False, appendToLogFile=True, xmlIndent='\t', xmlStandalone=None):
     externalToolDir = join(p.dir, '.externalToolBuilders')
     launchOut = XMLDoc()
     consoleOn = 'true' if logToConsole else 'false'
@@ -14733,7 +14748,7 @@ def _genEclipseBuilder(dotProjectDoc, p, name, mxCommand, refresh=True, refreshF
         launchOut.element('stringAttribute', {'key' : 'org.eclipse.ui.externaltools.ATTR_BUILD_SCOPE', 'value': resources})
 
     launchOut.element('booleanAttribute', {'key' : 'org.eclipse.debug.ui.ATTR_CONSOLE_OUTPUT_ON', 'value': consoleOn})
-    launchOut.element('booleanAttribute', {'key' : 'org.eclipse.debug.ui.ATTR_LAUNCH_IN_BACKGROUND', 'value': 'true' if async else 'false'})
+    launchOut.element('booleanAttribute', {'key' : 'org.eclipse.debug.ui.ATTR_LAUNCH_IN_BACKGROUND', 'value': 'true' if isAsync else 'false'})
     if logToFile:
         logFile = join(externalToolDir, name + '.log')
         launchOut.element('stringAttribute', {'key' : 'org.eclipse.debug.ui.ATTR_CAPTURE_IN_FILE', 'value': logFile})
@@ -15181,7 +15196,7 @@ def processDep(dep, edge):
                     out.element('target', data='jar')
                     out.element('clean-target', data='clean')
                     out.element('id', data='jar')
-                    out.close('reference')
+                    out.close('reference') #pylint: disable=too-many-function-args
 
         p.walk_deps(visit=processDep, ignoredEdges=[DEP_EXCLUDED])
         if firstDep:
@@ -15770,7 +15785,7 @@ def processDep(dep, edge):
                 if jdk.javaCompliance < dep.jdkStandardizedSince:
                     moduleXml.element('orderEntry', attributes={'type': 'library', 'name': dep.name, 'level': 'project'})
                 else:
-                    logv("{} skipping {} for {}".format(p, dep, jdk))
+                    logv("{} skipping {} for {}".format(p, dep, jdk)) #pylint: disable=undefined-loop-variable
             elif dep.isJreLibrary():
                 pass
             elif dep.isTARDistribution() or dep.isNativeProject() or dep.isArchivableProject() or dep.isResourceLibrary():
@@ -16151,7 +16166,7 @@ def artifactFileName(dist):
         update_file(metaAntFile, metaAntXml.xml(indent='  ', newl='\n'))
 
         # 3) Make an artifact for every distribution
-        validArtifactNames = set([artifactFileName(dist) for dist in validDistributions])
+        validArtifactNames = {artifactFileName(dist) for dist in validDistributions}
         artifactsDir = join(ideaProjectDirectory, 'artifacts')
         ensure_dir_exists(artifactsDir)
         for fileName in os.listdir(artifactsDir):
@@ -16197,7 +16212,7 @@ def intellij_scm_name(vc_kind):
 
         suites_for_vcs = suites() + ([_mx_suite] if mx_python_modules else [])
         sourceSuitesWithVCS = [vc_suite for vc_suite in suites_for_vcs if vc_suite.isSourceSuite() and vc_suite.vc is not None]
-        uniqueSuitesVCS = set([(vc_suite.vc_dir, vc_suite.vc.kind) for vc_suite in sourceSuitesWithVCS])
+        uniqueSuitesVCS = {(vc_suite.vc_dir, vc_suite.vc.kind) for vc_suite in sourceSuitesWithVCS}
         for vcs_dir, kind in uniqueSuitesVCS:
             vcsXml.element('mapping', attributes={'directory': vcs_dir, 'vcs': intellij_scm_name(kind)})
 
@@ -16377,7 +16392,7 @@ def verifysourceinproject(args):
     suiteWhitelists = {}
 
     def ignorePath(path, whitelist):
-        if whitelist == None:
+        if whitelist is None:
             return True
         for entry in whitelist:
             if fnmatch.fnmatch(path, entry):
@@ -16389,7 +16404,7 @@ def ignorePath(path, whitelist):
         distIdeDirs = [d.get_ide_project_dir() for d in suite.dists if d.isJARDistribution() and d.get_ide_project_dir() is not None]
         suiteDirs.add(suite.dir)
         # all suites in the same repository must have the same setting for requiresSourceInProjects
-        if suiteVcDirs.get(suite.vc_dir) == None:
+        if suiteVcDirs.get(suite.vc_dir) is None:
             suiteVcDirs[suite.vc_dir] = suite.vc
             whitelistFile = join(suite.vc_dir, '.nonprojectsources')
             if exists(whitelistFile):
@@ -16463,7 +16478,7 @@ def ignorePath(path, whitelist):
         for vc_dir, sources in unmanagedSources.iteritems():
             for source in sources:
                 log(source)
-            if suiteWhitelists.get(vc_dir) != None:
+            if suiteWhitelists.get(vc_dir) is not None:
                 retcode += 1
                 log('Since {} has a .nonprojectsources file, all Java source files must be \n'\
                     'part of a project in a suite or the files must be listed in the .nonprojectsources.'.format(vc_dir))
@@ -16878,14 +16893,14 @@ def site(args):
         if args.dot_output_base is not None:
             dotErr = None
             try:
-                if not 'version' in subprocess.check_output(['dot', '-V'], stderr=subprocess.STDOUT):
+                if 'version' not in subprocess.check_output(['dot', '-V'], stderr=subprocess.STDOUT):
                     dotErr = 'dot -V does not print a string containing "version"'
             except subprocess.CalledProcessError as e:
                 dotErr = 'error calling "dot -V": {0}'.format(e)
             except OSError as e:
                 dotErr = 'error calling "dot -V": {0}'.format(e)
 
-            if dotErr != None:
+            if dotErr is not None:
                 abort('cannot generate dependency graph: ' + dotErr)
 
             dot = join(tmpbase, 'all', str(args.dot_output_base) + '.dot')
@@ -17268,7 +17283,7 @@ def _sversions(s, suite_import):
     if s.dir in visited:
         return
     visited.add(s.dir)
-    if s.vc == None:
+    if s.vc is None:
         print('No version control info for suite ' + s.name)
     else:
         print(_sversions_rev(s.vc.parent(s.vc_dir), s.vc.isDirty(s.vc_dir), with_color) + ' ' + s.name + ' ' + s.vc_dir)
@@ -17794,7 +17809,7 @@ def maven_install(args):
         for dist in s.dists:
             # ignore non-exported dists
             if not dist.internal and not dist.name.startswith('COM_ORACLE') and hasattr(dist, 'maven') and dist.maven:
-                if len(only) is 0 or dist.name in only:
+                if len(only) == 0 or dist.name in only:
                     arcdists.append(dist)
 
         mxMetaName = _mx_binary_distribution_root(s.name)
@@ -17920,7 +17935,7 @@ def show_version(args):
     args = parser.parse_args(args)
     if args.oneline:
         vc = VC.get_vc(_mx_home, abortOnError=False)
-        if vc == None:
+        if vc is None:
             print('No version control info for mx %s' % version)
         else:
             print(_sversions_rev(vc.parent(_mx_home), vc.isDirty(_mx_home), False) + ' mx %s' % version)
@@ -18536,7 +18551,7 @@ def _should_ignore_conflict_edge(_imported_suite, _importer_name):
                 _log_discovery("Ignoring {} -> {} because of version_from({}) = {} (fast-path)".format(_importer_name, _imported_suite.name, suite_with_from, from_suite))
                 return True
             if from_suite not in ancestor_names:
-                _log_discovery("Temporarily ignoring {} -> {} because of version_from({}) = {} ({} is not yet discovered)".format(_importer_name, _imported_suite.name, suite_with_from, from_suite, from_suite))
+                _log_discovery("Temporarily ignoring {} -> {} because of version_from({}) = {f_suite} ({f_suite} is not yet discovered)".format(_importer_name, _imported_suite.name, suite_with_from, f_suite=from_suite))
                 return True
             vc_from_suite_and_ancestors = {from_suite}
             vc_from_suite_and_ancestors |= vc_suites & ancestor_names[from_suite]
@@ -18828,7 +18843,7 @@ def _setup_binary_suites():
         if not is_optional_suite_context:
             abort('no primary suite found for %s' % initial_command)
 
-    for envVar in _loadedEnv.keys():
+    for envVar in _loadedEnv:
         value = _loadedEnv[envVar]
         if os.environ.get(envVar) != value:
             logv('Setting environment variable %s=%s' % (envVar, value))
diff --git a/mx_benchmark.py b/mx_benchmark.py
index cee4d6ed..a1c16ee4 100644
--- a/mx_benchmark.py
+++ b/mx_benchmark.py
@@ -498,7 +498,7 @@ def var(name):
                 inst = vtype(v)
             else:
                 raise RuntimeError("Cannot handle object '{0}' of expected type {1}".format(v, vtype))
-            if type(inst) not in [str, int, long, float, bool]:
+            if not isinstance(inst, (str, int, long, float, bool)):
                 raise RuntimeError("Object '{0}' has unknown type: {1}".format(inst, type(inst)))
             datapoint[key] = inst
         datapoints.append(datapoint)
@@ -766,7 +766,7 @@ def repairDatapointsAndFail(self, benchmarks, bmSuiteArgs, partialResults, messa
         raise BenchmarkFailureError(message, partialResults)
 
     def validateStdoutWithDimensions(
-        self, out, benchmarks, bmSuiteArgs, retcode=None, dims=None, extraRules=None, *args, **kwargs):
+        self, out, benchmarks, bmSuiteArgs, retcode=None, dims=None, extraRules=None):
         """Validate out against the parse rules and create data points.
 
         The dimensions from the `dims` dict are added to each datapoint parsed from the
@@ -780,7 +780,7 @@ def validateStdoutWithDimensions(
             extraRules = []
 
         def compiled(pat):
-            if type(pat) is str:
+            if isinstance(pat, str):
                 return re.compile(pat)
             return pat
 
@@ -829,7 +829,7 @@ def compiled(pat):
         return datapoints
 
     def validateReturnCode(self, retcode):
-        return retcode is 0
+        return retcode == 0
 
     def flakySuccessPatterns(self):
         """List of regex pattern that can override matched failure and success patterns.
@@ -935,7 +935,7 @@ def getExtraIterationCount(self, iterations):
     def addAverageAcrossLatestResults(self, results):
         # Postprocess results to compute the resulting time by taking the average of last N runs,
         # where N is 20% of the maximum number of iterations, at least 5 and at most 10.
-        benchmarkNames = set([r["benchmark"] for r in results])
+        benchmarkNames = {r["benchmark"] for r in results}
         for benchmark in benchmarkNames:
             warmupResults = [result for result in results if result["metric.name"] == "warmup" and result["benchmark"] == benchmark]
             if warmupResults:
@@ -944,7 +944,7 @@ def addAverageAcrossLatestResults(self, results):
                 warmupResultsToAverage = [result for result in warmupResults if result["metric.iteration"] >= lastIteration - resultIterations + 1]
 
-                if len(set([result["metric.iteration"] for result in warmupResults])) != len(warmupResults):
+                if len({result["metric.iteration"] for result in warmupResults}) != len(warmupResults):
                     mx.warn("Inconsistent number of iterations ! Duplicate iteration number found.")
                     mx.warn("Iteration results : {}".format(warmupResults))
 
@@ -1040,7 +1040,7 @@ def guest_vm_config_name(self, host_vm, vm):
         return vm.config_name() if host_vm else "default"
 
     def validateStdoutWithDimensions(
-        self, out, benchmarks, bmSuiteArgs, retcode=None, dims=None, extraRules=None, *args, **kwargs):
+        self, out, benchmarks, bmSuiteArgs, retcode=None, dims=None, extraRules=None):
         if extraRules is None:
             extraRules = []
 
@@ -1595,7 +1595,7 @@ def machineRam(self):
 
     def branch(self):
         mxsuite = mx.primary_suite()
-        name = mxsuite.vc and mxsuite.vc.active_branch(mxsuite.dir, abortOnError=False) or ""
+        name = mxsuite.vc.active_branch(mxsuite.dir, abortOnError=False) if mxsuite.vc else ''
         return name
 
     def buildUrl(self):
@@ -1702,7 +1702,7 @@ def getSuiteAndBenchNames(self, args, bmSuiteArgs):
         suite = _bm_suites.get(suitename)
         if not suite:
             mx.abort("Cannot find benchmark suite '{0}'. Available suites are: {1}".format(suitename, ' '.join(bm_suite_valid_keys())))
-        if benchspec is "*":
+        if benchspec == "*":
             return (suite, [[b] for b in suite.benchmarkList(bmSuiteArgs)])
         elif benchspec.startswith("*[") and benchspec.endswith("]"):
             all_benchmarks = suite.benchmarkList(bmSuiteArgs)
@@ -1712,7 +1712,7 @@ def getSuiteAndBenchNames(self, args, bmSuiteArgs):
                 plural = "" if len(difference) == 1 else "s"
                 mx.abort("The benchmark{0} {1} are not supported by the suite ".format(plural, ",".join(difference)))
             return (suite, [[b] for b in all_benchmarks if b in requested_benchmarks])
-        elif benchspec is "":
+        elif benchspec == "":
             return (suite, [None])
         else:
             benchspec = benchspec.split(",")
@@ -1733,7 +1733,7 @@ def applyScoreFunction(self, datapoint):
         if "metric.value" in datapoint:
             metric_value = datapoint["metric.value"]
             # Apply the score function to the metric value.
-            if function is "id":
+            if function == "id":
                 datapoint["metric.score-value"] = metric_value
             elif function.startswith("multiply(") and function.endswith(")"):
                 factor = function[len("multiply("):-1]
diff --git a/mx_benchplot.py b/mx_benchplot.py
index da731711..e1814ac4 100644
--- a/mx_benchplot.py
+++ b/mx_benchplot.py
@@ -345,7 +345,7 @@ def extract_results(files, names, last_n=None, selected_benchmarks=None):
                 benchmark = entry['benchmark']
                 if benchmark not in benchmarks:
                     benchmarks.append(benchmark)
-                if bench_suite == None:
+                if bench_suite is None:
                     bench_suite = entry['bench-suite']
                 else:
                     if bench_suite != entry['bench-suite']:
diff --git a/mx_gate.py b/mx_gate.py
index b0489272..ac00a7df 100644
--- a/mx_gate.py
+++ b/mx_gate.py
@@ -334,7 +334,7 @@ def gate(args):
 
     if args.partial:
         partialArgs = args.partial.split('/')
-        if len(partialArgs) is not 2:
+        if len(partialArgs) != 2:
             mx.abort('invalid partial argument specified')
 
         selected = int(partialArgs[0]) - 1
@@ -358,7 +358,7 @@ def gate(args):
             Task.filtersExclude = False
 
         mx.log('Running gate with partial tasks ' + args.partial + ". " + str(len(partialTasks)) + " out of " + str(len(nonBuildTasks)) + " non-build tasks selected.")
-        if len(partialTasks) is 0:
+        if len(partialTasks) == 0:
             mx.log('No partial tasks left to run. Finishing gate early.')
             return
 
@@ -369,13 +369,13 @@ def gate(args):
 
 def shell_quoted_args(args):
     args_string = ' '.join([pipes.quote(str(arg)) for arg in args])
-    if args_string is not '':
+    if args_string != '':
         args_string = ' ' + args_string
     return args_string
 
 def mx_command_entered(command, *args, **kwargs):
     global _command_level
-    if _command_level is 0:
+    if _command_level == 0:
         all_commands.append((command.command, args, kwargs))
         mx.log(mx.colorize('Running: ' + command_in_gate_message(command.command, args, kwargs), color='blue'))
     _command_level = _command_level + 1
diff --git a/tests/mx.mxtests/mx_mxtests.py b/tests/mx.mxtests/mx_mxtests.py
index a8df491e..f94044fe 100644
--- a/tests/mx.mxtests/mx_mxtests.py
+++ b/tests/mx.mxtests/mx_mxtests.py
@@ -160,4 +160,3 @@ def _command_info(args):
     "mxt-vc-locate" : [_vc_locate, '[options]'],
     'mxt-command-info' : [_command_info, '[options]'],
 })
-