Todo: integrate multiple platforms. Fix the resource contention caused by SaiNiu thread preemption. Commit local test-environment packaging, plus the release packaging script and the release-environment packaging .bat. Commit the Python32 environment package. Improve the handling of multiple generated log files and refine packaging log details.
@@ -0,0 +1 @@
#
@@ -0,0 +1 @@
#
@@ -0,0 +1,268 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2013-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------
"""
Viewer for PyInstaller-generated archives.
"""

import argparse
import os
import sys

import PyInstaller.log
from PyInstaller.archive.readers import CArchiveReader, ZlibArchiveReader

try:
    from argcomplete import autocomplete
except ImportError:

    def autocomplete(parser):
        return None


class ArchiveViewer:
    def __init__(self, filename, interactive_mode, recursive_mode, brief_mode):
        self.filename = filename
        self.interactive_mode = interactive_mode
        self.recursive_mode = recursive_mode
        self.brief_mode = brief_mode

        self.stack = []

        # Recursive mode implies non-interactive mode
        if self.recursive_mode:
            self.interactive_mode = False

    def main(self):
        # Open top-level (initial) archive
        archive = self._open_toplevel_archive(self.filename)
        archive_name = os.path.basename(self.filename)
        self.stack.append((archive_name, archive))

        # Non-interactive mode
        if not self.interactive_mode:
            return self._non_interactive_processing()

        # Interactive mode; show top-level archive
        self._show_archive_contents(archive_name, archive)

        # Interactive command processing
        while True:
            # Read command
            try:
                tokens = input('? ').split(None, 1)
            except EOFError:
                # Ctrl-D
                print(file=sys.stderr)  # Clear line.
                break

            # Print usage?
            if not tokens:
                self._print_usage()
                continue

            # Process
            command = tokens[0].upper()
            if command == 'Q':
                break
            elif command == 'U':
                self._move_up_the_stack()
            elif command == 'O':
                self._open_embedded_archive(*tokens[1:])
            elif command == 'X':
                self._extract_file(*tokens[1:])
            elif command == 'S':
                archive_name, archive = self.stack[-1]
                self._show_archive_contents(archive_name, archive)
            else:
                self._print_usage()

    def _non_interactive_processing(self):
        archive_count = 0

        while self.stack:
            archive_name, archive = self.stack.pop()
            archive_count += 1

            if archive_count > 1:
                print("")
            self._show_archive_contents(archive_name, archive)

            if not self.recursive_mode:
                continue

            # Scan for embedded archives
            if isinstance(archive, CArchiveReader):
                for name, (*_, typecode) in archive.toc.items():
                    if typecode == 'z':
                        try:
                            embedded_archive = archive.open_embedded_archive(name)
                        except Exception as e:
                            print(f"Could not open embedded archive {name!r}: {e}", file=sys.stderr)
                            continue
                        self.stack.append((name, embedded_archive))

    def _print_usage(self):
        print("U: go up one level", file=sys.stderr)
        print("O <name>: open embedded archive with given name", file=sys.stderr)
        print("X <name>: extract file with given name", file=sys.stderr)
        print("S: list the contents of current archive again", file=sys.stderr)
        print("Q: quit", file=sys.stderr)

    def _move_up_the_stack(self):
        if len(self.stack) > 1:
            self.stack.pop()
            archive_name, archive = self.stack[-1]
            self._show_archive_contents(archive_name, archive)
        else:
            print("Already in the top archive!", file=sys.stderr)

    def _open_toplevel_archive(self, filename):
        if not os.path.isfile(filename):
            print(f"Archive {filename} does not exist!", file=sys.stderr)
            sys.exit(1)

        if filename[-4:].lower() == '.pyz':
            return ZlibArchiveReader(filename)
        return CArchiveReader(filename)

    def _open_embedded_archive(self, archive_name=None):
        # Ask for name if not provided
        if not archive_name:
            archive_name = input('Open name? ')
        archive_name = archive_name.strip()

        # No name given; abort
        if not archive_name:
            return

        # Open the embedded archive
        _, parent_archive = self.stack[-1]

        if not hasattr(parent_archive, 'open_embedded_archive'):
            print("Archive does not support embedded archives!", file=sys.stderr)
            return

        try:
            archive = parent_archive.open_embedded_archive(archive_name)
        except Exception as e:
            print(f"Could not open embedded archive {archive_name!r}: {e}", file=sys.stderr)
            return

        # Add to stack and display contents
        self.stack.append((archive_name, archive))
        self._show_archive_contents(archive_name, archive)

    def _extract_file(self, name=None):
        # Ask for name if not provided
        if not name:
            name = input('Extract name? ')
        name = name.strip()

        # Archive
        archive_name, archive = self.stack[-1]

        # Retrieve data
        try:
            if isinstance(archive, CArchiveReader):
                data = archive.extract(name)
            elif isinstance(archive, ZlibArchiveReader):
                data = archive.extract(name, raw=True)
            else:
                raise NotImplementedError(f"Extraction from archive type {type(archive)} not implemented!")
        except Exception as e:
            print(f"Failed to extract data for entry {name!r} from {archive_name!r}: {e}", file=sys.stderr)
            return

        # Write to file
        filename = input('Output filename? ')
        if not filename:
            print(repr(data))
        else:
            with open(filename, 'wb') as fp:
                fp.write(data)

    def _show_archive_contents(self, archive_name, archive):
        if isinstance(archive, CArchiveReader):
            if archive.options:
                print(f"Options in {archive_name!r} (PKG/CArchive):")
                for option in archive.options:
                    print(f" {option}")
            print(f"Contents of {archive_name!r} (PKG/CArchive):")
            if self.brief_mode:
                for name in archive.toc.keys():
                    print(f" {name}")
            else:
                print(" position, length, uncompressed_length, is_compressed, typecode, name")
                for name, (position, length, uncompressed_length, is_compressed, typecode) in archive.toc.items():
                    print(f" {position}, {length}, {uncompressed_length}, {is_compressed}, {typecode!r}, {name!r}")
        elif isinstance(archive, ZlibArchiveReader):
            print(f"Contents of {archive_name!r} (PYZ):")
            if self.brief_mode:
                for name in archive.toc.keys():
                    print(f" {name}")
            else:
                print(" typecode, position, length, name")
                for name, (typecode, position, length) in archive.toc.items():
                    print(f" {typecode}, {position}, {length}, {name!r}")
        else:
            print(f"Contents of {archive_name!r} (unknown)")
            print(f"FIXME: implement content listing for archive type {type(archive)}!")


def run():
    parser = argparse.ArgumentParser()
    parser.add_argument(
        '-l',
        '--list',
        default=False,
        action='store_true',
        dest='listing_mode',
        help='List the archive contents and exit (default: %(default)s).',
    )
    parser.add_argument(
        '-r',
        '--recursive',
        default=False,
        action='store_true',
        dest='recursive',
        help='Recursively print an archive log (default: %(default)s). Implies --list.',
    )
    parser.add_argument(
        '-b',
        '--brief',
        default=False,
        action='store_true',
        dest='brief',
        help='When displaying archive contents, show only file names. (default: %(default)s).',
    )
    PyInstaller.log.__add_options(parser)
    parser.add_argument(
        'filename',
        metavar='pyi_archive',
        help="PyInstaller archive to process.",
    )

    autocomplete(parser)
    args = parser.parse_args()
    PyInstaller.log.__process_options(parser, args)

    try:
        viewer = ArchiveViewer(
            filename=args.filename,
            interactive_mode=not args.listing_mode,
            recursive_mode=args.recursive,
            brief_mode=args.brief,
        )
        viewer.main()
    except KeyboardInterrupt:
        raise SystemExit("Aborted by user.")


if __name__ == '__main__':
    run()
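For reference, the readers driven by this viewer can also be used programmatically; a minimal sketch (the 'dist/app' path is hypothetical)::

    from PyInstaller.archive.readers import CArchiveReader

    archive = CArchiveReader('dist/app')
    # Each TOC entry maps name -> (position, length, uncompressed_length, is_compressed, typecode).
    for name, (*_, typecode) in archive.toc.items():
        print(typecode, name)
        # Entries with typecode 'z' are embedded archives (e.g. the PYZ) and can be opened in
        # turn via archive.open_embedded_archive(name), as the viewer does above.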
@@ -0,0 +1,58 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2013-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------
"""
Show dll dependencies of executable files or other dynamic libraries.
"""

import argparse
import glob

import PyInstaller.depend.bindepend
import PyInstaller.log

try:
    from argcomplete import autocomplete
except ImportError:

    def autocomplete(parser):
        return None


def run():
    parser = argparse.ArgumentParser()
    PyInstaller.log.__add_options(parser)
    parser.add_argument(
        'filenames',
        nargs='+',
        metavar='executable-or-dynamic-library',
        help="executables or dynamic libraries for which the dependencies should be shown",
    )

    autocomplete(parser)
    args = parser.parse_args()
    PyInstaller.log.__process_options(parser, args)

    # Suppress all informative messages from the dependency code.
    PyInstaller.log.getLogger('PyInstaller.build.bindepend').setLevel(PyInstaller.log.WARN)

    try:
        for input_filename_or_pattern in args.filenames:
            for filename in glob.glob(input_filename_or_pattern):
                print(f"{filename}:")
                for lib_name, lib_path in sorted(PyInstaller.depend.bindepend.get_imports(filename)):
                    print(f" {lib_name} => {lib_path}")
                print("")
    except KeyboardInterrupt:
        raise SystemExit("Aborted by user request.")


if __name__ == '__main__':
    run()
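The underlying helper can be called directly as well; get_imports() yields the (lib_name, lib_path) pairs the loop above sorts and prints. A minimal sketch (the library path is hypothetical)::

    import PyInstaller.depend.bindepend

    # Print the direct binary dependencies of a single shared library.
    for lib_name, lib_path in sorted(PyInstaller.depend.bindepend.get_imports('/usr/lib/libfoo.so')):
        print(f'{lib_name} => {lib_path}')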
@@ -0,0 +1,59 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2013-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------

import argparse
import codecs

try:
    from argcomplete import autocomplete
except ImportError:

    def autocomplete(parser):
        return None


def run():
    parser = argparse.ArgumentParser(
        epilog=(
            'The printed output may be saved to a file, edited and used as the input for a version resource on any of '
            'the executable targets in a PyInstaller .spec file.'
        )
    )
    parser.add_argument(
        'exe_file',
        metavar='exe-file',
        help="full pathname of a Windows executable",
    )
    parser.add_argument(
        'out_filename',
        metavar='out-filename',
        nargs='?',
        default='file_version_info.txt',
        help="filename where the grabbed version info will be saved",
    )

    autocomplete(parser)
    args = parser.parse_args()

    try:
        from PyInstaller.utils.win32 import versioninfo
        info = versioninfo.read_version_info_from_executable(args.exe_file)
        if not info:
            raise SystemExit("ERROR: VersionInfo resource not found in exe")
        with codecs.open(args.out_filename, 'w', 'utf-8') as fp:
            fp.write(str(info))
        print(f"Version info written to: {args.out_filename!r}")
    except KeyboardInterrupt:
        raise SystemExit("Aborted by user request.")


if __name__ == '__main__':
    run()
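The same grab step works outside the CLI wrapper; a minimal sketch (Windows-only, with a hypothetical executable path)::

    from PyInstaller.utils.win32 import versioninfo

    info = versioninfo.read_version_info_from_executable(r'C:\apps\example.exe')
    if info:
        # str(info) renders the resource in the editable text format consumed by .spec files.
        with open('file_version_info.txt', 'w', encoding='utf-8') as fp:
            fp.write(str(info))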
@@ -0,0 +1,61 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2013-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------
"""
Automatically build a spec file containing the description of the project.
"""

import argparse
import os

import PyInstaller.building.makespec
import PyInstaller.log

try:
    from argcomplete import autocomplete
except ImportError:

    def autocomplete(parser):
        return None


def generate_parser():
    p = argparse.ArgumentParser()
    PyInstaller.building.makespec.__add_options(p)
    PyInstaller.log.__add_options(p)
    p.add_argument(
        'scriptname',
        nargs='+',
    )
    return p


def run():
    p = generate_parser()
    autocomplete(p)
    args = p.parse_args()
    PyInstaller.log.__process_options(p, args)

    # Split pathex by using the path separator.
    temppaths = args.pathex[:]
    args.pathex = []
    for p in temppaths:
        args.pathex.extend(p.split(os.pathsep))

    try:
        name = PyInstaller.building.makespec.main(args.scriptname, **vars(args))
        print('Wrote %s.' % name)
        print('Now run pyinstaller.py to build the executable.')
    except KeyboardInterrupt:
        raise SystemExit("Aborted by user request.")


if __name__ == '__main__':
    run()
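The pathex splitting above means each supplied search-path value may itself bundle several entries joined by the platform's path separator; a small illustration of the splitting logic (the paths are hypothetical)::

    import os

    pathex = ['/opt/libs' + os.pathsep + '/opt/more-libs', '/srv/extra']
    split = []
    for p in pathex:
        split.extend(p.split(os.pathsep))
    print(split)  # On POSIX: ['/opt/libs', '/opt/more-libs', '/srv/extra']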
@@ -0,0 +1,51 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2013-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------

import argparse
import os

try:
    from argcomplete import autocomplete
except ImportError:

    def autocomplete(parser):
        return None


def run():
    parser = argparse.ArgumentParser()
    parser.add_argument(
        'info_file',
        metavar='info-file',
        help="text file containing version info",
    )
    parser.add_argument(
        'exe_file',
        metavar='exe-file',
        help="full pathname of a Windows executable",
    )
    autocomplete(parser)
    args = parser.parse_args()

    info_file = os.path.abspath(args.info_file)
    exe_file = os.path.abspath(args.exe_file)

    try:
        from PyInstaller.utils.win32 import versioninfo
        info = versioninfo.load_version_info_from_text_file(info_file)
        versioninfo.write_version_info_to_executable(exe_file, info)
        print(f"Version info written to: {exe_file!r}")
    except KeyboardInterrupt:
        raise SystemExit("Aborted by user request.")


if __name__ == '__main__':
    run()
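Paired with the version-grabbing tool above, this gives a grab-edit-apply workflow; a minimal sketch (Windows-only, with hypothetical filenames)::

    from PyInstaller.utils.win32 import versioninfo

    # Load the (possibly hand-edited) text dump produced by the grab step...
    info = versioninfo.load_version_info_from_text_file('file_version_info.txt')
    # ...and stamp it onto another executable.
    versioninfo.write_version_info_to_executable('dist/app.exe', info)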
@@ -0,0 +1,573 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2005-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------

import contextlib
import copy
import glob
import logging
import os
import re
import shutil
import subprocess
import sys

# Set a handler for the root logger to inhibit 'basicConfig()' (called in PyInstaller.log) from setting up a stream
# handler that writes to stderr. This avoids log messages being written (and captured) twice: once on stderr and
# once by pytest's caplog.
logging.getLogger().addHandler(logging.NullHandler())

# psutil is used for process tree clean-up on time-out when running the test frozen application. If unavailable
# (for example, on cygwin), we fall back to trying to terminate only the main application process.
try:
    import psutil  # noqa: E402
except ModuleNotFoundError:
    psutil = None

import pytest  # noqa: E402

from PyInstaller import __main__ as pyi_main  # noqa: E402
from PyInstaller import configure  # noqa: E402
from PyInstaller.compat import is_cygwin, is_darwin, is_win  # noqa: E402
from PyInstaller.depend.analysis import initialize_modgraph  # noqa: E402
from PyInstaller.archive.readers import pkg_archive_contents  # noqa: E402
from PyInstaller.utils.tests import gen_sourcefile  # noqa: E402
from PyInstaller.utils.win32 import winutils  # noqa: E402

# Timeout for running the executable. If the executable does not exit in this time, it is interpreted as a test
# failure.
_EXE_TIMEOUT = 3 * 60  # In sec.
# All currently supported platforms
SUPPORTED_OSES = {"darwin", "linux", "win32"}
# Have the pyi_builder fixture clean up the temporary directories of successful tests. Controlled by an environment
# variable.
_PYI_BUILDER_CLEANUP = os.environ.get("PYI_BUILDER_CLEANUP", "1") == "1"

# Fixtures
# --------


def pytest_runtest_setup(item):
    """
    Markers to skip tests based on the current platform.
    https://pytest.org/en/stable/example/markers.html#marking-platform-specific-tests-with-pytest

    Available markers: see setup.cfg [tool:pytest] markers
      - @pytest.mark.darwin (macOS)
      - @pytest.mark.linux (GNU/Linux)
      - @pytest.mark.win32 (Windows)
    """
    supported_platforms = SUPPORTED_OSES.intersection(mark.name for mark in item.iter_markers())
    plat = sys.platform
    if supported_platforms and plat not in supported_platforms:
        pytest.skip(f"does not run on {plat}")
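For example, a test that should run only on Windows is marked like so; a minimal sketch (the test body and its use of the `pyi_builder` fixture defined below are hypothetical)::

    @pytest.mark.win32
    def test_windows_registry(pyi_builder):
        # pytest_runtest_setup() above skips this test on any platform other than win32.
        pyi_builder.test_source("import winreg; print('ok')")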
@pytest.hookimpl(tryfirst=True, hookwrapper=True)
def pytest_runtest_makereport(item, call):
    # Execute all other hooks to obtain the report object.
    outcome = yield
    rep = outcome.get_result()

    # Set a report attribute for each phase of a call, which can be "setup", "call", "teardown".
    setattr(item, f"rep_{rep.when}", rep)


# Return the base directory which contains the current test module.
def _get_base_dir(request):
    return request.path.resolve().parent  # pathlib.Path instance


# Directory with Python scripts for functional tests.
def _get_script_dir(request):
    return _get_base_dir(request) / 'scripts'


# Directory with testing modules used in some tests.
def _get_modules_dir(request):
    return _get_base_dir(request) / 'modules'


# Directory with .toc log files.
def _get_logs_dir(request):
    return _get_base_dir(request) / 'logs'


# Return the directory where data for tests is located.
def _get_data_dir(request):
    return _get_base_dir(request) / 'data'


# Directory with .spec files used in some tests.
def _get_spec_dir(request):
    return _get_base_dir(request) / 'specs'


@pytest.fixture
def spec_dir(request):
    """
    Return the directory where the test spec-files reside.
    """
    return _get_spec_dir(request)


@pytest.fixture
def script_dir(request):
    """
    Return the directory where the test scripts reside.
    """
    return _get_script_dir(request)


# A fixture that copies the test's data directory into the test's temporary directory. The data directory is assumed
# to be `data/{test-name}`, found next to the .py file that contains the test.
@pytest.fixture
def data_dir(
    # The request object for this test. Used to infer the name of the test and the location of the source .py file.
    # See
    # https://pytest.org/latest/builtin.html#_pytest.python.FixtureRequest
    # and
    # https://pytest.org/latest/fixture.html#fixtures-can-introspect-the-requesting-test-context.
    request,
    # The tmp_path object for this test. See: https://pytest.org/latest/tmp_path.html.
    tmp_path
):
    # Strip the leading 'test_' from the test's name.
    test_name = request.function.__name__[5:]

    # Copy to data dir and return the path.
    source_data_dir = _get_data_dir(request) / test_name
    tmp_data_dir = tmp_path / 'data'
    # Copy the data.
    shutil.copytree(source_data_dir, tmp_data_dir)
    # Return the temporary data directory, so that the copied data can now be used.
    return tmp_data_dir


class AppBuilder:
    def __init__(self, tmp_path, request, bundle_mode):
        self._tmp_path = tmp_path
        self._request = request
        self._mode = bundle_mode
        self._spec_dir = tmp_path
        self._dist_dir = tmp_path / 'dist'
        self._build_dir = tmp_path / 'build'
        self._is_spec = False

    def test_spec(self, specfile, *args, **kwargs):
        """
        Test a Python script that is referenced in the supplied .spec file.
        """
        __tracebackhide__ = True
        specfile = _get_spec_dir(self._request) / specfile
        # 'test_script' should handle .spec properly as script.
        self._is_spec = True
        return self.test_script(specfile, *args, **kwargs)

    def test_source(self, source, *args, **kwargs):
        """
        Test a Python script given as source code.

        The source will be written into a file named like the test function. This file will then be passed to
        `test_script`. If you need another related file, e.g., a `.toc` file for testing the content, put it at the
        normal place. Just mind to take the basename from the test function's name.

        :param script: Source code to create executable from. This will be saved into a temporary file which is then
                       passed on to `test_script`.

        :param test_id: Test-id for parametrized tests. If given, it will be appended to the script filename,
                        separated by two underscores.

        All other arguments are passed straight on to `test_script`.
        """
        __tracebackhide__ = True
        # For parametrized tests, append the test-id.
        scriptfile = gen_sourcefile(self._tmp_path, source, kwargs.setdefault('test_id'))
        del kwargs['test_id']
        return self.test_script(scriptfile, *args, **kwargs)

    def _display_message(self, step_name, message):
        # Print the given message to both stdout and stderr, and prefix it with APP-BUILDER to make it clear where it
        # originates from.
        print(f'[APP-BUILDER:{step_name}] {message}', file=sys.stdout)
        print(f'[APP-BUILDER:{step_name}] {message}', file=sys.stderr)

    def test_script(
        self, script, pyi_args=None, app_name=None, app_args=None, runtime=None, run_from_path=False, **kwargs
    ):
        """
        Main method to wrap all phases of testing a Python script.

        :param script: Name of script to create executable from.
        :param pyi_args: Additional arguments to pass to PyInstaller when creating executable.
        :param app_name: Name of the executable. This is equivalent to argument --name=APPNAME.
        :param app_args: Additional arguments to pass to the created executable.
        :param runtime: Time in seconds how long to keep executable running.
        :param toc_log: List of modules that are expected to be bundled with the executable.
        """
        __tracebackhide__ = True

        # Skip interactive tests (the ones with `runtime` set) if `psutil` is unavailable, as we need it to properly
        # clean up the process tree.
        if runtime and psutil is None:
            pytest.skip('Interactive tests require psutil for proper cleanup.')

        if pyi_args is None:
            pyi_args = []
        if app_args is None:
            app_args = []

        if app_name:
            if not self._is_spec:
                pyi_args.extend(['--name', app_name])
        else:
            # Derive name from script name.
            app_name = os.path.splitext(os.path.basename(script))[0]

        # Relative path means that a script from _script_dir is referenced.
        if not os.path.isabs(script):
            script = _get_script_dir(self._request) / script
        self.script = str(script)  # might be a pathlib.Path at this point!
        assert os.path.exists(self.script), f'Script {self.script!r} not found.'

        self._display_message('TEST-SCRIPT', 'Starting build...')
        if not self._test_building(args=pyi_args):
            pytest.fail(f'Building of {script} failed.')

        self._display_message('TEST-SCRIPT', 'Build finished, now running executable...')
        self._test_executables(app_name, args=app_args, runtime=runtime, run_from_path=run_from_path, **kwargs)
        self._display_message('TEST-SCRIPT', 'Running executable finished.')

    def _test_executables(self, name, args, runtime, run_from_path, **kwargs):
        """
        Run created executable to make sure it works.

        Multipackage-tests generate more than one exe-file and all of them have to be run.

        :param args: CLI options to pass to the created executable.
        :param runtime: Time in seconds how long to keep the executable running.

        :return: Exit code of the executable.
        """
        __tracebackhide__ = True
        exes = self._find_executables(name)
        # Empty list means that PyInstaller probably failed to create any executable.
        assert exes != [], 'No executable file was found.'
        for exe in exes:
            # Try to find the .toc log file. The .toc log file has the same basename as the exe file.
            toc_log = os.path.splitext(os.path.basename(exe))[0] + '.toc'
            toc_log = _get_logs_dir(self._request) / toc_log
            if toc_log.exists():
                if not self._examine_executable(exe, toc_log):
                    pytest.fail(f'Matching .toc of {exe} failed.')
            retcode = self._run_executable(exe, args, run_from_path, runtime)
            if retcode != kwargs.get('retcode', 0):
                pytest.fail(f'Running exe {exe} failed with return-code {retcode}.')

    def _find_executables(self, name):
        """
        Search for all executables generated by the testcase.

        If the test-case is called e.g. 'test_multipackage1', this is searching for each of 'test_multipackage1.exe'
        and 'multipackage1_?.exe' in both one-file- and one-dir-mode.

        :param name: Name of the executable to look for.

        :return: List of executables
        """
        exes = []
        onedir_pt = str(self._dist_dir / name / name)
        onefile_pt = str(self._dist_dir / name)
        patterns = [
            onedir_pt,
            onefile_pt,
            # Multipackage one-dir
            onedir_pt + '_?',
            # Multipackage one-file
            onefile_pt + '_?',
        ]
        # For Windows, append .exe extension to patterns.
        if is_win:
            patterns = [pt + '.exe' for pt in patterns]
        # For macOS, append pattern for .app bundles.
        if is_darwin:
            # e.g: ./dist/name.app/Contents/MacOS/name
            app_bundle_pt = str(self._dist_dir / f'{name}.app' / 'Contents' / 'MacOS' / name)
            patterns.append(app_bundle_pt)
        # Apply file patterns.
        for pattern in patterns:
            for prog in glob.glob(pattern):
                if os.path.isfile(prog):
                    exes.append(prog)
        return exes

    def _run_executable(self, prog, args, run_from_path, runtime):
        """
        Run executable created by PyInstaller.

        :param args: CLI options to pass to the created executable.
        """
        # Run the tests in a clean environment to make sure they are really self-contained.
        prog_env = copy.deepcopy(os.environ)
        prog_env['PATH'] = ''
        del prog_env['PATH']
        # For Windows, we need to keep a minimal PATH for some tests to run successfully.
        if is_win:
            # Minimum Windows PATH is in most cases: C:\Windows\system32;C:\Windows
            prog_env['PATH'] = os.pathsep.join(winutils.get_system_path())
        # Same for Cygwin - if /usr/bin is not in PATH, cygwin1.dll cannot be discovered.
        if is_cygwin:
            prog_env['PATH'] = os.pathsep.join(['/usr/local/bin', '/usr/bin'])
        # On macOS, we similarly set up a minimal PATH with system directories, in case utilities from there are used
        # by the tested python code (for example, matplotlib >= 3.9.0 uses `system_profiler`, which is found in
        # /usr/sbin).
        if is_darwin:
            # The following paths are registered when application is launched via Finder, and are a subset of what is
            # typically available in the shell.
            prog_env['PATH'] = os.pathsep.join(['/usr/bin', '/bin', '/usr/sbin', '/sbin'])

        exe_path = prog
        if run_from_path:
            # Run executable in the temp directory. Add the directory containing the executable to $PATH. Basically,
            # pretend we are a shell executing the program from $PATH.
            prog_cwd = str(self._tmp_path)
            prog_name = os.path.basename(prog)
            prog_env['PATH'] = os.pathsep.join([prog_env.get('PATH', ''), os.path.dirname(prog)])

        else:
            # Run executable in the directory where it is.
            prog_cwd = os.path.dirname(prog)
            # The executable will be called with argv[0] as relative, not absolute, path.
            prog_name = os.path.join(os.curdir, os.path.basename(prog))

        args = [prog_name] + args
        # Using sys.stdout/sys.stderr for subprocess fixes printing messages in Windows command prompt. Py.test is
        # then able to collect stdout/stderr messages and display them if a test fails.
        return self._run_executable_(args, exe_path, prog_env, prog_cwd, runtime)

    def _run_executable_(self, args, exe_path, prog_env, prog_cwd, runtime):
        # Use psutil.Popen, if available; otherwise, fall back to subprocess.Popen
        popen_implementation = subprocess.Popen if psutil is None else psutil.Popen

        # Run the executable
        self._display_message('RUN-EXE', f'Running {exe_path!r}, args: {args!r}')
        process = popen_implementation(args, executable=exe_path, env=prog_env, cwd=prog_cwd)

        # Wait for the process to finish. If no run-time (= timeout) is specified, we expect the process to exit on
        # its own, and use the global _EXE_TIMEOUT. If a run-time is specified, we expect the application to be
        # running for at least the specified amount of time, which is useful in "interactive" test applications that
        # are not expected to exit on their own.
        stdout = stderr = None
        try:
            timeout = runtime if runtime else _EXE_TIMEOUT
            stdout, stderr = process.communicate(timeout=timeout)
            retcode = process.returncode
            self._display_message('RUN-EXE', f'Process exited on its own with return code {retcode}.')
        except (subprocess.TimeoutExpired) if psutil is None else (psutil.TimeoutExpired, subprocess.TimeoutExpired):
            if runtime:
                # When 'runtime' is set, the expired timeout is a good sign that the executable was running
                # successfully for the specified time.
                self._display_message('RUN-EXE', f'Process reached expected run-time of {runtime} seconds.')
                retcode = 0
            else:
                # Executable is still running and it is not interactive. Clean up the process tree, and fail the test.
                self._display_message('RUN-EXE', f'Timeout while running executable (timeout: {timeout} seconds)!')
                retcode = 1

            if psutil is None:
                # We are using subprocess.Popen(). Without psutil, we have no access to the process tree; this poses a
                # problem for onefile builds, where we would need to first kill the child (main application) process,
                # and let the onefile parent perform its cleanup. As a best-effort approach, we can first call
                # process.terminate(); on POSIX systems, this sends SIGTERM to the parent process, and in most
                # situations, the bootloader will forward it to the child process. Then wait 5 seconds, and call
                # process.kill() if necessary. On Windows, however, both process.terminate() and process.kill() do
                # the same. Therefore, we should avoid running "interactive" tests with expected run-time if we do
                # not have psutil available.
                try:
                    self._display_message('RUN-EXE', 'Stopping the process using Popen.terminate()...')
                    process.terminate()
                    stdout, stderr = process.communicate(timeout=5)
                    self._display_message('RUN-EXE', 'Process stopped.')
                except subprocess.TimeoutExpired:
                    # Kill the process.
                    try:
                        self._display_message('RUN-EXE', 'Stopping the process using Popen.kill()...')
                        process.kill()
                        # process.communicate() waits for end-of-file, which may never arrive if there is a child
                        # process still alive. Nothing we can really do about it here, so add a short timeout and
                        # display a warning.
                        stdout, stderr = process.communicate(timeout=1)
                        self._display_message('RUN-EXE', 'Process stopped.')
                    except subprocess.TimeoutExpired:
                        self._display_message('RUN-EXE', 'Failed to stop the process (or its child process(es))!')
            else:
                # We are using psutil.Popen(). First, force-kill all child processes; in onefile mode, this includes
                # the application process, whose termination should trigger cleanup and exit of the parent onefile
                # process.
                self._display_message('RUN-EXE', 'Stopping child processes...')
                for child_process in list(process.children(recursive=True)):
                    with contextlib.suppress(psutil.NoSuchProcess):
                        self._display_message('RUN-EXE', f'Stopping child process {child_process.pid}...')
                        child_process.kill()

                # Give the main process 5 seconds to exit on its own (to accommodate cleanup in onefile mode).
                try:
                    self._display_message('RUN-EXE', f'Waiting for main process ({process.pid}) to stop...')
                    stdout, stderr = process.communicate(timeout=5)
                    self._display_message('RUN-EXE', 'Process stopped on its own.')
                except (psutil.TimeoutExpired, subprocess.TimeoutExpired):
                    # End of the line - kill the main process.
                    self._display_message('RUN-EXE', 'Stopping the process using Popen.kill()...')
                    with contextlib.suppress(psutil.NoSuchProcess):
                        process.kill()
                    # Try to retrieve stdout/stderr - but keep a short timeout, just in case...
                    try:
                        stdout, stderr = process.communicate(timeout=1)
                        self._display_message('RUN-EXE', 'Process stopped.')
                    except (psutil.TimeoutExpired, subprocess.TimeoutExpired):
                        self._display_message('RUN-EXE', 'Failed to stop the process (or its child process(es))!')

        self._display_message('RUN-EXE', f'Done! Return code: {retcode}')

        return retcode

    def _test_building(self, args):
        """
        Run building of test script.

        :param args: additional CLI options for PyInstaller.

        Return True if build succeeded, False otherwise.
        """
        if self._is_spec:
            default_args = [
                '--distpath', str(self._dist_dir),
                '--workpath', str(self._build_dir),
                '--log-level', 'INFO',
            ]  # yapf: disable
        else:
            default_args = [
                '--debug=bootloader',
                '--noupx',
                '--specpath', str(self._spec_dir),
                '--distpath', str(self._dist_dir),
                '--workpath', str(self._build_dir),
                '--path', str(_get_modules_dir(self._request)),
                '--log-level', 'INFO',
            ]  # yapf: disable

        # Choose bundle mode.
        if self._mode == 'onedir':
            default_args.append('--onedir')
        elif self._mode == 'onefile':
            default_args.append('--onefile')
        # If self._mode is None, then just the spec file was supplied.

        pyi_args = [self.script, *default_args, *args]
        # TODO: fix return code in running PyInstaller programmatically.
        PYI_CONFIG = configure.get_config()
        # Override CACHEDIR for PyInstaller; relocate the cache into `self._tmp_path`.
        PYI_CONFIG['cachedir'] = str(self._tmp_path)

        pyi_main.run(pyi_args, PYI_CONFIG)
        retcode = 0

        return retcode == 0

    def _examine_executable(self, exe, toc_log):
        """
        Compare log files (now used mostly by multipackage tests).

        :return: True if .toc files match
        """
        self._display_message('EXAMINE-EXE', f'Matching against TOC log: {str(toc_log)!r}')
        fname_list = pkg_archive_contents(exe)
        with open(toc_log, 'r', encoding='utf-8') as f:
            pattern_list = eval(f.read())
        # Alphabetical order of patterns.
        pattern_list.sort()
        missing = []
        for pattern in pattern_list:
            for fname in fname_list:
                if re.match(pattern, fname):
                    self._display_message('EXAMINE-EXE', f'Entry found: {pattern!r} --> {fname!r}')
                    break
            else:
                # No matching entry found
                missing.append(pattern)
                self._display_message('EXAMINE-EXE', f'Entry MISSING: {pattern!r}')

        # We expect the missing list to be empty
        return not missing
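Because the log is loaded with eval() and each entry is applied with re.match(), a .toc log file is simply a Python list literal of regular-expression patterns, one per expected archive entry. A hypothetical logs/test_example.toc might contain::

    [
        r'struct',
        r'myscript\.pyc?',
    ]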


# Scope 'session' should keep the object unchanged for the whole test run. This fixture caches the basic module graph
# dependencies that are the same for every executable.
@pytest.fixture(scope='session')
def pyi_modgraph():
    # Explicitly set the log level, since the plugin `pytest-catchlog` (un-)sets the root logger's level to NOTSET
    # for the setup phase, which would lead to TRACE messages being written out.
    import PyInstaller.log as logging
    logging.logger.setLevel(logging.DEBUG)
    initialize_modgraph()


# By default, run each test as both onedir and onefile.
@pytest.fixture(params=['onedir', 'onefile'])
def pyi_builder(tmp_path, monkeypatch, request, pyi_modgraph):
    # Save/restore environment variable PATH.
    monkeypatch.setenv('PATH', os.environ['PATH'])
    # PyInstaller or a test case might manipulate 'sys.path'. Reset it for every test.
    monkeypatch.syspath_prepend(None)
    # Set current working directory to the temporary directory.
    monkeypatch.chdir(tmp_path)
    # Clean up configuration and force PyInstaller to do a clean configuration for another app/test. The value is the
    # same as the original value.
    monkeypatch.setattr('PyInstaller.config.CONF', {'pathex': []})

    yield AppBuilder(tmp_path, request, request.param)

    # Clean up the temporary directory of a successful test
    if _PYI_BUILDER_CLEANUP and request.node.rep_setup.passed and request.node.rep_call.passed:
        if tmp_path.exists():
            shutil.rmtree(tmp_path, ignore_errors=True)


# Fixture for .spec based tests. With a .spec file, it does not make sense to differentiate onefile/onedir mode.
@pytest.fixture
def pyi_builder_spec(tmp_path, request, monkeypatch, pyi_modgraph):
    # Save/restore environment variable PATH.
    monkeypatch.setenv('PATH', os.environ['PATH'])
    # Set current working directory to the temporary directory.
    monkeypatch.chdir(tmp_path)
    # PyInstaller or a test case might manipulate 'sys.path'. Reset it for every test.
    monkeypatch.syspath_prepend(None)
    # Clean up configuration and force PyInstaller to do a clean configuration for another app/test. The value is the
    # same as the original value.
    monkeypatch.setattr('PyInstaller.config.CONF', {'pathex': []})

    yield AppBuilder(tmp_path, request, None)

    # Clean up the temporary directory of a successful test
    if _PYI_BUILDER_CLEANUP and request.node.rep_setup.passed and request.node.rep_call.passed:
        if tmp_path.exists():
            shutil.rmtree(tmp_path, ignore_errors=True)


@pytest.fixture
def pyi_windowed_builder(pyi_builder: AppBuilder):
    """A pyi_builder equivalent for testing --windowed applications."""

    # psutil.Popen() somehow bypasses an application's windowed/console mode, so that an application built in
    # --windowed mode but invoked with psutil still receives valid std{in,out,err} handles and behaves exactly like
    # a console application. In short, testing windowed mode with psutil is a null test. We must instead use
    # subprocess.
    def _run_executable_(args, exe_path, prog_env, prog_cwd, runtime):
        return subprocess.run([exe_path, *args], env=prog_env, cwd=prog_cwd, timeout=runtime).returncode

    pyi_builder._run_executable_ = _run_executable_
    yield pyi_builder
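Taken together, the fixtures are consumed by test modules along these lines; a minimal sketch (the test names and the .spec filename are hypothetical)::

    def test_hello_world(pyi_builder):
        # Parametrized to run twice: once in onedir mode, once in onefile mode.
        pyi_builder.test_source("print('hello world')")

    def test_from_spec(pyi_builder_spec):
        # Builds from a .spec file resolved against the specs/ directory.
        pyi_builder_spec.test_spec('example.spec')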
File diff suppressed because it is too large
@@ -0,0 +1,401 @@
|
||||
#-----------------------------------------------------------------------------
|
||||
# Copyright (c) 2005-2023, PyInstaller Development Team.
|
||||
#
|
||||
# Distributed under the terms of the GNU General Public License (version 2
|
||||
# or later) with exception for distributing the bootloader.
|
||||
#
|
||||
# The full license is in the file COPYING.txt, distributed with this software.
|
||||
#
|
||||
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
|
||||
#-----------------------------------------------------------------------------
|
||||
|
||||
# language=rst
|
||||
"""
|
||||
Additional helper methods for working specifically with Anaconda distributions are found at
|
||||
:mod:`PyInstaller.utils.hooks.conda_support<PyInstaller.utils.hooks.conda>`
|
||||
which is designed to mimic (albeit loosely) the `importlib.metadata`_ package. These functions find and parse the
|
||||
distribution metadata from json files located in the ``conda-meta`` directory.
|
||||
|
||||
.. versionadded:: 4.2.0
|
||||
|
||||
This module is available only if run inside a Conda environment. Usage of this module should therefore be wrapped in
|
||||
a conditional clause::
|
||||
|
||||
from PyInstaller.compat import is_pure_conda
|
||||
|
||||
if is_pure_conda:
|
||||
from PyInstaller.utils.hooks import conda_support
|
||||
|
||||
# Code goes here. e.g.
|
||||
binaries = conda_support.collect_dynamic_libs("numpy")
|
||||
...
|
||||
|
||||
Packages are all referenced by the *distribution name* you use to install it, rather than the *package name* you import
|
||||
it with. I.e., use ``distribution("pillow")`` instead of ``distribution("PIL")`` or use ``package_distribution("PIL")``.
|
||||
"""
|
||||
from __future__ import annotations
|
||||
|
||||
import fnmatch
|
||||
import json
|
||||
import pathlib
|
||||
import sys
|
||||
from typing import Iterable, List
|
||||
from importlib.metadata import PackagePath as _PackagePath
|
||||
|
||||
from PyInstaller import compat
|
||||
from PyInstaller.log import logger
|
||||
|
||||
# Conda virtual environments each get their own copy of `conda-meta` so the use of `sys.prefix` instead of
|
||||
# `sys.base_prefix`, `sys.real_prefix` or anything from our `compat` module is intentional.
|
||||
CONDA_ROOT = pathlib.Path(sys.prefix)
|
||||
CONDA_META_DIR = CONDA_ROOT / "conda-meta"
|
||||
|
||||
# Find all paths in `sys.path` that are inside Conda root.
|
||||
PYTHONPATH_PREFIXES = []
|
||||
for _path in sys.path:
|
||||
_path = pathlib.Path(_path)
|
||||
try:
|
||||
PYTHONPATH_PREFIXES.append(_path.relative_to(sys.prefix))
|
||||
except ValueError:
|
||||
pass
|
||||
|
||||
PYTHONPATH_PREFIXES.sort(key=lambda p: len(p.parts), reverse=True)
|
||||
|
||||
|
||||
class Distribution:
|
||||
"""
|
||||
A bucket class representation of a Conda distribution.
|
||||
|
||||
This bucket exports the following attributes:
|
||||
|
||||
:ivar name: The distribution's name.
|
||||
:ivar version: Its version.
|
||||
:ivar files: All filenames as :meth:`PackagePath`\\ s included with this distribution.
|
||||
:ivar dependencies: Names of other distributions that this distribution depends on (with version constraints
|
||||
removed).
|
||||
:ivar packages: Names of importable packages included in this distribution.
|
||||
|
||||
This class is not intended to be constructed directly by users. Rather use :meth:`distribution` or
|
||||
:meth:`package_distribution` to provide one for you.
|
||||
"""
|
||||
def __init__(self, json_path: str):
|
||||
try:
|
||||
self._json_path = pathlib.Path(json_path)
|
||||
assert self._json_path.exists()
|
||||
except (TypeError, AssertionError):
|
||||
raise TypeError(
|
||||
"Distribution requires a path to a conda-meta json. Perhaps you want "
|
||||
"`distribution({})` instead?".format(repr(json_path))
|
||||
)
|
||||
|
||||
# Everything we need (including this distribution's name) is kept in the metadata json.
|
||||
self.raw: dict = json.loads(self._json_path.read_text())
|
||||
|
||||
# Unpack the more useful contents of the json.
|
||||
self.name: str = self.raw["name"]
|
||||
self.version: str = self.raw["version"]
|
||||
self.files = [PackagePath(i) for i in self.raw["files"]]
|
||||
self.dependencies = self._init_dependencies()
|
||||
self.packages = self._init_package_names()
|
||||
|
||||
def __repr__(self):
|
||||
return "{}(name=\"{}\", packages={})".format(type(self).__name__, self.name, self.packages)
|
||||
|
||||
def _init_dependencies(self):
|
||||
"""
|
||||
Read dependencies from ``self.raw["depends"]``.
|
||||
|
||||
:return: Dependent distribution names.
|
||||
:rtype: list
|
||||
|
||||
The names in ``self.raw["depends"]`` come with extra version constraint information which must be stripped.
|
||||
"""
|
||||
dependencies = []
|
||||
# For each dependency:
|
||||
for dependency in self.raw["depends"]:
|
||||
# ``dependency`` is a string of the form: "[name] [version constraints]"
|
||||
name, *version_constraints = dependency.split(maxsplit=1)
|
||||
dependencies.append(name)
|
||||
return dependencies
|
||||
|
||||
def _init_package_names(self):
|
||||
"""
|
||||
Search ``self.files`` for package names shipped by this distribution.
|
||||
|
||||
:return: Package names.
|
||||
:rtype: list
|
||||
|
||||
These are names you would ``import`` rather than names you would install.
|
||||
"""
|
||||
packages = []
|
||||
for file in self.files:
|
||||
package = _get_package_name(file)
|
||||
if package is not None:
|
||||
packages.append(package)
|
||||
return packages
|
||||
|
||||
@classmethod
|
||||
def from_name(cls, name: str):
|
||||
"""
|
||||
Get distribution information for a given distribution **name** (i.e., something you would ``conda install``).
|
||||
|
||||
:rtype: :class:`Distribution`
|
||||
"""
|
||||
if name in distributions:
|
||||
return distributions[name]
|
||||
raise ModuleNotFoundError(
|
||||
"Distribution {} is either not installed or was not installed using Conda.".format(name)
|
||||
)
|
||||
|
||||
@classmethod
|
||||
def from_package_name(cls, name: str):
|
||||
"""
|
||||
Get distribution information for a **package** (i.e., something you would import).
|
||||
|
||||
:rtype: :class:`Distribution`
|
||||
|
||||
For example, the package ``pkg_resources`` belongs to the distribution ``setuptools``, which contains three
|
||||
packages.
|
||||
|
||||
>>> package_distribution("pkg_resources")
|
||||
Distribution(name="setuptools",
|
||||
packages=['easy_install', 'pkg_resources', 'setuptools'])
|
||||
"""
|
||||
if name in distributions_by_package:
|
||||
return distributions_by_package[name]
|
||||
raise ModuleNotFoundError("Package {} is either not installed or was not installed using Conda.".format(name))
|
||||
|
||||
|
||||
distribution = Distribution.from_name
|
||||
package_distribution = Distribution.from_package_name
|
||||
|
||||
|
||||
class PackagePath(_PackagePath):
|
||||
"""
|
||||
A filename relative to Conda's root (``sys.prefix``).
|
||||
|
||||
This class inherits from :class:`pathlib.PurePosixPath` even on non-Posix OSs. To convert to a :class:`pathlib.Path`
|
||||
pointing to the real file, use the :meth:`locate` method.
|
||||
"""
|
||||
def locate(self):
|
||||
"""
|
||||
Return a path-like object for this path pointing to the file's true location.
|
||||
"""
|
||||
return pathlib.Path(sys.prefix) / self
|
||||
|
||||
|
||||
def walk_dependency_tree(initial: str, excludes: Iterable[str] | None = None):
|
||||
"""
|
||||
Collect a :class:`Distribution` and all direct and indirect dependencies of that distribution.
|
||||
|
||||
Arguments:
|
||||
initial:
|
||||
Distribution name to collect from.
|
||||
excludes:
|
||||
Distributions to exclude.
|
||||
Returns:
|
||||
A ``{name: distribution}`` mapping where ``distribution`` is the output of
|
||||
:func:`conda_support.distribution(name) <distribution>`.
|
||||
"""
|
||||
if excludes is not None:
|
||||
excludes = set(excludes)
|
||||
|
||||
# Rather than use true recursion, mimic it with a to-do queue.
|
||||
from collections import deque
|
||||
done = {}
|
||||
names_to_do = deque([initial])
|
||||
|
||||
while names_to_do:
|
||||
# Grab a distribution name from the to-do list.
|
||||
name = names_to_do.pop()
|
||||
try:
|
||||
# Collect and save it's metadata.
|
||||
done[name] = distribution = Distribution.from_name(name)
|
||||
logger.debug("Collected Conda distribution '%s', a dependency of '%s'.", name, initial)
|
||||
except ModuleNotFoundError:
|
||||
logger.warning(
|
||||
"Conda distribution '%s', dependency of '%s', was not found. "
|
||||
"If you installed this distribution with pip then you may ignore this warning.", name, initial
|
||||
)
|
||||
continue
|
||||
# For each dependency:
|
||||
for _name in distribution.dependencies:
|
||||
if _name in done:
|
||||
# Skip anything already done.
|
||||
continue
|
||||
if _name == name:
|
||||
# Avoid infinite recursion if a distribution depends on itself. This will probably never happen but I
|
||||
# certainly would not chance it.
|
||||
continue
|
||||
if excludes is not None and _name in excludes:
|
||||
# Do not recurse to excluded dependencies.
|
||||
continue
|
||||
names_to_do.append(_name)
|
||||
return done
|
||||
|
||||
|
||||
def _iter_distributions(name, dependencies, excludes):
|
||||
if dependencies:
|
||||
return walk_dependency_tree(name, excludes).values()
|
||||
else:
|
||||
return [Distribution.from_name(name)]
|
||||
|
||||
|
||||
def requires(name: str, strip_versions: bool = False) -> List[str]:
|
||||
"""
|
||||
List requirements of a distribution.
|
||||
|
||||
Arguments:
|
||||
name:
|
||||
The name of the distribution.
|
||||
strip_versions:
|
||||
List only their names, not their version constraints.
|
||||
Returns:
|
||||
A list of distribution names.
|
||||
"""
|
||||
if strip_versions:
|
||||
return distribution(name).dependencies
|
||||
return distribution(name).raw["depends"]
|
||||
|
||||
|
||||
def files(name: str, dependencies: bool = False, excludes: list | None = None) -> List[PackagePath]:
|
||||
"""
|
||||
List all files belonging to a distribution.
|
||||
|
||||
Arguments:
|
||||
name:
|
||||
The name of the distribution.
|
||||
dependencies:
|
||||
Recursively collect files of dependencies too.
|
||||
excludes:
|
||||
Distributions to ignore if **dependencies** is true.
|
||||
Returns:
|
||||
All filenames belonging to the given distribution.
|
||||
|
||||
With ``dependencies=False``, this is just a shortcut for::
|
||||
|
||||
conda_support.distribution(name).files
|
||||
"""
|
||||
return [file for dist in _iter_distributions(name, dependencies, excludes) for file in dist.files]
|
||||
|
||||
|
||||
if compat.is_win:
|
||||
lib_dir = pathlib.PurePath("Library", "bin")
|
||||
else:
|
||||
lib_dir = pathlib.PurePath("lib")
|
||||
|
||||
|
||||
def collect_dynamic_libs(name: str, dest: str = ".", dependencies: bool = True, excludes: Iterable[str] | None = None):
|
||||
"""
|
||||
Collect DLLs for distribution **name**.
|
||||
|
||||
Arguments:
|
||||
name:
|
||||
The distribution's project-name.
|
||||
dest:
|
||||
Target destination, defaults to ``'.'``.
|
||||
dependencies:
|
||||
Recursively collect libs for dependent distributions (recommended).
|
||||
excludes:
|
||||
Dependent distributions to skip, defaults to ``None``.
|
||||
Returns:
|
||||
List of DLLs in PyInstaller's ``(source, dest)`` format.
|
||||
|
||||
This collects libraries only from Conda's shared ``lib`` (Unix) or ``Library/bin`` (Windows) folders. To collect
|
||||
from inside a distribution's installation use the regular :func:`PyInstaller.utils.hooks.collect_dynamic_libs`.
|
||||
"""
|
||||
DLL_SUFFIXES = ("*.dll", "*.dylib", "*.so", "*.so.*")
|
||||
_files = []
|
||||
for file in files(name, dependencies, excludes):
|
||||
# A file is classified as a dynamic library if:
|
||||
# 1) it lives inside the dedicated ``lib_dir`` DLL folder.
|
||||
#
|
||||
# NOTE: `file` is an instance of `PackagePath`, which inherits from `pathlib.PurePosixPath` even on Windows.
|
||||
# Therefore, it does not properly handle cases when metadata paths contain Windows-style separator, which does
|
||||
# seem to be used on some Windows installations (see #9113). Therefore, cast `file` to `pathlib.PurePath`
|
||||
# before comparing its parent to `lib_dir` (which should also be a `pathlib.PurePath`).
|
||||
if pathlib.PurePath(file).parent != lib_dir:
|
||||
continue
|
||||
# 2) it is a file (and not a directory or a symbolic link pointing to a directory)
|
||||
resolved_file = file.locate()
|
||||
if not resolved_file.is_file():
|
||||
continue
|
||||
# 3) has a correct suffix
|
||||
if not any([resolved_file.match(suffix) for suffix in DLL_SUFFIXES]):
|
||||
continue
|
||||
|
||||
_files.append((str(resolved_file), dest))
|
||||
return _files


# --- Map packages to distributions and vice-versa ---


def _get_package_name(file: PackagePath):
    """
    Determine the package name of a Python file in :data:`sys.path`.

    Arguments:
        file:
            A Python filename relative to Conda root (sys.prefix).

    Returns:
        Package name or None.

    This function only considers single file packages e.g. ``foo.py`` or top level ``foo/__init__.py``\\ s.
    Anything else is ignored (returning ``None``).
    """
    file = pathlib.Path(file)
    # TODO: Handle PEP 420 namespace packages (which are missing the `__init__` module). No such Conda PEP 420
    # namespace packages are known.

    # Get top-level folders by finding parents of `__init__.xyz`s
    if file.stem == "__init__" and file.suffix in compat.ALL_SUFFIXES:
        file = file.parent
    elif file.suffix not in compat.ALL_SUFFIXES:
        # Keep single-file packages but skip DLLs, data and junk files.
        return

    # Check if this file/folder's parent is in ``sys.path``, i.e., it is directly importable. This intentionally
    # excludes submodules, which would cause confusion because ``sys.prefix`` is in ``sys.path``, meaning that every
    # file in a Conda installation is a submodule.
    for prefix in PYTHONPATH_PREFIXES:
        if len(file.parts) != len(prefix.parts) + 1:
            # This check is redundant but speeds it up quite a bit.
            continue
        # There are no wildcards involved here. ``fnmatch`` is used simply to get case-insensitive string matching
        # on case-insensitive file systems.
        if fnmatch.fnmatch(str(file.parent), str(prefix)):
            return file.stem


# All the information we want is organised the wrong way.

# We want to look up distributions based on package names, but we can only search for packages using distribution
# names. And we would like to search for a distribution's json file but, due to the noisy filenames of the jsons, we
# can only find a json's distribution rather than a distribution's json.

# So we have to read everything, then regroup distributions in the ways we want them grouped. This will likely be a
# spectacular bottleneck on full-blown Conda (as opposed to Miniconda), which ships 250+ packages by default at
# several GiBs. I suppose we could cache this on a per-json basis if it gets too much.


def _init_distributions():
    distributions = {}
    for path in CONDA_META_DIR.glob("*.json"):
        dist = Distribution(path)
        distributions[dist.name] = dist
    return distributions


distributions = _init_distributions()


def _init_packages():
    distributions_by_package = {}
    for distribution in distributions.values():
        for package in distribution.packages:
            distributions_by_package[package] = distribution
    return distributions_by_package


distributions_by_package = _init_packages()
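
# Illustrative lookup sketch (assumption; not part of the original file): once both mappings are built, the
# distribution that provides an importable package can be resolved directly, e.g.:
#
#     dist = distributions_by_package.get("numpy")  # Distribution instance, or None if not Conda-managed
#     if dist is not None:
#         print(dist.name, len(dist.files))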
@@ -0,0 +1,152 @@
# ----------------------------------------------------------------------------
# Copyright (c) 2005-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
# ----------------------------------------------------------------------------
import os

from PyInstaller import isolated


@isolated.decorate
def django_dottedstring_imports(django_root_dir):
    """
    An isolated helper that returns a list of all Django dependencies, parsed from the `mysite.settings` module.

    NOTE: With newer versions of Django, this is most likely the part of PyInstaller that will break.

    Tested with Django 2.2.
    """
    import sys
    import os

    import PyInstaller.utils.misc
    from PyInstaller.utils import hooks as hookutils

    # Extra search paths to add to sys.path:
    # - parent directory of the django_root_dir
    # - django_root_dir itself; often, Django users do not specify absolute imports in the settings module.
    search_paths = [
        PyInstaller.utils.misc.get_path_to_toplevel_modules(django_root_dir),
        django_root_dir,
    ]
    sys.path += search_paths

    # Set the path to the project's settings module
    default_settings_module = os.path.basename(django_root_dir) + '.settings'
    settings_module = os.environ.get('DJANGO_SETTINGS_MODULE', default_settings_module)
    os.environ['DJANGO_SETTINGS_MODULE'] = settings_module

    # Calling django.setup() avoids the AppRegistryNotReady() exception and also reads the user settings
    # from DJANGO_SETTINGS_MODULE.
    # https://stackoverflow.com/questions/24793351/django-appregistrynotready
    import django  # noqa: E402

    django.setup()

    # This allows access to all django settings even from the settings.py module.
    from django.conf import settings  # noqa: E402

    hiddenimports = list(settings.INSTALLED_APPS)

    # Do not fail the script when settings does not have such attributes.
    if hasattr(settings, 'TEMPLATE_CONTEXT_PROCESSORS'):
        hiddenimports += list(settings.TEMPLATE_CONTEXT_PROCESSORS)

    if hasattr(settings, 'TEMPLATE_LOADERS'):
        hiddenimports += list(settings.TEMPLATE_LOADERS)

    hiddenimports += [settings.ROOT_URLCONF]

    def _remove_class(class_name):
        return '.'.join(class_name.split('.')[0:-1])

    # -- Changes in Django 1.7.

    # Remove class names and keep just modules.
    if hasattr(settings, 'AUTHENTICATION_BACKENDS'):
        for cl in settings.AUTHENTICATION_BACKENDS:
            cl = _remove_class(cl)
            hiddenimports.append(cl)
    # Deprecated since 4.2, may be None until it is removed.
    cl = getattr(settings, 'DEFAULT_FILE_STORAGE', None)
    if cl:
        hiddenimports.append(_remove_class(cl))
    if hasattr(settings, 'FILE_UPLOAD_HANDLERS'):
        for cl in settings.FILE_UPLOAD_HANDLERS:
            cl = _remove_class(cl)
            hiddenimports.append(cl)
    if hasattr(settings, 'MIDDLEWARE_CLASSES'):
        for cl in settings.MIDDLEWARE_CLASSES:
            cl = _remove_class(cl)
            hiddenimports.append(cl)
    # TEMPLATES is a list of dicts:
    if hasattr(settings, 'TEMPLATES'):
        for templ in settings.TEMPLATES:
            backend = _remove_class(templ['BACKEND'])
            hiddenimports.append(backend)
            # Include context_processors. Note that each template entry is a dict, so membership tests (rather than
            # hasattr) must be used.
            if 'OPTIONS' in templ:
                if 'context_processors' in templ['OPTIONS']:
                    # Context processors are functions - strip the last component.
                    mods = templ['OPTIONS']['context_processors']
                    mods = [_remove_class(x) for x in mods]
                    hiddenimports += mods
    # Include database backends - DATABASES is a dict.
    for v in settings.DATABASES.values():
        hiddenimports.append(v['ENGINE'])

    # Add templatetags and context processors for each installed app.
    for app in settings.INSTALLED_APPS:
        app_templatetag_module = app + '.templatetags'
        app_ctx_proc_module = app + '.context_processors'
        hiddenimports.append(app_templatetag_module)
        hiddenimports += hookutils.collect_submodules(app_templatetag_module)
        hiddenimports.append(app_ctx_proc_module)

    # Deduplicate imports.
    hiddenimports = list(set(hiddenimports))

    # Return the hidden imports.
    return hiddenimports


def django_find_root_dir():
    """
    Return the path to the directory (top-level Python package) that contains the main Django files. Return None if
    no such directory was detected.

    The main Django project directory contains files like '__init__.py', 'settings.py' and 'urls.py'.

    In Django 1.4+, the script 'manage.py' is not in the directory with 'settings.py', but usually one level up. We
    need to detect this special case too.
    """
    # 'PyInstaller.config' cannot be imported like other top-level modules.
    from PyInstaller.config import CONF

    # Get the directory with manage.py. manage.py is supplied to PyInstaller as the first main executable script.
    manage_py = CONF['main_script']
    manage_dir = os.path.dirname(os.path.abspath(manage_py))

    # Get the Django root directory: the directory that contains settings.py and urls.py. It could be the directory
    # containing manage.py or any of its subdirectories.
    settings_dir = None
    files = set(os.listdir(manage_dir))
    if ('settings.py' in files or 'settings' in files) and 'urls.py' in files:
        settings_dir = manage_dir
    else:
        for f in files:
            if os.path.isdir(os.path.join(manage_dir, f)):
                subfiles = os.listdir(os.path.join(manage_dir, f))
                # Subdirectory contains the critical files.
                if ('settings.py' in subfiles or 'settings' in subfiles) and 'urls.py' in subfiles:
                    settings_dir = os.path.join(manage_dir, f)
                    break  # Use the first matching directory.

    return settings_dir
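
# Minimal usage sketch (assumption; not part of the original file): PyInstaller's Django hook would combine these two
# helpers roughly as follows:
#
#     root_dir = django_find_root_dir()
#     if root_dir:
#         hiddenimports = django_dottedstring_imports(root_dir)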
@@ -0,0 +1,457 @@
# ----------------------------------------------------------------------------
# Copyright (c) 2005-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
# ----------------------------------------------------------------------------
import os
import pathlib
import shutil
import subprocess
import hashlib
import re

from PyInstaller.depend.utils import _resolveCtypesImports
from PyInstaller.utils.hooks import collect_submodules, collect_system_data_files, get_hook_config
from PyInstaller import isolated
from PyInstaller import log as logging
from PyInstaller import compat
from PyInstaller.depend.bindepend import findSystemLibrary

logger = logging.getLogger(__name__)


class GiModuleInfo:
    def __init__(self, module, version, hook_api=None):
        self.name = module
        self.version = version
        self.available = False
        self.sharedlibs = []
        self.typelib = None
        self.dependencies = []

        # If the hook API is available, use it to override the version from the hook config.
        if hook_api is not None:
            module_versions = get_hook_config(hook_api, 'gi', 'module-versions')
            if module_versions:
                self.version = module_versions.get(module, version)

        logger.debug("Gathering GI module info for %s %s", module, self.version)

        @isolated.decorate
        def _get_module_info(module, version):
            import gi

            # Ideally, we would use gi.Repository, which provides a common abstraction for some of the functions we
            # use in this codepath (e.g., `require`, `get_typelib_path`, `get_immediate_dependencies`). However, it
            # lacks the `get_shared_library` function, which is why we are using "full" bindings via
            # `gi.repository.GIRepository`.
            #
            # PyGObject 3.52.0 switched from girepository-1.0 to girepository-2.0, which means that the GIRepository
            # version has changed from 2.0 to 3.0 and some of the API has changed.
            try:
                gi.require_version("GIRepository", "3.0")
                new_api = True
            except ValueError:
                gi.require_version("GIRepository", "2.0")
                new_api = False

            from gi.repository import GIRepository

            # The old API had a `get_default` method to obtain the global singleton object; it was removed in the new
            # API, which requires creation of separate GIRepository instances.
            if new_api:
                repo = GIRepository.Repository()
                try:
                    repo.require(module, version, GIRepository.RepositoryLoadFlags.LAZY)
                except ValueError:
                    return None  # Module not available

                # The new API returns the list of shared libraries.
                sharedlibs = repo.get_shared_libraries(module)
            else:
                repo = GIRepository.Repository.get_default()
                try:
                    repo.require(module, version, GIRepository.RepositoryLoadFlags.IREPOSITORY_LOAD_FLAG_LAZY)
                except ValueError:
                    return None  # Module not available

                # Shared library/libraries
                # Comma-separated list of paths to shared libraries, or None if none are associated. Convert to list.
                sharedlibs = repo.get_shared_library(module)
                sharedlibs = [lib.strip() for lib in sharedlibs.split(",")] if sharedlibs else []

            # Path to the .typelib file
            typelib = repo.get_typelib_path(module)

            # Dependencies
            # GIRepository.Repository.get_immediate_dependencies is available from gobject-introspection v1.44 on.
            if hasattr(repo, 'get_immediate_dependencies'):
                dependencies = repo.get_immediate_dependencies(module)
            else:
                dependencies = repo.get_dependencies(module)

            return {
                'sharedlibs': sharedlibs,
                'typelib': typelib,
                'dependencies': dependencies,
            }

        # Try to query information; if this fails, mark the module as unavailable.
        try:
            info = _get_module_info(module, self.version)
            if info is None:
                logger.debug("GI module info %s %s not found.", module, self.version)
            else:
                logger.debug("GI module info %s %s found.", module, self.version)
                self.sharedlibs = info['sharedlibs']
                self.typelib = info['typelib']
                self.dependencies = info['dependencies']
                self.available = True
        except Exception as e:
            logger.warning("Failed to query GI module %s %s: %s", module, self.version, e)

    def get_libdir(self):
        """
        Return the path to the shared library used by the module. If no libraries are associated with the typelib,
        None is returned. If multiple library names are associated with the typelib, the path to the first resolved
        shared library is returned. Raises an exception if the module is unavailable or none of the shared libraries
        could be resolved.
        """
        # Module unavailable
        if not self.available:
            raise ValueError(f"Module {self.name} {self.version} is unavailable!")
        # Module has no associated shared libraries
        if not self.sharedlibs:
            return None
        for lib in self.sharedlibs:
            path = findSystemLibrary(lib)
            if path:
                return os.path.normpath(os.path.dirname(path))
        raise ValueError(f"Could not resolve any shared library of {self.name} {self.version}: {self.sharedlibs}!")

    def collect_typelib_data(self):
        """
        Return a tuple of (binaries, datas, hiddenimports) to be used by PyGObject-related hooks.
        """
        datas = []
        binaries = []
        hiddenimports = []

        logger.debug("Collecting module data for %s %s", self.name, self.version)

        # Module unavailable
        if not self.available:
            raise ValueError(f"Module {self.name} {self.version} is unavailable!")

        # Find shared libraries
        resolved_libs = _resolveCtypesImports(self.sharedlibs)
        for resolved_lib in resolved_libs:
            logger.debug("Collecting shared library %s at %s", resolved_lib[0], resolved_lib[1])
            binaries.append((resolved_lib[1], "."))

        # Find and collect the .typelib file. Run it through `gir_library_path_fix` to fix the library path, if
        # necessary.
        typelib_entry = gir_library_path_fix(self.typelib)
        if typelib_entry:
            logger.debug('Collecting gir typelib at %s', typelib_entry[0])
            datas.append(typelib_entry)

        # Overrides for the module
        hiddenimports += collect_submodules('gi.overrides', lambda name: name.endswith('.' + self.name))

        # Module dependencies
        for dep in self.dependencies:
            dep_module, _ = dep.rsplit('-', 1)
            hiddenimports += [f'gi.repository.{dep_module}']

        return binaries, datas, hiddenimports


# The old function, provided for backwards compatibility in 3rd-party hooks.
def get_gi_libdir(module, version):
    module_info = GiModuleInfo(module, version)
    return module_info.get_libdir()


# The old function, provided for backwards compatibility in 3rd-party hooks.
def get_gi_typelibs(module, version):
    """
    Return a tuple of (binaries, datas, hiddenimports) to be used by PyGObject-related hooks. Searches for and adds
    dependencies recursively.

    :param module: GI module name, as passed to 'gi.require_version()'
    :param version: GI module version, as passed to 'gi.require_version()'
    """
    module_info = GiModuleInfo(module, version)
    return module_info.collect_typelib_data()
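
# Minimal usage sketch (assumption; not part of the original file): a GI-module hook would typically do
#
#     binaries, datas, hiddenimports = get_gi_typelibs('GLib', '2.0')
#
# or, via the newer class-based API:
#
#     module_info = GiModuleInfo('GLib', '2.0')
#     if module_info.available:
#         binaries, datas, hiddenimports = module_info.collect_typelib_data()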


def gir_library_path_fix(path):
    import subprocess

    # 'PyInstaller.config' cannot be imported like other top-level modules.
    from PyInstaller.config import CONF

    path = os.path.abspath(path)

    # On macOS we need to recompile the GIR files to reference the loader path,
    # but this is not necessary on other platforms.
    if compat.is_darwin:

        # If using a virtualenv, the base prefix and the path of the typelib
        # have really nothing to do with each other, so try to detect that.
        common_path = os.path.commonprefix([compat.base_prefix, path])
        if common_path == '/':
            logger.debug("virtualenv detected? fixing the gir path...")
            common_path = os.path.abspath(os.path.join(path, '..', '..', '..'))

        gir_path = os.path.join(common_path, 'share', 'gir-1.0')

        typelib_name = os.path.basename(path)
        gir_name = os.path.splitext(typelib_name)[0] + '.gir'

        gir_file = os.path.join(gir_path, gir_name)

        if not os.path.exists(gir_path):
            logger.error(
                "Unable to find gir directory: %s.\nTry installing your platform's gobject-introspection package.",
                gir_path
            )
            return None
        if not os.path.exists(gir_file):
            logger.error(
                "Unable to find gir file: %s.\nTry installing your platform's gobject-introspection package.", gir_file
            )
            return None

        with open(gir_file, 'r', encoding='utf-8') as f:
            lines = f.readlines()
        # GIR files are `XML encoded <https://developer.gnome.org/gi/stable/gi-gir-reference.html>`_,
        # which means they are by definition encoded using UTF-8.
        with open(os.path.join(CONF['workpath'], gir_name), 'w', encoding='utf-8') as f:
            for line in lines:
                if 'shared-library' in line:
                    split = re.split('(=)', line)
                    files = re.split('(["|,])', split[2])
                    for count, item in enumerate(files):
                        if 'lib' in item:
                            files[count] = '@loader_path/' + os.path.basename(item)
                    line = ''.join(split[0:2]) + ''.join(files)
                f.write(line)

        # g-ir-compiler expects a file, so we cannot just pipe the fixed file to it.
        command = subprocess.Popen((
            'g-ir-compiler', os.path.join(CONF['workpath'], gir_name),
            '-o', os.path.join(CONF['workpath'], typelib_name)
        ))  # yapf: disable
        command.wait()

        return os.path.join(CONF['workpath'], typelib_name), 'gi_typelibs'
    else:
        return path, 'gi_typelibs'


@isolated.decorate
def get_glib_system_data_dirs():
    import gi
    gi.require_version('GLib', '2.0')
    from gi.repository import GLib
    return GLib.get_system_data_dirs()


def get_glib_sysconf_dirs():
    """
    Try to return the sysconf directories (e.g., /etc).
    """
    if compat.is_win:
        # On Windows, if you look at gtkwin32.c, sysconfdir is actually relative to the location of the GTK DLL.
        # Since that is what we are actually interested in (not the user path), we have to do it the hard way...
        return [os.path.join(get_gi_libdir('GLib', '2.0'), 'etc')]

    @isolated.call
    def data_dirs():
        import gi
        gi.require_version('GLib', '2.0')
        from gi.repository import GLib
        return GLib.get_system_config_dirs()

    return data_dirs


def collect_glib_share_files(*path):
    """
    Path is relative to the system data directory (e.g., /usr/share).
    """
    glib_data_dirs = get_glib_system_data_dirs()
    if glib_data_dirs is None:
        return []

    destdir = os.path.join('share', *path)

    # TODO: will this return too much?
    collected = []
    for data_dir in glib_data_dirs:
        p = os.path.join(data_dir, *path)
        collected += collect_system_data_files(p, destdir=destdir, include_py_files=False)

    return collected


def collect_glib_etc_files(*path):
    """
    Path is relative to the system config directory (e.g., /etc).
    """
    glib_config_dirs = get_glib_sysconf_dirs()
    if glib_config_dirs is None:
        return []

    destdir = os.path.join('etc', *path)

    # TODO: will this return too much?
    collected = []
    for config_dir in glib_config_dirs:
        p = os.path.join(config_dir, *path)
        collected += collect_system_data_files(p, destdir=destdir, include_py_files=False)

    return collected


_glib_translations = None


def collect_glib_translations(prog, lang_list=None):
    """
    Return a list of translations in the system locale directory whose names equal prog.mo.
    """
    global _glib_translations
    if _glib_translations is None:
        if lang_list is not None:
            trans = []
            for lang in lang_list:
                trans += collect_glib_share_files(os.path.join("locale", lang))
            _glib_translations = trans
        else:
            _glib_translations = collect_glib_share_files('locale')

    names = [os.sep + prog + '.mo', os.sep + prog + '.po']
    namelen = len(names[0])

    return [(src, dst) for src, dst in _glib_translations if src[-namelen:] in names]
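
# Minimal usage sketch (assumption; not part of the original file): collecting the translation catalogues for a
# program named, say, `gtk30`, restricted to two languages, would look like
#
#     datas = collect_glib_translations('gtk30', lang_list=['en', 'de'])
#
# which returns `(source, dest)` tuples for the matching `.mo`/`.po` files under the locale directories.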


# Not a hook utility function per se (used by the main Analysis class), but kept here to have all GLib/GObject
# functions in one place...
def compile_glib_schema_files(datas_toc, workdir, collect_source_files=False):
    """
    Compile collected GLib schema files. Extracts the list of GLib schema files from the given input datas TOC, copies
    them to a temporary working directory, and compiles them. The resulting `gschemas.compiled` file is added to the
    output TOC, replacing any existing entry with that name. If the `collect_source_files` flag is set, the source XML
    schema files are also (re)added to the output TOC; by default, they are not. This function is a no-op (returns the
    original TOC) if no GLib schemas are found in the TOC or if the `glib-compile-schemas` executable is not found in
    `PATH`.
    """
    SCHEMA_DEST_DIR = pathlib.PurePath("share/glib-2.0/schemas")
    workdir = pathlib.Path(workdir)

    schema_files = []
    output_toc = []
    for toc_entry in datas_toc:
        dest_name, src_name, typecode = toc_entry
        dest_name = pathlib.PurePath(dest_name)
        src_name = pathlib.PurePath(src_name)

        # Pass-through for non-schema files, identified based on the destination directory.
        if dest_name.parent != SCHEMA_DEST_DIR:
            output_toc.append(toc_entry)
            continue

        # It seems the schemas directory contains different files with different suffixes:
        # - .gschema.xml
        # - .schema.override
        # - .enums.xml
        # To avoid omitting anything, simply collect everything into the temporary directory.
        # Exemptions are gschema.dtd (which should be unnecessary) and gschemas.compiled (which we will generate
        # ourselves in this function).
        if src_name.name in {"gschema.dtd", "gschemas.compiled"}:
            continue

        schema_files.append(src_name)

    # If there are no schema files available, simply return the input datas TOC.
    if not schema_files:
        return datas_toc

    # Ensure that the `glib-compile-schemas` executable is in PATH, just in case...
    schema_compiler_exe = shutil.which('glib-compile-schemas')
    if not schema_compiler_exe:
        logger.warning("GLib schema compiler (glib-compile-schemas) not found! Skipping GLib schema recompilation...")
        return datas_toc

    # If a `gschemas.compiled` file already exists in the temporary working directory, record its modification time
    # and hash. This allows us to restore the modification time on the newly-compiled copy, if the latter turns out
    # to be identical to the existing old one. Just in case the file becomes subject to a timestamp-based caching
    # mechanism.
    compiled_file = workdir / "gschemas.compiled"
    old_compiled_file_hash = None
    old_compiled_file_stat = None

    if compiled_file.is_file():
        # Record creation/modification time
        old_compiled_file_stat = compiled_file.stat()
        # Compute SHA1 hash; since compiled schema files are relatively small, do it in a single step.
        old_compiled_file_hash = hashlib.sha1(compiled_file.read_bytes()).digest()

    # Ensure that the temporary working directory exists, and is empty.
    if workdir.exists():
        shutil.rmtree(workdir)
    workdir.mkdir(exist_ok=True)

    # Copy schema (source) files to the temporary working directory
    for schema_file in schema_files:
        shutil.copy(schema_file, workdir)

    # Compile. glib-compile-schemas might produce warnings on its own (e.g., schemas using deprecated paths, or
    # overrides for non-existent keys). Since these are non-actionable, capture and display them only as a DEBUG
    # message, or as a WARNING one if the command fails.
    logger.info("Compiling collected GLib schema files in %r...", str(workdir))
    try:
        cmd_args = [schema_compiler_exe, str(workdir), '--targetdir', str(workdir)]
        p = subprocess.run(
            cmd_args,
            stdin=subprocess.DEVNULL,
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT,
            check=True,
            errors='ignore',
            encoding='utf-8',
        )
        logger.debug("Output from glib-compile-schemas:\n%s", p.stdout)
    except subprocess.CalledProcessError as e:
        # The glib-compile-schemas call returned an error. Display its stdout/stderr, and return the original datas
        # TOC to minimize damage.
        logger.warning("Failed to recompile GLib schemas! Returning collected files as-is!", exc_info=True)
        logger.warning("Output from glib-compile-schemas:\n%s", e.stdout)
        return datas_toc
    except Exception:
        # Compilation failed for whatever reason. Return the original datas TOC to minimize damage.
        logger.warning("Failed to recompile GLib schemas! Returning collected files as-is!", exc_info=True)
        return datas_toc

    # Compute the checksum of the new compiled file; if it matches the old checksum, restore the modification time.
    if old_compiled_file_hash is not None:
        new_compiled_file_hash = hashlib.sha1(compiled_file.read_bytes()).digest()
        if new_compiled_file_hash == old_compiled_file_hash:
            os.utime(compiled_file, ns=(old_compiled_file_stat.st_atime_ns, old_compiled_file_stat.st_mtime_ns))

    # Add the resulting gschemas.compiled file to the output TOC
    output_toc.append((str(SCHEMA_DEST_DIR / compiled_file.name), str(compiled_file), "DATA"))

    # Include source schema files in the output TOC (optional)
    if collect_source_files:
        for schema_file in schema_files:
            output_toc.append((str(SCHEMA_DEST_DIR / schema_file.name), str(schema_file), "DATA"))

    return output_toc
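
# Minimal usage sketch (assumption; not part of the original file): given a datas TOC whose entries are
# `(dest_name, src_name, typecode)` tuples, recompilation is a single call, e.g.:
#
#     datas_toc = [
#         ("share/glib-2.0/schemas/org.example.gschema.xml",  # hypothetical schema
#          "/usr/share/glib-2.0/schemas/org.example.gschema.xml", "DATA"),
#     ]
#     datas_toc = compile_glib_schema_files(datas_toc, workdir="build/_glib_schemas")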
File diff suppressed because it is too large
@@ -0,0 +1,450 @@
# ----------------------------------------------------------------------------
# Copyright (c) 2022-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
# ----------------------------------------------------------------------------

# Qt modules information - the core of our Qt collection approach
# ----------------------------------------------------------------
#
# The python bindings for Qt (``PySide2``, ``PyQt5``, ``PySide6``, ``PyQt6``) consist of several python binary
# extension modules that provide bindings for corresponding Qt modules. For example, the ``PySide2.QtNetwork`` python
# extension module provides bindings for the ``QtNetwork`` Qt module from the ``qt/qtbase`` Qt repository.
#
# A Qt module can be considered as consisting of:
# * a shared library (for example, on Linux, the shared library names for the ``QtNetwork`` Qt module in Qt5 and Qt6
#   are ``libQt5Network.so`` and ``libQt6Network.so``, respectively).
# * plugins: a certain type (or class) of plugins is usually associated with a single Qt module (for example,
#   ``imageformats`` plugins are associated with the ``QtGui`` Qt module from the ``qt/qtbase`` Qt repository), but
#   additional plugins of that type may come from other Qt repositories. For example, the ``imageformats/qsvg`` plugin
#   is provided by ``qtsvg/src/plugins/imageformats/svg`` from the ``qt/qtsvg`` repository, and ``imageformats/qpdf``
#   is provided by ``qtwebengine/src/pdf/plugins/imageformats/pdf`` from the ``qt/qtwebengine`` repository.
# * translation files: names of translation files consist of a base name, which typically corresponds to the Qt
#   repository name, and a language code. A single translation file usually covers all Qt modules contained within
#   the same repository. For example, translation files with the base name ``qtbase`` contain translations for
#   ``QtCore``, ``QtGui``, ``QtWidgets``, ``QtNetwork``, and other Qt modules from the ``qt/qtbase`` Qt repository.
#
# PyInstaller's built-in analysis of link-time dependencies ensures that when collecting a Qt python extension
# module, we automatically pick up the linked Qt shared libraries. However, collection of linked Qt shared libraries
# does not result in collection of plugins or translation files. In addition, the dependency of a Qt python extension
# module on other Qt python extension modules (i.e., at the bindings level) cannot be automatically determined due to
# PyInstaller's inability to scan imports in binary extensions.
#
# PyInstaller < 5.7 solved this problem using a dictionary that associated a Qt shared library name with the python
# extension name, plugins, and translation files. For each hooked Qt python extension module, the hook calls a helper
# that analyzes the extension file for link-time dependencies, and matches those against the dictionary. Therefore,
# based on linked shared libraries, we could recursively infer the list of files to collect in addition to the shared
# libraries themselves:
# - plugins and translation files belonging to Qt modules whose shared libraries we collect
# - Qt python extension modules corresponding to the Qt modules that we collect
#
# The above approach ensures that even if the analyzed python script contains only ``from PySide2 import QtWidgets``,
# we would also collect ``PySide2.QtGui`` and ``PySide2.QtCore``, as well as all corresponding Qt module files
# (the shared libraries, plugins, translation files). For this to work, a hook must be provided for
# ``PySide2.QtWidgets`` that performs the recursive analysis of the extension module file; and to ensure that each
# Qt python extension module by itself triggers collection of all its dependencies, we need to hook all Qt python
# extension modules provided by the specific python Qt bindings package.
#
# The above approach with a single dictionary, however, has several limitations:
# - it cannot provide an association for a Qt python module that binds a Qt module without a shared library (i.e., a
#   headers-only module, or a statically-built module). In such cases, potential plugins and translations should be
#   associated directly with the Qt python extension file instead of the Qt module's (non-existent) shared library.
# - it cannot (directly) handle differences between Qt5 and Qt6; we had to build a second dictionary.
# - it cannot handle differences between the bindings themselves; for example, PyQt5 binds some Qt modules that
#   PySide2 does not bind. Or, the binding's Qt python extension module is named differently in PyQt and PySide
#   bindings (or just differently in PyQt5, while PySide2, PySide6, and PyQt6 use the same name).
#
# In order to address the above shortcomings, we now store all information in a list of structures, each of which
# contains information for a particular Qt python extension and/or Qt module (shared library):
# - python extension name (if applicable)
# - Qt module name base (if applicable)
# - plugins
# - translation files' base name
# - applicable Qt version (if necessary)
# - applicable Qt bindings (if necessary)
#
# This list is used to dynamically construct two dictionaries (based on the bindings name and Qt version):
# - one mapping python extension names to the associated module information
# - one mapping Qt shared library names to the associated module information
# This allows us to associate plugins and translations with either the Qt python extension or with the Qt module's
# shared library (or both), whichever is applicable.
#
# The `qt_dynamic_dependencies_dict`_ from the original approach was constructed using several information sources,
# as documented `here
# <https://github.com/pyinstaller/pyinstaller/blob/fbf7948be85177dd44b41217e9f039e1d176de6b/PyInstaller/utils/hooks/qt.py#L266-L362>`_.
#
# In the current approach, the relations stored in the `QT_MODULES_INFO`_ list were determined directly, by inspecting
# the Qt source code. This requires some prior knowledge of how the Qt code is organized (repositories and individual
# Qt modules within them), as well as some searching based on guesswork. The procedure can be outlined as follows:
# * check out the `main Qt repository <git://code.qt.io/qt/qt5.git>`_. This repository contains references to all
#   other Qt repositories in the form of git submodules.
# * for Qt5:
#   * check out the latest release tag, e.g., v5.15.2, then check out the submodules.
#   * search the Qt modules' qmake .pro files; for example, ``qtbase/src/network/network.pro`` for the QtNetwork
#     module. The plugin types associated with the module are listed in the ``MODULE_PLUGIN_TYPES`` variable (in this
#     case, ``bearer``).
#   * all translations are gathered in the ``qttranslations`` sub-module/repository, and their association with
#     individual repositories can be seen in ``qttranslations/translations/translations.pro``.
# * for Qt6:
#   * check out the latest release tag, e.g., v6.3.1, then check out the submodules.
#   * search the Qt modules' CMake files; for example, ``qtbase/src/network/CMakeLists.txt`` for the QtNetwork
#     module. The plugin types associated with the module are listed under the ``PLUGIN_TYPES`` argument of the
#     ``qt_internal_add_module()`` function that defines the Qt module.
#
# The idea is to make a list of all extension modules found in a Qt bindings package, as well as all available plugin
# directories (which correspond to plugin types) and translation files. For each extension, identify the corresponding
# Qt module (shared library name) and its associated plugins and translation files. Once this is done, most of the
# available plugins and translations in the python bindings package should have a corresponding python Qt extension
# module available; this gives us associations based on the python extension module names as well as based on the Qt
# shared library names. For any plugins and translation files remaining unassociated, identify the corresponding Qt
# module; this gives us associations based only on Qt shared library names. While this second group of associations is
# never processed directly (due to the lack of a corresponding python extension), they may end up being processed
# during the recursive dependency analysis, if the corresponding Qt shared library is linked against by some Qt python
# extension or another Qt shared library.


# This structure is used to define Qt module information, such as the python module/extension name, Qt module (shared
# library) name, translation files' base names, and plugins, as well as the associated python bindings (which
# implicitly also encode the major Qt version).
class _QtModuleDef:
    def __init__(self, module, shared_lib=None, translations=None, plugins=None, bindings=None):
        # Python module (extension) name without the package namespace. For example, `QtCore`.
        # Can be None if the python bindings do not bind the module, but we still need to establish a relationship
        # between the Qt module (shared library) and its plugins and translations.
        self.module = module
        # Associated Qt module (shared library), if any. Used during recursive dependency analysis, where a python
        # module (extension) is analyzed for linked Qt modules (shared libraries), and then their corresponding
        # python modules (extensions) are added to hidden imports. For example, the Qt module name is `Qt5Core` or
        # `Qt6Core`, depending on the Qt version. Can be None for python modules that are not tied to a particular
        # Qt shared library (for example, the corresponding Qt module is headers-only) and hence cannot be
        # inferred from recursive link-time dependency analysis.
        self.shared_lib = shared_lib
        # List of base names of translation files (if any) associated with the Qt module. Multiple base names may be
        # associated with a single module.
        # For example, `['qt', 'qtbase']` for `QtCore` or `['qtmultimedia']` for `QtMultimedia`.
        self.translations = translations or []
        # List of plugins associated with the Qt module.
        self.plugins = plugins or []
        # Set of bindings specifiers (PySide2, PyQt5, PySide6, PyQt6, or wildcard/negation forms) that provide the
        # python module. This allows association of plugins and translations with shared libraries even for bindings
        # that do not provide a python module binding for the Qt module.
        self.bindings = set(bindings or [])


# All Qt-based bindings.
ALL_QT_BINDINGS = {"PySide2", "PyQt5", "PySide6", "PyQt6"}

# Qt modules information - the core of our Qt collection approach.
#
# For every python module/extension (i.e., an entry in the list below that has a valid `module`), we need a
# corresponding hook, ensuring that the extension file is analyzed, so that we collect the associated plugins and
# translation files, as well as perform recursive analysis of link-time binary dependencies (so that plugins and
# translation files belonging to those dependencies are collected as well).
QT_MODULES_INFO = (
    # *** qt/qt3d ***
    _QtModuleDef("Qt3DAnimation", shared_lib="3DAnimation"),
    _QtModuleDef("Qt3DCore", shared_lib="3DCore"),
    _QtModuleDef("Qt3DExtras", shared_lib="3DExtras"),
    _QtModuleDef("Qt3DInput", shared_lib="3DInput", plugins=["3dinputdevices"]),
    _QtModuleDef("Qt3DLogic", shared_lib="3DLogic"),
    _QtModuleDef(
        "Qt3DRender", shared_lib="3DRender", plugins=["geometryloaders", "renderplugins", "renderers", "sceneparsers"]
    ),

    # *** qt/qtactiveqt ***
    # The python module is called QAxContainer in PyQt bindings, but QtAxContainer in PySide. The associated Qt module
    # is header-only, so there is no shared library.
    _QtModuleDef("QAxContainer", bindings=["PyQt*"]),
    _QtModuleDef("QtAxContainer", bindings=["PySide*"]),

    # *** qt/qtcharts ***
    # The python module is called QtChart in PyQt5, and QtCharts in PySide2, PySide6, and PyQt6 (which corresponds to
    # the associated Qt module name, QtCharts).
    _QtModuleDef("QtChart", shared_lib="Charts", bindings=["PyQt5"]),
    _QtModuleDef("QtCharts", shared_lib="Charts", bindings=["!PyQt5"]),

    # *** qt/qtbase ***
    # The QtConcurrent python module is available only in PySide bindings.
    _QtModuleDef(None, shared_lib="Concurrent", bindings=["PyQt*"]),
    _QtModuleDef("QtConcurrent", shared_lib="Concurrent", bindings=["PySide*"]),
    _QtModuleDef("QtCore", shared_lib="Core", translations=["qt", "qtbase"]),
    # The QtDBus python module is available in all bindings but PySide2.
    _QtModuleDef(None, shared_lib="DBus", bindings=["PySide2"]),
    _QtModuleDef("QtDBus", shared_lib="DBus", bindings=["!PySide2"]),
    # QtNetwork uses different plugins in Qt5 and Qt6.
    _QtModuleDef("QtNetwork", shared_lib="Network", plugins=["bearer"], bindings=["PySide2", "PyQt5"]),
    _QtModuleDef(
        "QtNetwork",
        shared_lib="Network",
        plugins=["networkaccess", "networkinformation", "tls"],
        bindings=["PySide6", "PyQt6"]
    ),
    _QtModuleDef(
        "QtGui",
        shared_lib="Gui",
        plugins=[
            "accessiblebridge",
            "egldeviceintegrations",
            "generic",
            "iconengines",
            "imageformats",
            "platforms",
            "platforms/darwin",
            "platforminputcontexts",
            "platformthemes",
            "xcbglintegrations",
            # The ``wayland-*`` plugins are part of the QtWaylandClient Qt module, whose shared library
            # (e.g., libQt5WaylandClient.so) is linked by the wayland-related ``platforms`` plugins. Ideally, we would
            # collect these plugins based on the QtWaylandClient shared library entry, but as our Qt hook utilities do
            # not scan the plugins for dependencies, that would not work. So instead we list these plugins under QtGui
            # to achieve pretty much the same end result.
            "wayland-decoration-client",
            "wayland-graphics-integration-client",
            "wayland-shell-integration",
        ]
    ),
    _QtModuleDef("QtOpenGL", shared_lib="OpenGL"),
    # This python module is specific to PySide2 and has no associated Qt module.
    _QtModuleDef("QtOpenGLFunctions", bindings=["PySide2"]),
    # This Qt module was introduced with Qt6.
    _QtModuleDef("QtOpenGLWidgets", shared_lib="OpenGLWidgets", bindings=["PySide6", "PyQt6"]),
    _QtModuleDef("QtPrintSupport", shared_lib="PrintSupport", plugins=["printsupport"]),
    _QtModuleDef("QtSql", shared_lib="Sql", plugins=["sqldrivers"]),
    _QtModuleDef("QtTest", shared_lib="Test"),
    _QtModuleDef("QtWidgets", shared_lib="Widgets", plugins=["styles"]),
    _QtModuleDef("QtXml", shared_lib="Xml"),

    # *** qt/qtconnectivity ***
    # NOTE: the shared library base name is "Bluetooth" (yielding e.g. Qt5Bluetooth); the original line read
    # shared_lib="QtBluetooth", which does not match the naming convention used by the other entries.
    _QtModuleDef("QtBluetooth", shared_lib="Bluetooth", translations=["qtconnectivity"]),
    _QtModuleDef("QtNfc", shared_lib="Nfc", translations=["qtconnectivity"]),

    # *** qt/qtdatavis3d ***
    _QtModuleDef("QtDataVisualization", shared_lib="DataVisualization"),

    # *** qt/qtdeclarative ***
    _QtModuleDef("QtQml", shared_lib="Qml", translations=["qtdeclarative"], plugins=["qmltooling"]),
    # Have the Qt5 variant collect translations for qtquickcontrols (qt/qtquickcontrols provides only QtQuick plugins).
    _QtModuleDef(
        "QtQuick",
        shared_lib="Quick",
        translations=["qtquickcontrols"],
        plugins=["scenegraph"],
        bindings=["PySide2", "PyQt5"]
    ),
    _QtModuleDef("QtQuick", shared_lib="Quick", plugins=["scenegraph"], bindings=["PySide6", "PyQt6"]),
    # Qt6-only; in Qt5, this module is part of qt/qtquickcontrols2. The python module is available only in PySide6.
    _QtModuleDef(None, shared_lib="QuickControls2", bindings=["PyQt6"]),
    _QtModuleDef("QtQuickControls2", shared_lib="QuickControls2", bindings=["PySide6"]),
    _QtModuleDef("QtQuickWidgets", shared_lib="QuickWidgets"),

    # *** qt/qtgamepad ***
    # No python module; shared library -> plugins association entry.
    _QtModuleDef(None, shared_lib="Gamepad", plugins=["gamepads"]),

    # *** qt/qtgraphs ***
    # Qt6 >= 6.6.0; the python module is available only in PySide6.
    _QtModuleDef("QtGraphs", shared_lib="Graphs", bindings=["PySide6"]),

    # *** qt/qthttpserver ***
    # Qt6 >= 6.4.0; the python module is available only in PySide6.
    _QtModuleDef("QtHttpServer", shared_lib="HttpServer", bindings=["PySide6"]),

    # *** qt/qtlocation ***
    # QtLocation was reintroduced in Qt6 v6.5.0.
    _QtModuleDef(
        "QtLocation",
        shared_lib="Location",
        translations=["qtlocation"],
        plugins=["geoservices"],
        bindings=["PySide2", "PyQt5", "PySide6"]
    ),
    _QtModuleDef(
        "QtPositioning",
        shared_lib="Positioning",
        translations=["qtlocation"],
        plugins=["position"],
    ),

    # *** qt/qtmacextras ***
    # Qt5-only Qt module.
    _QtModuleDef("QtMacExtras", shared_lib="MacExtras", bindings=["PySide2", "PyQt5"]),

    # *** qt/qtmultimedia ***
    # QtMultimedia on Qt6 currently uses only a subset of the plugin names of its Qt5 counterpart.
    _QtModuleDef(
        "QtMultimedia",
        shared_lib="Multimedia",
        translations=["qtmultimedia"],
        plugins=[
            "mediaservice", "audio", "video/bufferpool", "video/gstvideorenderer", "video/videonode",
            "playlistformats", "resourcepolicy"
        ],
        bindings=["PySide2", "PyQt5"]
    ),
    _QtModuleDef(
        "QtMultimedia",
        shared_lib="Multimedia",
        translations=["qtmultimedia"],
        # `multimedia` plugins are available as of Qt6 >= 6.4.0; earlier versions had `video/gstvideorenderer` and
        # `video/videonode` plugins.
        plugins=["multimedia", "video/gstvideorenderer", "video/videonode"],
        bindings=["PySide6", "PyQt6"]
    ),
    _QtModuleDef("QtMultimediaWidgets", shared_lib="MultimediaWidgets"),
    # Qt6-only Qt module; the python module is available in PySide6 >= 6.4.0 and PyQt6 >= 6.5.0.
    _QtModuleDef("QtSpatialAudio", shared_lib="SpatialAudio", bindings=["PySide6", "PyQt6"]),

    # *** qt/qtnetworkauth ***
    # The QtNetworkAuth python module is available in all bindings but PySide2.
    _QtModuleDef(None, shared_lib="NetworkAuth", bindings=["PySide2"]),
    _QtModuleDef("QtNetworkAuth", shared_lib="NetworkAuth", bindings=["!PySide2"]),

    # *** qt/qtpurchasing ***
    # Qt5-only Qt module; the python module is available only in PyQt5.
    _QtModuleDef("QtPurchasing", shared_lib="Purchasing", bindings=["PyQt5"]),

    # *** qt/qtquick1 ***
    # This is an old, Qt 5.3-era module...
    _QtModuleDef(
        "QtDeclarative",
        shared_lib="Declarative",
        translations=["qtquick1"],
        plugins=["qml1tooling"],
        bindings=["PySide2", "PyQt5"]
    ),

    # *** qt/qtquick3d ***
    # The QtQuick3D python module is available in all bindings but PySide2.
    _QtModuleDef(None, shared_lib="Quick3D", bindings=["PySide2"]),
    _QtModuleDef("QtQuick3D", shared_lib="Quick3D", bindings=["!PySide2"]),
    # No python module; shared library -> plugins association entry.
    _QtModuleDef(None, shared_lib="Quick3DAssetImport", plugins=["assetimporters"]),

    # *** qt/qtquickcontrols2 ***
    # Qt5-only module; in Qt6, this module is part of qt/qtdeclarative. The python module is available only in
    # PySide2.
    _QtModuleDef(None, translations=["qtquickcontrols2"], shared_lib="QuickControls2", bindings=["PyQt5"]),
    _QtModuleDef(
        "QtQuickControls2", translations=["qtquickcontrols2"], shared_lib="QuickControls2", bindings=["PySide2"]
    ),

    # *** qt/qtremoteobjects ***
    _QtModuleDef("QtRemoteObjects", shared_lib="RemoteObjects"),

    # *** qt/qtscxml ***
    # The python module is available only in PySide bindings. Plugins are available only in Qt6.
    # PyQt wheels do not seem to ship the corresponding Qt modules (shared libs) at all.
    _QtModuleDef("QtScxml", shared_lib="Scxml", bindings=["PySide2"]),
    _QtModuleDef("QtScxml", shared_lib="Scxml", plugins=["scxmldatamodel"], bindings=["PySide6"]),
    # Qt6-only Qt module; the python module is available only in PySide6.
    _QtModuleDef("QtStateMachine", shared_lib="StateMachine", bindings=["PySide6"]),

    # *** qt/qtsensors ***
    _QtModuleDef("QtSensors", shared_lib="Sensors", plugins=["sensors", "sensorgestures"]),

    # *** qt/qtserialport ***
    _QtModuleDef("QtSerialPort", shared_lib="SerialPort", translations=["qtserialport"]),

    # *** qt/qtscript ***
    # Qt5-only Qt module; the python module is available only in PySide2. PyQt5 wheels do not seem to ship the
    # corresponding Qt modules (shared libs) at all.
    _QtModuleDef("QtScript", shared_lib="Script", translations=["qtscript"], plugins=["script"], bindings=["PySide2"]),
    _QtModuleDef("QtScriptTools", shared_lib="ScriptTools", bindings=["PySide2"]),

    # *** qt/qtserialbus ***
    # No python module; shared library -> plugins association entry.
    # PySide6 6.5.0 introduced a python module.
    _QtModuleDef(None, shared_lib="SerialBus", plugins=["canbus"], bindings=["!PySide6"]),
    _QtModuleDef("QtSerialBus", shared_lib="SerialBus", plugins=["canbus"], bindings=["PySide6"]),

    # *** qt/qtsvg ***
    _QtModuleDef("QtSvg", shared_lib="Svg"),
    # Qt6-only Qt module.
    _QtModuleDef("QtSvgWidgets", shared_lib="SvgWidgets", bindings=["PySide6", "PyQt6"]),

    # *** qt/qtspeech ***
    _QtModuleDef("QtTextToSpeech", shared_lib="TextToSpeech", plugins=["texttospeech"]),

    # *** qt/qttools ***
    # The QtDesigner python module is available in all bindings but PySide2.
    _QtModuleDef(None, shared_lib="Designer", plugins=["designer"], bindings=["PySide2"]),
    _QtModuleDef(
        "QtDesigner", shared_lib="Designer", translations=["designer"], plugins=["designer"], bindings=["!PySide2"]
    ),
    _QtModuleDef("QtHelp", shared_lib="Help", translations=["qt_help"]),
    # The python module is available only in PySide bindings.
    _QtModuleDef("QtUiTools", shared_lib="UiTools", bindings=["PySide*"]),

    # *** qt/qtvirtualkeyboard ***
    # No python module; shared library -> plugins association entry.
    _QtModuleDef(None, shared_lib="VirtualKeyboard", plugins=["virtualkeyboard"]),

    # *** qt/qtwebchannel ***
    _QtModuleDef("QtWebChannel", shared_lib="WebChannel"),

    # *** qt/qtwebengine ***
    # QtWebEngine is a Qt5-only module (replaced by QtWebEngineQuick in Qt6).
    _QtModuleDef("QtWebEngine", shared_lib="WebEngine", bindings=["PySide2", "PyQt5"]),
    _QtModuleDef("QtWebEngineCore", shared_lib="WebEngineCore", translations=["qtwebengine"]),
    # QtWebEngineQuick is a Qt6-only module (the replacement for QtWebEngine in Qt5).
    _QtModuleDef("QtWebEngineQuick", shared_lib="WebEngineQuick", bindings=["PySide6", "PyQt6"]),
    _QtModuleDef("QtWebEngineWidgets", shared_lib="WebEngineWidgets"),
    # QtPdf and QtPdfWidgets have a python module available in PySide6 and PyQt6 >= 6.4.0.
    _QtModuleDef("QtPdf", shared_lib="Pdf", bindings=["PySide6", "PyQt6"]),
    _QtModuleDef("QtPdfWidgets", shared_lib="PdfWidgets", bindings=["PySide6", "PyQt6"]),

    # *** qt/qtwebsockets ***
    _QtModuleDef("QtWebSockets", shared_lib="WebSockets", translations=["qtwebsockets"]),

    # *** qt/qtwebview ***
    # No python module; shared library -> plugins association entry.
    _QtModuleDef(None, shared_lib="WebView", plugins=["webview"]),

    # *** qt/qtwinextras ***
    # Qt5-only Qt module.
    _QtModuleDef("QtWinExtras", shared_lib="WinExtras", bindings=["PySide2", "PyQt5"]),

    # *** qt/qtx11extras ***
    # Qt5-only Qt module.
    _QtModuleDef("QtX11Extras", shared_lib="X11Extras", bindings=["PySide2", "PyQt5"]),

    # *** qt/qtxmlpatterns ***
    # Qt5-only Qt module.
    _QtModuleDef(
        "QtXmlPatterns", shared_lib="XmlPatterns", translations=["qtxmlpatterns"], bindings=["PySide2", "PyQt5"]
    ),

    # *** qscintilla ***
    # The python module is available only in PyQt bindings. No associated shared library.
    _QtModuleDef("Qsci", translations=["qscintilla"], bindings=["PyQt*"]),
)
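
# Illustrative sketch (assumption; the actual construction lives elsewhere in this package): the two lookup
# dictionaries described above can be derived from QT_MODULES_INFO roughly as follows, for a given bindings name:
#
#     def _build_lookup_tables(bindings_name):
#         python_modules = {}    # python extension name -> _QtModuleDef
#         shared_libraries = {}  # Qt shared library name -> _QtModuleDef
#         for info in QT_MODULES_INFO:
#             # An empty bindings set means the entry applies to all bindings.
#             if info.bindings and bindings_name not in process_namespace_strings(info.bindings):
#                 continue
#             if info.module is not None:
#                 python_modules[info.module] = info
#             if info.shared_lib is not None:
#                 shared_libraries[info.shared_lib] = info
#         return python_modules, shared_libraries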


# Helpers for turning Qt namespace specifiers, such as "!PySide2" or "PyQt*", into a set of applicable
# namespaces.
def process_namespace_strings(namespaces):
    """Process a list of Qt namespace specifier strings into a set of namespaces."""
    bindings = set()
    for namespace in namespaces:
        bindings |= _process_namespace_string(namespace)
    return bindings


def _process_namespace_string(namespace):
    """Expand a Qt namespace specifier string into a set of namespaces."""
    if namespace.startswith("!"):
        bindings = _process_namespace_string(namespace[1:])
        return ALL_QT_BINDINGS - bindings
    else:
        if namespace == "PySide*":
            return {"PySide2", "PySide6"}
        elif namespace == "PyQt*":
            return {"PyQt5", "PyQt6"}
        elif namespace in ALL_QT_BINDINGS:
            return {namespace}
        else:
            raise ValueError(f"Invalid Qt namespace specifier: {namespace}!")
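
# Example expansions (illustrative):
#
#     process_namespace_strings(["PySide*"])   # -> {"PySide2", "PySide6"}
#     process_namespace_strings(["!PySide2"])  # -> {"PyQt5", "PySide6", "PyQt6"}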
@@ -0,0 +1,256 @@
|
||||
# ----------------------------------------------------------------------------
|
||||
# Copyright (c) 2024, PyInstaller Development Team.
|
||||
#
|
||||
# Distributed under the terms of the GNU General Public License (version 2
|
||||
# or later) with exception for distributing the bootloader.
|
||||
#
|
||||
# The full license is in the file COPYING.txt, distributed with this software.
|
||||
#
|
||||
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
|
||||
#-----------------------------------------------------------------------------
|
||||
from PyInstaller import log as logging
|
||||
from PyInstaller import isolated
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
# Import setuptools and analyze its properties in an isolated subprocess. This function is called by `SetuptoolsInfo`
|
||||
# to initialize its properties.
|
||||
@isolated.decorate
|
||||
def _retrieve_setuptools_info():
|
||||
import importlib
|
||||
|
||||
try:
|
||||
setuptools = importlib.import_module("setuptools") # noqa: F841
|
||||
except ModuleNotFoundError:
|
||||
return None
|
||||
|
||||
# Delay these imports until after we have confirmed that setuptools is importable.
|
||||
import pathlib
|
||||
|
||||
import packaging.version
|
||||
|
||||
from PyInstaller.compat import importlib_metadata
|
||||
from PyInstaller.utils.hooks import (
|
||||
collect_data_files,
|
||||
collect_submodules,
|
||||
)
|
||||
|
||||
# Try to retrieve the version. At this point, failure is consider an error.
|
||||
version_string = importlib_metadata.version("setuptools")
|
||||
version = packaging.version.Version(version_string).release # Use the version tuple
|
||||
|
||||
# setuptools >= 60.0 its vendored copy of distutils (mainly due to its removal from stdlib in python >= 3.12).
|
||||
distutils_vendored = False
|
||||
distutils_modules = []
|
||||
if version >= (60, 0):
|
||||
distutils_vendored = True
|
||||
distutils_modules += ["_distutils_hack"]
|
||||
distutils_modules += collect_submodules(
|
||||
"setuptools._distutils",
|
||||
# setuptools 71.0.1 ~ 71.0.4 include `setuptools._distutils.tests`; avoid explicitly collecting it
|
||||
# (it was not included in earlier setuptools releases).
|
||||
filter=lambda name: name != 'setuptools._distutils.tests',
|
||||
)
|
||||
|
||||
# Check if `setuptools._vendor` exists. Some linux distributions opt to de-vendor `setuptools` and remove the
|
||||
# `setuptools._vendor` directory altogether. If this is the case, most of the additional processing below should be
|
||||
# skipped to avoid errors and warnings about non-existent `setuptools._vendor` module.
|
||||
try:
|
||||
setuptools_vendor = importlib.import_module("setuptools._vendor")
|
||||
except ModuleNotFoundError:
|
||||
setuptools_vendor = None
|
||||
|
||||
# Check for exposed packages/modules that are vendored by setuptools. If a stand-alone version is not provided in the
|
||||
# environment, the setuptools-vendored version is exposed (due to the location of `setuptools._vendor` being appended to
|
||||
# `sys.path`). Applicable to v71.0.0 and later.
|
||||
vendored_status = dict()
|
||||
if version >= (71, 0) and setuptools_vendor is not None:
|
||||
VENDORED_CANDIDATES = (
|
||||
"autocommand",
|
||||
"backports.tarfile",
|
||||
"importlib_metadata",
|
||||
"importlib_resources",
|
||||
"inflect",
|
||||
"jaraco.context",
|
||||
"jaraco.functools",
|
||||
"jaraco.text",
|
||||
"more_itertools",
|
||||
"ordered_set",
|
||||
"packaging",
|
||||
"platformdirs",
|
||||
"tomli",
|
||||
"typeguard",
|
||||
"typing_extensions",
|
||||
"wheel",
|
||||
"zipp",
|
||||
)
|
||||
|
||||
# Resolve path(s) of `setuptools_vendor` package.
|
||||
setuptools_vendor_paths = [pathlib.Path(path).resolve() for path in setuptools_vendor.__path__]
|
||||
|
||||
# Process each candidate
|
||||
for candidate_name in VENDORED_CANDIDATES:
|
||||
try:
|
||||
candidate = importlib.import_module(candidate_name)
|
||||
except ImportError:
|
||||
continue
|
||||
|
||||
# Check the __file__ attribute (modules and regular packages). Will not work with namespace packages, but
|
||||
# at the moment, there are none.
|
||||
candidate_file_attr = getattr(candidate, '__file__', None)
|
||||
if candidate_file_attr is not None:
|
||||
candidate_path = pathlib.Path(candidate_file_attr).parent.resolve()
|
||||
is_vendored = any([
|
||||
setuptools_vendor_path in candidate_path.parents or candidate_path == setuptools_vendor_path
|
||||
for setuptools_vendor_path in setuptools_vendor_paths
|
||||
])
|
||||
vendored_status[candidate_name] = is_vendored
|
||||
|
||||
# Collect submodules from `setuptools._vendor`, regardless of whether the vendored package is exposed or
|
||||
# not (because setuptools might need/use it either way).
|
||||
vendored_modules = []
|
||||
if setuptools_vendor is not None:
|
||||
EXCLUDED_VENDORED_MODULES = (
|
||||
# Prevent recursing into setuptools._vendor.pyparsing.diagram, which typically fails to be imported due
|
||||
# to missing dependencies (railroad, pyparsing (?), jinja2) and generates a warning... As the module is
|
||||
# usually unimportable, it is likely not to be used by setuptools. NOTE: pyparsing was removed from
|
||||
# vendored packages in setuptools v67.0.0; keep this exclude around for earlier versions.
|
||||
'setuptools._vendor.pyparsing.diagram',
|
||||
# Setuptools >= 71 started shipping vendored dependencies that include tests; avoid collecting those via
|
||||
# hidden imports. (Note that this also prevents creation of aliases for these modules, but that should
|
||||
# not be an issue, as they should not be referenced from anywhere).
|
||||
'setuptools._vendor.importlib_resources.tests',
|
||||
# These appear to be utility scripts bundled with the jaraco.text package - exclude them.
|
||||
'setuptools._vendor.jaraco.text.show-newlines',
|
||||
'setuptools._vendor.jaraco.text.strip-prefix',
|
||||
'setuptools._vendor.jaraco.text.to-dvorak',
|
||||
'setuptools._vendor.jaraco.text.to-qwerty',
|
||||
)
|
||||
vendored_modules += collect_submodules(
|
||||
'setuptools._vendor',
|
||||
filter=lambda name: name not in EXCLUDED_VENDORED_MODULES,
|
||||
)
|
||||
|
||||
# `collect_submodules` (and its underlying `pkgutil.iter_modules`) does not discover namespace sub-packages, in
|
||||
# this case `setuptools._vendor.jaraco`. So force a manual scan of modules/packages inside it.
|
||||
vendored_modules += collect_submodules(
|
||||
'setuptools._vendor.jaraco',
|
||||
filter=lambda name: name not in EXCLUDED_VENDORED_MODULES,
|
||||
)
|
||||
|
||||
# *** Data files for vendored packages ***
|
||||
vendored_data = []
|
||||
|
||||
if version >= (71, 0) and setuptools_vendor is not None:
|
||||
# Since the vendored dependencies from `setuptools/_vendor` are now visible to the outside world, make
|
||||
# sure we collect their metadata. (We cannot use copy_metadata here, because we need to collect data
|
||||
# files to their original locations).
|
||||
vendored_data += collect_data_files('setuptools._vendor', includes=['**/*.dist-info'])
|
||||
# Similarly, ensure that `Lorem ipsum.txt` from vendored jaraco.text is collected
|
||||
vendored_data += collect_data_files('setuptools._vendor.jaraco.text', includes=['**/Lorem ipsum.txt'])
|
||||
|
||||
# Return dictionary with collected information
|
||||
return {
|
||||
"available": True,
|
||||
"version": version,
|
||||
"distutils_vendored": distutils_vendored,
|
||||
"distutils_modules": distutils_modules,
|
||||
"vendored_status": vendored_status,
|
||||
"vendored_modules": vendored_modules,
|
||||
"vendored_data": vendored_data,
|
||||
}
|
||||
|
||||
|
||||
class SetuptoolsInfo:
|
||||
def __init__(self):
|
||||
pass
|
||||
|
||||
def __repr__(self):
|
||||
return "SetuptoolsInfo"
|
||||
|
||||
# Delay initialization of setuptools information until the corresponding attributes are first requested.
|
||||
def __getattr__(self, name):
|
||||
if 'available' in self.__dict__:
|
||||
# Initialization was already done, but requested attribute is not available.
|
||||
raise AttributeError(name)
|
||||
|
||||
# Load setuptools info...
|
||||
self._load_setuptools_info()
|
||||
# ... and return the requested attribute
|
||||
return getattr(self, name)
|
||||
|
||||
def _load_setuptools_info(self):
|
||||
logger.info("%s: initializing cached setuptools info...", self)
|
||||
|
||||
# Initialize variables so that they might be accessed even if setuptools is unavailable or if initialization
|
||||
# fails for some reason.
|
||||
self.available = False
|
||||
self.version = None
|
||||
self.distutils_vendored = False
|
||||
self.distutils_modules = []
|
||||
self.vendored_status = dict()
|
||||
self.vendored_modules = []
|
||||
self.vendored_data = []
|
||||
|
||||
try:
|
||||
setuptools_info = _retrieve_setuptools_info()
|
||||
except Exception as e:
|
||||
logger.warning("%s: failed to obtain setuptools info: %s", self, e)
|
||||
return
|
||||
|
||||
# If package could not be imported, `_retrieve_setuptools_info` returns None. In such cases, emit a debug
|
||||
# message instead of a warning, because this initialization might be triggered by a helper function that is
|
||||
# trying to determine availability of `setuptools` by inspecting the `available` attribute.
|
||||
if setuptools_info is None:
|
||||
logger.debug("%s: failed to obtain setuptools info: setuptools could not be imported.", self)
|
||||
return
|
||||
|
||||
# Copy properties
|
||||
for key, value in setuptools_info.items():
|
||||
setattr(self, key, value)
|
||||
|
||||
def is_vendored(self, module_name):
|
||||
return self.vendored_status.get(module_name, False)
|
||||
|
||||
@staticmethod
|
||||
def _create_vendored_aliases(vendored_name, module_name, modules_list):
|
||||
# Create aliases for all submodules
|
||||
prefix_len = len(vendored_name) # Length of target-name prefix to remove
|
||||
return ((module_name + vendored_module[prefix_len:], vendored_module) for vendored_module in modules_list
|
||||
if vendored_module.startswith(vendored_name))
|
||||
|
||||
def get_vendored_aliases(self, module_name):
|
||||
vendored_name = f"setuptools._vendor.{module_name}"
|
||||
return self._create_vendored_aliases(vendored_name, module_name, self.vendored_modules)
|
||||
|
||||
def get_distutils_aliases(self):
|
||||
vendored_name = "setuptools._distutils"
|
||||
return self._create_vendored_aliases(vendored_name, "distutils", self.distutils_modules)
|
||||
|
||||
|
||||
setuptools_info = SetuptoolsInfo()
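# Minimal sketch of how the lazily-initialized singleton above is meant to be used
# (illustrative; the first attribute access is what triggers _load_setuptools_info()
# exactly once, after which all properties are served from the instance __dict__):
#
#   if setuptools_info.available:               # first access -> isolated probe runs
#       major, *_ = setuptools_info.version     # subsequent accesses are cached
#       aliases = list(setuptools_info.get_distutils_aliases())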
|
||||
|
||||
|
||||
def pre_safe_import_module(api):
|
||||
"""
|
||||
A common implementation of pre_safe_import_module hook function.
|
||||
|
||||
This function can be either called from the `pre_safe_import_module` function in a pre-safe-import-module hook, or
|
||||
just imported into the hook.
|
||||
"""
|
||||
module_name = api.module_name
|
||||
|
||||
# Check if the package/module is a vendored copy. This also returns False if setuptools is unavailable, because the
|
||||
# vendored module status dictionary will be empty.
|
||||
if not setuptools_info.is_vendored(module_name):
|
||||
return
|
||||
|
||||
vendored_name = f"setuptools._vendor.{module_name}"
|
||||
logger.info(
|
||||
"Setuptools: %r appears to be a setuptools-vendored copy - creating alias to %r!", module_name, vendored_name
|
||||
)
|
||||
|
||||
# Create aliases for all (sub)modules
|
||||
for aliased_name, real_vendored_name in setuptools_info.get_vendored_aliases(module_name):
|
||||
api.add_alias_module(real_vendored_name, aliased_name)
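# For illustration (hypothetical values): if the exposed `packaging` package resolves to
# the setuptools-vendored copy, get_vendored_aliases("packaging") yields pairs such as
#
#   ("packaging.version", "setuptools._vendor.packaging.version")
#
# i.e., (aliased_name, real_vendored_name), and each pair is registered via
# api.add_alias_module() so that imports of `packaging.*` map onto the vendored modules.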
|
||||
@@ -0,0 +1,345 @@
|
||||
#-----------------------------------------------------------------------------
|
||||
# Copyright (c) 2013-2023, PyInstaller Development Team.
|
||||
#
|
||||
# Distributed under the terms of the GNU General Public License (version 2
|
||||
# or later) with exception for distributing the bootloader.
|
||||
#
|
||||
# The full license is in the file COPYING.txt, distributed with this software.
|
||||
#
|
||||
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
|
||||
#-----------------------------------------------------------------------------
|
||||
|
||||
import os
|
||||
import fnmatch
|
||||
|
||||
from PyInstaller import compat
|
||||
from PyInstaller import isolated
|
||||
from PyInstaller import log as logging
|
||||
from PyInstaller.depend import bindepend
|
||||
|
||||
if compat.is_darwin:
|
||||
from PyInstaller.utils import osx as osxutils
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@isolated.decorate
|
||||
def _get_tcl_tk_info():
|
||||
"""
|
||||
Isolated-subprocess helper to retrieve the basic Tcl/Tk information:
|
||||
- tkinter_extension_file = the value of __file__ attribute of the _tkinter binary extension (path to file).
|
||||
- tcl_data_dir = path to the Tcl library/data directory.
|
||||
- tcl_version = Tcl version
|
||||
- tk_version = Tk version
|
||||
- tcl_threaded = boolean indicating whether Tcl/Tk is built with multi-threading support.
|
||||
"""
|
||||
try:
|
||||
import tkinter
|
||||
import _tkinter
|
||||
except ImportError:
|
||||
# tkinter unavailable
|
||||
return None
|
||||
try:
|
||||
tcl = tkinter.Tcl()
|
||||
except tkinter.TclError: # e.g. "Can't find a usable init.tcl in the following directories: ..."
|
||||
return None
|
||||
|
||||
# Query the location of Tcl library/data directory.
|
||||
tcl_data_dir = tcl.eval("info library")
|
||||
|
||||
# Check if Tcl/Tk is built with multi-threaded support (built with --enable-threads), as indicated by the presence
|
||||
# of optional `threaded` member in `tcl_platform` array.
|
||||
try:
|
||||
tcl.getvar("tcl_platform(threaded)") # Ignore the actual value.
|
||||
tcl_threaded = True
|
||||
except tkinter.TclError:
|
||||
tcl_threaded = False
|
||||
|
||||
return {
|
||||
"available": True,
|
||||
# If `_tkinter` is a built-in (as opposed to an extension), it does not have a `__file__` attribute.
|
||||
"tkinter_extension_file": getattr(_tkinter, '__file__', None),
|
||||
"tcl_version": _tkinter.TCL_VERSION,
|
||||
"tk_version": _tkinter.TK_VERSION,
|
||||
"tcl_threaded": tcl_threaded,
|
||||
"tcl_data_dir": tcl_data_dir,
|
||||
}
|
||||
|
||||
|
||||
class TclTkInfo:
|
||||
# Root directory names of Tcl and Tk library/data directories in the frozen application. These directories are
|
||||
# originally fully versioned (e.g., tcl8.6 and tk8.6); we want to remap them to unversioned variants, so that our
|
||||
# run-time hook (pyi_rthook__tkinter.py) does not have to determine version numbers when setting `TCL_LIBRARY`
|
||||
# and `TK_LIBRARY` environment variables.
|
||||
#
|
||||
# We also cannot use plain "tk" and "tcl", because on macOS, the Tcl and Tk shared libraries might come from
|
||||
# framework bundles, and would therefore end up being collected as "Tcl" and "Tk" in the top-level application
|
||||
# directory, causing a clash because the filesystem is case-insensitive by default.
|
||||
TCL_ROOTNAME = '_tcl_data'
|
||||
TK_ROOTNAME = '_tk_data'
|
||||
|
||||
def __init__(self):
|
||||
pass
|
||||
|
||||
def __repr__(self):
|
||||
return "TclTkInfo"
|
||||
|
||||
# Delay initialization of Tcl/Tk information until the corresponding attributes are first requested.
|
||||
def __getattr__(self, name):
|
||||
if 'available' in self.__dict__:
|
||||
# Initialization was already done, but requested attribute is not available.
|
||||
raise AttributeError(name)
|
||||
|
||||
# Load Tcl/Tk info...
|
||||
self._load_tcl_tk_info()
|
||||
# ... and return the requested attribute
|
||||
return getattr(self, name)
|
||||
|
||||
def _load_tcl_tk_info(self):
|
||||
logger.info("%s: initializing cached Tcl/Tk info...", self)
|
||||
|
||||
# Initialize variables so that they might be accessed even if tkinter/Tcl/Tk is unavailable or if initialization
|
||||
# fails for some reason.
|
||||
self.available = False
|
||||
self.tkinter_extension_file = None
|
||||
self.tcl_version = None
|
||||
self.tk_version = None
|
||||
self.tcl_threaded = False
|
||||
self.tcl_data_dir = None
|
||||
|
||||
self.tk_data_dir = None
|
||||
self.tcl_module_dir = None
|
||||
|
||||
self.is_macos_system_framework = False
|
||||
self.tcl_shared_library = None
|
||||
self.tk_shared_library = None
|
||||
|
||||
self.data_files = []
|
||||
|
||||
try:
|
||||
tcl_tk_info = _get_tcl_tk_info()
|
||||
except Exception as e:
|
||||
logger.warning("%s: failed to obtain Tcl/Tk info: %s", self, e)
|
||||
return
|
||||
|
||||
# If tkinter could not be imported, `_get_tcl_tk_info` returns None. In such cases, emit a debug message instead
|
||||
# of a warning, because this initialization might be triggered by a helper function that is trying to determine
|
||||
# availability of `tkinter` by inspecting the `available` attribute.
|
||||
if tcl_tk_info is None:
|
||||
logger.debug("%s: failed to obtain Tcl/Tk info: tkinter/_tkinter could not be imported.", self)
|
||||
return
|
||||
|
||||
# Copy properties
|
||||
for key, value in tcl_tk_info.items():
|
||||
setattr(self, key, value)
|
||||
|
||||
# Parse Tcl/Tk version into (major, minor) tuple.
|
||||
self.tcl_version = tuple((int(x) for x in self.tcl_version.split(".")[:2]))
|
||||
self.tk_version = tuple((int(x) for x in self.tk_version.split(".")[:2]))
|
||||
|
||||
# Determine full path to Tcl and Tk shared libraries against which the `_tkinter` extension module is linked.
|
||||
# This can only be done when `_tkinter` is in fact an extension, and not a built-in. In the latter case, the
|
||||
# Tcl/Tk libraries are statically linked into python shared library, so there are no shared libraries for us
|
||||
# to discover.
|
||||
if self.tkinter_extension_file:
|
||||
try:
|
||||
(
|
||||
self.tcl_shared_library,
|
||||
self.tk_shared_library,
|
||||
) = self._find_tcl_tk_shared_libraries(self.tkinter_extension_file)
|
||||
except Exception:
|
||||
logger.warning("%s: failed to determine Tcl and Tk shared library location!", self, exc_info=True)
|
||||
|
||||
# macOS: check if _tkinter is linked against system-provided Tcl.framework and Tk.framework. This is the
|
||||
# case with python3 from Xcode tools (and was the case with very old Homebrew python builds). In such cases,
|
||||
# we should not be collecting Tcl/Tk files.
|
||||
if compat.is_darwin:
|
||||
self.is_macos_system_framework = self._check_macos_system_framework(self.tcl_shared_library)
|
||||
|
||||
# Emit a warning in the unlikely event that we are dealing with a Teapot-distributed version of ActiveTcl.
|
||||
if not self.is_macos_system_framework:
|
||||
self._warn_if_using_activetcl_or_teapot(self.tcl_data_dir)
|
||||
|
||||
# Infer location of Tk library/data directory. Ideally, we could infer this by running
|
||||
#
|
||||
# import tkinter
|
||||
# root = tkinter.Tk()
|
||||
# tk_data_dir = root.tk.exprstring('$tk_library')
|
||||
#
|
||||
# in the isolated subprocess as part of `_get_tcl_tk_info`. However, that is impractical, as it shows an empty
|
||||
# window and, on some platforms (e.g., Linux), requires a display server. Therefore, try to guess the location,
|
||||
# based on the following heuristic:
|
||||
# - if Tk is built as macOS framework bundle, look for Scripts sub-directory in Resources directory next to
|
||||
# the shared library.
|
||||
# - otherwise, look for: $tcl_root/../tkX.Y, where X and Y are Tk major and minor version.
|
||||
if compat.is_darwin and self.tk_shared_library and (
|
||||
# is_framework_bundle_lib handles only fully-versioned framework library paths...
|
||||
(osxutils.is_framework_bundle_lib(self.tk_shared_library)) or
|
||||
# ... so manually handle top-level-symlinked variant for now.
|
||||
(self.tk_shared_library).endswith("Tk.framework/Tk")
|
||||
):
|
||||
# Fully resolve the library path, in case it is a top-level symlink; for example, resolve
|
||||
# /Library/Frameworks/Python.framework/Versions/3.13/Frameworks/Tk.framework/Tk
|
||||
# into
|
||||
# /Library/Frameworks/Python.framework/Versions/3.13/Frameworks/Tk.framework/Versions/8.6/Tk
|
||||
tk_lib_realpath = os.path.realpath(self.tk_shared_library)
|
||||
# Resources/Scripts directory next to the shared library
|
||||
self.tk_data_dir = os.path.join(os.path.dirname(tk_lib_realpath), "Resources", "Scripts")
|
||||
else:
|
||||
self.tk_data_dir = os.path.join(
|
||||
os.path.dirname(self.tcl_data_dir),
|
||||
f"tk{self.tk_version[0]}.{self.tk_version[1]}",
|
||||
)
|
||||
|
||||
# Infer location of Tcl module directory. The modules directory is separate from the library/data one, and
|
||||
# is located at $tcl_root/../tclX, where X is the major Tcl version.
|
||||
self.tcl_module_dir = os.path.join(
|
||||
os.path.dirname(self.tcl_data_dir),
|
||||
f"tcl{self.tcl_version[0]}",
|
||||
)
|
||||
|
||||
# Find all data files
|
||||
if self.is_macos_system_framework:
|
||||
logger.info("%s: using macOS system Tcl/Tk framework - not collecting data files.", self)
|
||||
else:
|
||||
# Collect Tcl and Tk scripts from their corresponding library/data directories. See comment at the
|
||||
# definition of the TCL_ROOTNAME and TK_ROOTNAME variables.
|
||||
if os.path.isdir(self.tcl_data_dir):
|
||||
self.data_files += self._collect_files_from_directory(
|
||||
self.tcl_data_dir,
|
||||
prefix=self.TCL_ROOTNAME,
|
||||
excludes=['demos', '*.lib', 'tclConfig.sh'],
|
||||
)
|
||||
else:
|
||||
logger.warning("%s: Tcl library/data directory %r does not exist!", self, self.tcl_data_dir)
|
||||
|
||||
if os.path.isdir(self.tk_data_dir):
|
||||
self.data_files += self._collect_files_from_directory(
|
||||
self.tk_data_dir,
|
||||
prefix=self.TK_ROOTNAME,
|
||||
excludes=['demos', '*.lib', 'tkConfig.sh'],
|
||||
)
|
||||
else:
|
||||
logger.warning("%s: Tk library/data directory %r does not exist!", self, self.tk_data_dir)
|
||||
|
||||
# Collect Tcl modules from modules directory
|
||||
if os.path.isdir(self.tcl_module_dir):
|
||||
self.data_files += self._collect_files_from_directory(
|
||||
self.tcl_module_dir,
|
||||
prefix=os.path.basename(self.tcl_module_dir),
|
||||
)
|
||||
else:
|
||||
logger.warning("%s: Tcl module directory %r does not exist!", self, self.tcl_module_dir)
|
||||
|
||||
@staticmethod
|
||||
def _collect_files_from_directory(root, prefix=None, excludes=None):
|
||||
"""
|
||||
A minimal port of PyInstaller.building.datastruct.Tree() functionality, which allows us to avoid using Tree
|
||||
here. This way, the TclTkInfo data structure can be used without having PyInstaller's config context set up.
|
||||
"""
|
||||
excludes = excludes or []
|
||||
|
||||
todo = [(root, prefix)]
|
||||
output = []
|
||||
while todo:
|
||||
target_dir, prefix = todo.pop()
|
||||
|
||||
for entry in os.listdir(target_dir):
|
||||
# Basic name-based exclusion
|
||||
if any((fnmatch.fnmatch(entry, exclude) for exclude in excludes)):
|
||||
continue
|
||||
|
||||
src_path = os.path.join(target_dir, entry)
|
||||
dest_path = os.path.join(prefix, entry) if prefix else entry
|
||||
|
||||
if os.path.isdir(src_path):
|
||||
todo.append((src_path, dest_path))
|
||||
else:
|
||||
# Return 3-element tuples with fully-resolved dest path, since other parts of code depend on that.
|
||||
output.append((dest_path, src_path, 'DATA'))
|
||||
|
||||
return output
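# Illustrative call (paths are made up): collecting a Tcl data directory into the
# `_tcl_data` prefix yields TOC-style (dest, src, typecode) tuples, e.g.,
#
#   _collect_files_from_directory("/usr/lib/tcl8.6", prefix="_tcl_data", excludes=["demos"])
#   -> [("_tcl_data/init.tcl", "/usr/lib/tcl8.6/init.tcl", "DATA"), ...]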
|
||||
|
||||
@staticmethod
|
||||
def _find_tcl_tk_shared_libraries(tkinter_ext_file):
|
||||
"""
|
||||
Find Tcl and Tk shared libraries against which the _tkinter extension module is linked.
|
||||
"""
|
||||
tcl_lib = None
|
||||
tk_lib = None
|
||||
|
||||
for _, lib_path in bindepend.get_imports(tkinter_ext_file): # (name, fullpath) tuple
|
||||
if lib_path is None:
|
||||
continue # Skip unresolved entries
|
||||
|
||||
# For comparison, take basename of lib_path. On macOS, lib_name returned by get_imports is in fact
|
||||
# referenced name, which is not necessarily just a basename.
|
||||
lib_name = os.path.basename(lib_path)
|
||||
lib_name_lower = lib_name.lower() # lower-case for comparisons
|
||||
|
||||
if 'tcl' in lib_name_lower:
|
||||
tcl_lib = lib_path
|
||||
elif 'tk' in lib_name_lower:
|
||||
tk_lib = lib_path
|
||||
|
||||
return tcl_lib, tk_lib
|
||||
|
||||
@staticmethod
|
||||
def _check_macos_system_framework(tcl_shared_lib):
|
||||
# Starting with macOS 11, system libraries are hidden (unless both Python and PyInstaller's bootloader are built
|
||||
# against macOS 11.x SDK). Therefore, Tcl shared library might end up unresolved (None); but that implicitly
|
||||
# indicates that the system framework is used.
|
||||
if tcl_shared_lib is None:
|
||||
return True
|
||||
|
||||
# Check if the path corresponds to the system framework, i.e., [/System]/Library/Frameworks/Tcl.framework/Tcl
|
||||
return 'Library/Frameworks/Tcl.framework' in tcl_shared_lib
|
||||
|
||||
@staticmethod
|
||||
def _warn_if_using_activetcl_or_teapot(tcl_root):
|
||||
"""
|
||||
Check if Tcl installation is a Teapot-distributed version of ActiveTcl, and log a non-fatal warning that the
|
||||
resulting frozen application will (likely) fail to run on other systems.
|
||||
|
||||
PyInstaller does *not* freeze all ActiveTcl dependencies -- including Teapot, which is typically ignorable.
|
||||
Since Teapot is *not* ignorable in this case, this function warns of impending failure.
|
||||
|
||||
See Also
|
||||
--------
|
||||
https://github.com/pyinstaller/pyinstaller/issues/621
|
||||
"""
|
||||
if tcl_root is None:
|
||||
return
|
||||
|
||||
# Read the "init.tcl" script and look for mentions of "activetcl" and "teapot"
|
||||
init_tcl = os.path.join(tcl_root, 'init.tcl')
|
||||
if not os.path.isfile(init_tcl):
|
||||
return
|
||||
|
||||
mentions_activetcl = False
|
||||
mentions_teapot = False
|
||||
|
||||
# Tcl/Tk reads files using the system encoding (https://www.tcl.tk/doc/howto/i18n.html#system_encoding);
|
||||
# on macOS, this is UTF-8.
|
||||
with open(init_tcl, 'r', encoding='utf8') as fp:
|
||||
for line in fp.readlines():
|
||||
line = line.strip().lower()
|
||||
if line.startswith('#'):
|
||||
continue
|
||||
if 'activetcl' in line:
|
||||
mentions_activetcl = True
|
||||
if 'teapot' in line:
|
||||
mentions_teapot = True
|
||||
if mentions_activetcl and mentions_teapot:
|
||||
break
|
||||
|
||||
if mentions_activetcl and mentions_teapot:
|
||||
logger.warning(
|
||||
"You appear to be using an ActiveTcl build of Tcl/Tk, which PyInstaller has\n"
|
||||
"difficulty freezing. To fix this, comment out all references to 'teapot' in\n"
|
||||
f"{init_tcl!r}\n"
|
||||
"See https://github.com/pyinstaller/pyinstaller/issues/621 for more information."
|
||||
)
|
||||
|
||||
|
||||
tcltk_info = TclTkInfo()
|
||||
229
Utils/PythonNew32/Lib/site-packages/PyInstaller/utils/misc.py
Normal file
@@ -0,0 +1,229 @@
|
||||
#-----------------------------------------------------------------------------
|
||||
# Copyright (c) 2013-2023, PyInstaller Development Team.
|
||||
#
|
||||
# Distributed under the terms of the GNU General Public License (version 2
|
||||
# or later) with exception for distributing the bootloader.
|
||||
#
|
||||
# The full license is in the file COPYING.txt, distributed with this software.
|
||||
#
|
||||
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
|
||||
#-----------------------------------------------------------------------------
|
||||
"""
|
||||
This module contains miscellaneous functions that do not fit anywhere else.
|
||||
"""
|
||||
|
||||
import glob
|
||||
import os
|
||||
import pprint
|
||||
import codecs
|
||||
import re
|
||||
import tokenize
|
||||
import io
|
||||
import pathlib
|
||||
|
||||
from PyInstaller import log as logging
|
||||
from PyInstaller.compat import is_win
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def dlls_in_subdirs(directory):
|
||||
"""
|
||||
Returns a list of *.dll, *.so, *.dylib files in the given directory and its subdirectories.
|
||||
"""
|
||||
filelist = []
|
||||
for root, dirs, files in os.walk(directory):
|
||||
filelist.extend(dlls_in_dir(root))
|
||||
return filelist
|
||||
|
||||
|
||||
def dlls_in_dir(directory):
|
||||
"""
|
||||
Returns a list of *.dll, *.so, *.dylib files in the given directory.
|
||||
"""
|
||||
return files_in_dir(directory, ["*.so", "*.dll", "*.dylib"])
|
||||
|
||||
|
||||
def files_in_dir(directory, file_patterns=None):
|
||||
"""
|
||||
Returns a list of files in the given directory that match any of the given patterns.
|
||||
"""
|
||||
|
||||
file_patterns = file_patterns or []
|
||||
|
||||
files = []
|
||||
for file_pattern in file_patterns:
|
||||
files.extend(glob.glob(os.path.join(directory, file_pattern)))
|
||||
return files
|
||||
|
||||
|
||||
def get_path_to_toplevel_modules(filename):
|
||||
"""
|
||||
Return the path to top-level directory that contains Python modules.
|
||||
|
||||
It will look in parent directories for __init__.py files. The first parent directory without __init__.py is the
|
||||
top-level directory.
|
||||
|
||||
The returned directory might be used to extend PYTHONPATH.
|
||||
"""
|
||||
curr_dir = os.path.dirname(os.path.abspath(filename))
|
||||
pattern = '__init__.py'
|
||||
|
||||
# Try max. 10 levels up.
|
||||
try:
|
||||
for i in range(10):
|
||||
files = set(os.listdir(curr_dir))
|
||||
# 'curr_dir' is still not top-level; go to parent dir.
|
||||
if pattern in files:
|
||||
curr_dir = os.path.dirname(curr_dir)
|
||||
# Top-level dir found; return it.
|
||||
else:
|
||||
return curr_dir
|
||||
except IOError:
|
||||
pass
|
||||
# No top-level directory found, or error was encountered.
|
||||
return None
|
||||
|
||||
|
||||
def mtime(fnm):
|
||||
try:
|
||||
# TODO: explain why this does not use os.path.getmtime() ?
|
||||
# - It is probably not used because it returns float and not int.
|
||||
return os.stat(fnm)[8]
|
||||
except Exception:
|
||||
return 0
|
||||
|
||||
|
||||
def save_py_data_struct(filename, data):
|
||||
"""
|
||||
Save data into text file as Python data structure.
|
||||
:param filename:
|
||||
:param data:
|
||||
:return:
|
||||
"""
|
||||
dirname = os.path.dirname(filename)
|
||||
if not os.path.exists(dirname):
|
||||
os.makedirs(dirname)
|
||||
with open(filename, 'w', encoding='utf-8') as f:
|
||||
pprint.pprint(data, f)
|
||||
|
||||
|
||||
def load_py_data_struct(filename):
|
||||
"""
|
||||
Load data saved as python code and interpret that code.
|
||||
:param filename:
|
||||
:return:
|
||||
"""
|
||||
with open(filename, 'r', encoding='utf-8') as f:
|
||||
if is_win:
|
||||
# import versioninfo so that VSVersionInfo can parse correctly.
|
||||
from PyInstaller.utils.win32 import versioninfo # noqa: F401
|
||||
|
||||
return eval(f.read())
|
||||
|
||||
|
||||
def absnormpath(apath):
|
||||
return os.path.abspath(os.path.normpath(apath))
|
||||
|
||||
|
||||
def module_parent_packages(full_modname):
|
||||
"""
|
||||
Return list of parent package names.
|
||||
'aaa.bb.c.dddd' -> ['aaa', 'aaa.bb', 'aaa.bb.c']
|
||||
:param full_modname: Full name of a module.
|
||||
:return: List of parent module names.
|
||||
"""
|
||||
prefix = ''
|
||||
parents = []
|
||||
# Ignore the last component in module name and get really just parent, grandparent, great grandparent, etc.
|
||||
for pkg in full_modname.split('.')[0:-1]:
|
||||
# Ensure that first item does not start with dot '.'
|
||||
prefix += '.' + pkg if prefix else pkg
|
||||
parents.append(prefix)
|
||||
return parents
|
||||
|
||||
|
||||
def is_file_qt_plugin(filename):
|
||||
"""
|
||||
Check if the given file is a Qt plugin file.
|
||||
:param filename: Full path to file to check.
|
||||
:return: True if given file is a Qt plugin file, False if not.
|
||||
"""
|
||||
|
||||
# Check the file contents; scan for QTMETADATA string. The scan is based on the brute-force Windows codepath of
|
||||
# findPatternUnloaded() from qtbase/src/corelib/plugin/qlibrary.cpp in Qt5.
|
||||
with open(filename, 'rb') as fp:
|
||||
fp.seek(0, os.SEEK_END)
|
||||
end_pos = fp.tell()
|
||||
|
||||
SEARCH_CHUNK_SIZE = 8192
|
||||
QTMETADATA_MAGIC = b'QTMETADATA '
|
||||
|
||||
magic_offset = -1
|
||||
while end_pos >= len(QTMETADATA_MAGIC):
|
||||
start_pos = max(end_pos - SEARCH_CHUNK_SIZE, 0)
|
||||
chunk_size = end_pos - start_pos
|
||||
# Is the remaining chunk large enough to hold the pattern?
|
||||
if chunk_size < len(QTMETADATA_MAGIC):
|
||||
break
|
||||
# Read and scan the chunk
|
||||
fp.seek(start_pos, os.SEEK_SET)
|
||||
buf = fp.read(chunk_size)
|
||||
pos = buf.rfind(QTMETADATA_MAGIC)
|
||||
if pos != -1:
|
||||
magic_offset = start_pos + pos
|
||||
break
|
||||
# Adjust search location for next chunk; ensure proper overlap.
|
||||
end_pos = start_pos + len(QTMETADATA_MAGIC) - 1
|
||||
if magic_offset == -1:
|
||||
return False
|
||||
|
||||
return True
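# Illustrative usage (path is made up):
#
#   is_file_qt_plugin("/opt/qt/plugins/platforms/libqxcb.so")  # -> True for Qt plugins
#
# Note that the backwards chunked scan re-reads len(QTMETADATA_MAGIC) - 1 bytes of overlap
# between consecutive chunks, so the magic string is found even when it straddles a chunk
# boundary.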
|
||||
|
||||
|
||||
BOM_MARKERS_TO_DECODERS = {
|
||||
codecs.BOM_UTF32_LE: codecs.utf_32_le_decode,
|
||||
codecs.BOM_UTF32_BE: codecs.utf_32_be_decode,
|
||||
codecs.BOM_UTF32: codecs.utf_32_decode,
|
||||
codecs.BOM_UTF16_LE: codecs.utf_16_le_decode,
|
||||
codecs.BOM_UTF16_BE: codecs.utf_16_be_decode,
|
||||
codecs.BOM_UTF16: codecs.utf_16_decode,
|
||||
codecs.BOM_UTF8: codecs.utf_8_decode,
|
||||
}
|
||||
BOM_RE = re.compile(rb"\A(%s)?(.*)" % b"|".join(map(re.escape, BOM_MARKERS_TO_DECODERS)), re.DOTALL)
|
||||
|
||||
|
||||
def decode(raw: bytes):
|
||||
"""
|
||||
Decode bytes to string, respecting and removing any byte-order marks if present, or respecting but not removing any
|
||||
PEP263 encoding comments (# encoding: cp1252).
|
||||
"""
|
||||
bom, raw = BOM_RE.match(raw).groups()
|
||||
if bom:
|
||||
return BOM_MARKERS_TO_DECODERS[bom](raw)[0]
|
||||
|
||||
encoding, _ = tokenize.detect_encoding(io.BytesIO(raw).readline)
|
||||
return raw.decode(encoding)
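# Minimal behavioural sketch (illustrative bytes):
#
#   >>> decode(codecs.BOM_UTF8 + b"x = 1\n")  # BOM is consumed, not kept
#   'x = 1\n'
#   >>> decode(b"# -*- coding: cp1252 -*-\nc = '\xe9'\n")  # PEP263 comment is respected
#   "# -*- coding: cp1252 -*-\nc = 'é'\n"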
|
||||
|
||||
|
||||
def is_iterable(arg):
|
||||
"""
|
||||
Check if the passed argument is an iterable.
|
||||
"""
|
||||
try:
|
||||
iter(arg)
|
||||
except TypeError:
|
||||
return False
|
||||
return True
|
||||
|
||||
|
||||
def path_to_parent_archive(filename):
|
||||
"""
|
||||
Check if the given file path points to a file inside an existing archive file. Returns first path from the set of
|
||||
parent paths that points to an existing file, or `None` if no such path exists (i.e., file is an actual stand-alone
|
||||
file).
|
||||
"""
|
||||
for parent in pathlib.Path(filename).parents:
|
||||
if parent.is_file():
|
||||
return parent
|
||||
return None
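# Illustrative example (paths are made up): for a file path that points inside an existing
# zip archive, the archive itself is returned:
#
#   path_to_parent_archive("/opt/app/bundle.zip/lib/module.py")
#   -> PosixPath('/opt/app/bundle.zip')   # provided bundle.zip is an actual file on disk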
|
||||
735
Utils/PythonNew32/Lib/site-packages/PyInstaller/utils/osx.py
Normal file
@@ -0,0 +1,735 @@
|
||||
#-----------------------------------------------------------------------------
|
||||
# Copyright (c) 2014-2023, PyInstaller Development Team.
|
||||
#
|
||||
# Distributed under the terms of the GNU General Public License (version 2
|
||||
# or later) with exception for distributing the bootloader.
|
||||
#
|
||||
# The full license is in the file COPYING.txt, distributed with this software.
|
||||
#
|
||||
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
|
||||
#-----------------------------------------------------------------------------
|
||||
"""
|
||||
Utils for macOS platform.
|
||||
"""
|
||||
|
||||
import math
|
||||
import os
|
||||
import pathlib
|
||||
import subprocess
|
||||
import shutil
|
||||
import tempfile
|
||||
|
||||
from macholib.mach_o import (
|
||||
LC_BUILD_VERSION,
|
||||
LC_CODE_SIGNATURE,
|
||||
LC_ID_DYLIB,
|
||||
LC_LOAD_DYLIB,
|
||||
LC_LOAD_UPWARD_DYLIB,
|
||||
LC_LOAD_WEAK_DYLIB,
|
||||
LC_PREBOUND_DYLIB,
|
||||
LC_REEXPORT_DYLIB,
|
||||
LC_RPATH,
|
||||
LC_SEGMENT_64,
|
||||
LC_SYMTAB,
|
||||
LC_UUID,
|
||||
LC_VERSION_MIN_MACOSX,
|
||||
)
|
||||
from macholib.MachO import MachO
|
||||
import macholib.util
|
||||
|
||||
import PyInstaller.log as logging
|
||||
from PyInstaller import compat
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def is_homebrew_env():
|
||||
"""
|
||||
Check if Python interpreter was installed via Homebrew command 'brew'.
|
||||
|
||||
:return: True if Homebrew, False otherwise.
|
||||
"""
|
||||
# Python path prefix should start with Homebrew prefix.
|
||||
env_prefix = get_homebrew_prefix()
|
||||
if env_prefix and compat.base_prefix.startswith(env_prefix):
|
||||
return True
|
||||
return False
|
||||
|
||||
|
||||
def is_macports_env():
|
||||
"""
|
||||
Check if Python interpreter was installed via Macports command 'port'.
|
||||
|
||||
:return: True if Macports, False otherwise.
|
||||
"""
|
||||
# Python path prefix should start with Macports prefix.
|
||||
env_prefix = get_macports_prefix()
|
||||
if env_prefix and compat.base_prefix.startswith(env_prefix):
|
||||
return True
|
||||
return False
|
||||
|
||||
|
||||
def get_homebrew_prefix():
|
||||
"""
|
||||
:return: Root path of the Homebrew environment.
|
||||
"""
|
||||
prefix = shutil.which('brew')
if prefix is None:
    return None  # 'brew' executable not found; there is no Homebrew prefix
|
||||
# Conversion: /usr/local/bin/brew -> /usr/local
|
||||
prefix = os.path.dirname(os.path.dirname(prefix))
|
||||
return prefix
|
||||
|
||||
|
||||
def get_macports_prefix():
|
||||
"""
|
||||
:return: Root path of the Macports environment.
|
||||
"""
|
||||
prefix = shutil.which('port')
if prefix is None:
    return None  # 'port' executable not found; there is no Macports prefix
|
||||
# Conversion: /usr/local/bin/port -> /usr/local
|
||||
prefix = os.path.dirname(os.path.dirname(prefix))
|
||||
return prefix
|
||||
|
||||
|
||||
def _find_version_cmd(header):
|
||||
"""
|
||||
Helper that finds the version command in the given MachO header.
|
||||
"""
|
||||
# The SDK version is stored in LC_BUILD_VERSION command (used when targeting the latest versions of macOS) or in
|
||||
# older LC_VERSION_MIN_MACOSX command. Check for presence of either.
|
||||
version_cmd = [cmd for cmd in header.commands if cmd[0].cmd in {LC_BUILD_VERSION, LC_VERSION_MIN_MACOSX}]
|
||||
assert len(version_cmd) == 1, \
|
||||
f"Expected exactly one LC_BUILD_VERSION or LC_VERSION_MIN_MACOSX command, found {len(version_cmd)}!"
|
||||
return version_cmd[0]
|
||||
|
||||
|
||||
def get_macos_sdk_version(filename):
|
||||
"""
|
||||
Obtain the version of macOS SDK against which the given binary was built.
|
||||
|
||||
NOTE: currently, version is retrieved only from the first arch slice in the binary.
|
||||
|
||||
:return: (major, minor, revision) tuple
|
||||
"""
|
||||
binary = MachO(filename)
|
||||
header = binary.headers[0]
|
||||
# Find version command using helper
|
||||
version_cmd = _find_version_cmd(header)
|
||||
return _hex_triplet(version_cmd[1].sdk)
|
||||
|
||||
|
||||
def _hex_triplet(version):
|
||||
# Parse SDK version number
|
||||
major = (version & 0xFF0000) >> 16
|
||||
minor = (version & 0xFF00) >> 8
|
||||
revision = (version & 0xFF)
|
||||
return major, minor, revision
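# The packed format is 0x00XXYYZZ, one byte each for major, minor, and revision; e.g.,
#
#   >>> _hex_triplet(0x000A0F06)
#   (10, 15, 6)
#
# which corresponds to the macOS 10.15.6 SDK.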
|
||||
|
||||
|
||||
def macosx_version_min(filename: str) -> tuple:
|
||||
"""
|
||||
Get the -macosx-version-min used to compile a macOS binary.
|
||||
|
||||
For fat binaries, the minimum version is selected.
|
||||
"""
|
||||
versions = []
|
||||
for header in MachO(filename).headers:
|
||||
cmd = _find_version_cmd(header)
|
||||
if cmd[0].cmd == LC_VERSION_MIN_MACOSX:
|
||||
versions.append(cmd[1].version)
|
||||
else:
|
||||
# macOS >= 10.14 uses LC_BUILD_VERSION instead.
|
||||
versions.append(cmd[1].minos)
|
||||
|
||||
return min(map(_hex_triplet, versions))
|
||||
|
||||
|
||||
def set_macos_sdk_version(filename, major, minor, revision):
|
||||
"""
|
||||
Overwrite the macOS SDK version declared in the given binary with the specified version.
|
||||
|
||||
NOTE: currently, only version in the first arch slice is modified.
|
||||
"""
|
||||
# Validate values
|
||||
assert 0 <= major <= 255, "Invalid major version value!"
|
||||
assert 0 <= minor <= 255, "Invalid minor version value!"
|
||||
assert 0 <= revision <= 255, "Invalid revision value!"
|
||||
# Open binary
|
||||
binary = MachO(filename)
|
||||
header = binary.headers[0]
|
||||
# Find version command using helper
|
||||
version_cmd = _find_version_cmd(header)
|
||||
# Write new SDK version number
|
||||
version_cmd[1].sdk = major << 16 | minor << 8 | revision
|
||||
# Write changes back.
|
||||
with open(binary.filename, 'rb+') as fp:
|
||||
binary.write(fp)
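# Illustrative usage (path is made up): rewrite the declared SDK version of the first
# arch slice to 10.13.0, e.g., to relax an overly-new SDK declaration:
#
#   set_macos_sdk_version("dist/app/libfoo.dylib", 10, 13, 0)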
|
||||
|
||||
|
||||
def fix_exe_for_code_signing(filename):
|
||||
"""
|
||||
Fixes the Mach-O headers to make code signing possible.
|
||||
|
||||
Code signing on macOS does not work out of the box with embedding .pkg archive into the executable.
|
||||
|
||||
The fix is done this way:
|
||||
- Make the embedded .pkg archive part of the Mach-O 'String Table'. 'String Table' is at the end of the macOS exe file,
|
||||
so just change the size of the table to cover the end of the file.
|
||||
- Fix the size of the __LINKEDIT segment.
|
||||
|
||||
Note: the above fix works only if the single-arch thin executable or the last arch slice in a multi-arch fat
|
||||
executable is not signed, because LC_CODE_SIGNATURE comes after LC_SYMTAB, and because modification of headers
|
||||
invalidates the code signature. On modern arm64 macOS, code signature is mandatory, and therefore compilers
|
||||
create a dummy signature when executable is built. In such cases, that signature needs to be removed before this
|
||||
function is called.
|
||||
|
||||
Mach-O format specification: http://developer.apple.com/documentation/Darwin/Reference/ManPages/man5/Mach-O.5.html
|
||||
"""
|
||||
# Estimate the file size after data was appended
|
||||
file_size = os.path.getsize(filename)
|
||||
|
||||
# Take the last available header. A single-arch thin binary contains a single slice, while a multi-arch fat binary
|
||||
# contains multiple, and we need to modify the last one, which is adjacent to the appended data.
|
||||
executable = MachO(filename)
|
||||
header = executable.headers[-1]
|
||||
|
||||
# Sanity check: ensure the executable slice is not signed (otherwise signature's section comes last in the
|
||||
# __LINKEDIT segment).
|
||||
sign_sec = [cmd for cmd in header.commands if cmd[0].cmd == LC_CODE_SIGNATURE]
|
||||
assert len(sign_sec) == 0, "Executable contains code signature!"
|
||||
|
||||
# Find __LINKEDIT segment by name (16-byte zero padded string)
|
||||
__LINKEDIT_NAME = b'__LINKEDIT\x00\x00\x00\x00\x00\x00'
|
||||
linkedit_seg = [cmd for cmd in header.commands if cmd[0].cmd == LC_SEGMENT_64 and cmd[1].segname == __LINKEDIT_NAME]
|
||||
assert len(linkedit_seg) == 1, "Expected exactly one __LINKEDIT segment!"
|
||||
linkedit_seg = linkedit_seg[0][1] # Take the segment command entry
|
||||
# Find SYMTAB section
|
||||
symtab_sec = [cmd for cmd in header.commands if cmd[0].cmd == LC_SYMTAB]
|
||||
assert len(symtab_sec) == 1, "Expected exactly one SYMTAB section!"
|
||||
symtab_sec = symtab_sec[0][1] # Take the symtab command entry
|
||||
|
||||
# The string table is located at the end of the SYMTAB section, which in turn is the last section in the __LINKEDIT
|
||||
# segment. Therefore, the end of SYMTAB section should be aligned with the end of __LINKEDIT segment, and in turn
|
||||
# both should be aligned with the end of the file (as we are in the last or the only arch slice).
|
||||
#
|
||||
# However, when removing the signature from the executable using codesign under macOS 10.13, the codesign utility
|
||||
# may produce an invalid file, with the declared length of the __LINKEDIT segment (linkedit_seg.filesize) pointing
|
||||
# beyond the end of file, as reported in issue #6167.
|
||||
#
|
||||
# We can compensate for that by not using the declared sizes anywhere, and simply recompute them. In the final
|
||||
# binary, the __LINKEDIT segment and the SYMTAB section MUST end at the end of the file (otherwise, we have bigger
|
||||
# issues...). So simply recompute the declared sizes as difference between the final file length and the
|
||||
# corresponding start offset (NOTE: the offset is relative to start of the slice, which is stored in header.offset.
|
||||
# In thin binaries, header.offset is zero and start offset is relative to the start of file, but with fat binaries,
|
||||
# header.offset is non-zero)
|
||||
symtab_sec.strsize = file_size - (header.offset + symtab_sec.stroff)
|
||||
linkedit_seg.filesize = file_size - (header.offset + linkedit_seg.fileoff)
|
||||
|
||||
# Compute new vmsize by rounding filesize up to full page size.
|
||||
page_size = (0x4000 if _get_arch_string(header.header).startswith('arm64') else 0x1000)
|
||||
linkedit_seg.vmsize = math.ceil(linkedit_seg.filesize / page_size) * page_size
|
||||
|
||||
# NOTE: according to spec, segments need to be aligned to page boundaries: 0x4000 (16 kB) for arm64, 0x1000 (4 kB)
|
||||
# for other arches. But it seems we can get away without rounding and padding the segment file size - perhaps
|
||||
# because it is the last one?
|
||||
|
||||
# Write changes
|
||||
with open(filename, 'rb+') as fp:
|
||||
executable.write(fp)
|
||||
|
||||
# In fat binaries, we also need to adjust the fat header. macholib as of version 1.14 does not support this, so we
|
||||
# need to do it ourselves...
|
||||
if executable.fat:
|
||||
from macholib.mach_o import (FAT_MAGIC, FAT_MAGIC_64, fat_arch, fat_arch64, fat_header)
|
||||
with open(filename, 'rb+') as fp:
|
||||
# Taken from MachO.load_fat() implementation. The fat header's signature has already been validated when we
|
||||
# loaded the file for the first time.
|
||||
fat = fat_header.from_fileobj(fp)
|
||||
if fat.magic == FAT_MAGIC:
|
||||
archs = [fat_arch.from_fileobj(fp) for i in range(fat.nfat_arch)]
|
||||
elif fat.magic == FAT_MAGIC_64:
|
||||
archs = [fat_arch64.from_fileobj(fp) for i in range(fat.nfat_arch)]
|
||||
# Adjust the size in the fat header for the last slice.
|
||||
arch = archs[-1]
|
||||
arch.size = file_size - arch.offset
|
||||
# Now write the fat headers back to the file.
|
||||
fp.seek(0)
|
||||
fat.to_fileobj(fp)
|
||||
for arch in archs:
|
||||
arch.to_fileobj(fp)
|
||||
|
||||
|
||||
def _get_arch_string(header):
|
||||
"""
|
||||
Converts cputype and cpusubtype from mach_o.mach_header_64 into an arch string compatible with lipo/codesign.
|
||||
The list of supported architectures can be found in man(1) arch.
|
||||
"""
|
||||
# NOTE: the constants below are taken from macholib.mach_o
|
||||
cputype = header.cputype
|
||||
cpusubtype = header.cpusubtype & 0x0FFFFFFF
|
||||
if cputype == 0x01000000 | 7:
|
||||
if cpusubtype == 8:
|
||||
return 'x86_64h' # 64-bit intel (haswell)
|
||||
else:
|
||||
return 'x86_64' # 64-bit intel
|
||||
elif cputype == 0x01000000 | 12:
|
||||
if cpusubtype == 2:
|
||||
return 'arm64e'
|
||||
else:
|
||||
return 'arm64'
|
||||
elif cputype == 7:
|
||||
return 'i386' # 32-bit intel
|
||||
assert False, 'Unhandled architecture!'
|
||||
|
||||
|
||||
def update_exe_identifier(filename, pkg_filename):
|
||||
"""
|
||||
Modifies the Mach-O image UUID stored in the LC_UUID command (if present) in order to ensure that different
|
||||
frozen applications have different identifiers. See TN3178 for details on why this is required:
|
||||
https://developer.apple.com/documentation/technotes/tn3178-checking-for-and-resolving-build-uuid-problems
|
||||
"""
|
||||
|
||||
# Compute hash of the PKG
|
||||
import hashlib
|
||||
pkg_hash = hashlib.sha1()
|
||||
with open(pkg_filename, 'rb') as fp:
|
||||
for chunk in iter(lambda: fp.read(8192), b""):
|
||||
pkg_hash.update(chunk)
|
||||
|
||||
# Modify UUID in all arch slices of the executable.
|
||||
executable = MachO(filename)
|
||||
for header in executable.headers:
|
||||
# Find LC_UUID command
|
||||
uuid_cmd = [cmd for cmd in header.commands if cmd[0].cmd == LC_UUID]
|
||||
if not uuid_cmd:
|
||||
continue
|
||||
uuid_cmd = uuid_cmd[0]
|
||||
|
||||
# Read the existing UUID (which is based on bootloader executable itself).
|
||||
original_uuid = uuid_cmd[1].uuid
|
||||
|
||||
# Add original UUID to the hash; this is similar to what UUID v3/v5 do with namespace + name, except
|
||||
# that in our case, the prefix UUID (namespace) is added at the end, so that PKG hash needs to be
|
||||
# (pre)computed only once.
|
||||
combined_hash = pkg_hash.copy()
|
||||
combined_hash.update(original_uuid)
|
||||
|
||||
new_uuid = combined_hash.digest()[:16] # Same as uuid.uuid3() / uuid.uuid5().
|
||||
assert len(new_uuid) == 16
|
||||
|
||||
uuid_cmd[1].uuid = new_uuid
|
||||
|
||||
# Write changes
|
||||
with open(filename, 'rb+') as fp:
|
||||
executable.write(fp)
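# Conceptual sketch of the UUID derivation performed above (standalone, illustrative;
# `pkg_bytes` and `original_uuid` are stand-in names):
#
#   import hashlib
#   h = hashlib.sha1(pkg_bytes)   # digest of the embedded PKG archive
#   h.update(original_uuid)       # mix in the bootloader's original LC_UUID value
#   new_uuid = h.digest()[:16]    # truncate to 16 bytes, as uuid3()/uuid5() do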
|
||||
|
||||
|
||||
class InvalidBinaryError(Exception):
|
||||
"""
|
||||
Exception raised by `get_binary_architectures` when it is passed an invalid binary.
|
||||
"""
|
||||
pass
|
||||
|
||||
|
||||
class IncompatibleBinaryArchError(Exception):
|
||||
"""
|
||||
Exception raised by `binary_to_target_arch` when the passed binary fails the strict architecture check.
|
||||
"""
|
||||
def __init__(self, message):
|
||||
url = "https://pyinstaller.org/en/stable/feature-notes.html#macos-multi-arch-support"
|
||||
super().__init__(f"{message} For details about this error message, see: {url}")
|
||||
|
||||
|
||||
def get_binary_architectures(filename):
|
||||
"""
|
||||
Inspects the given binary and returns tuple (is_fat, archs), where is_fat is boolean indicating fat/thin binary,
|
||||
and archs is a list of architectures with lipo/codesign-compatible names.
|
||||
"""
|
||||
try:
|
||||
executable = MachO(filename)
|
||||
except ValueError as e:
|
||||
raise InvalidBinaryError("Invalid Mach-O binary!") from e
|
||||
return bool(executable.fat), [_get_arch_string(hdr.header) for hdr in executable.headers]
|
||||
|
||||
|
||||
def convert_binary_to_thin_arch(filename, thin_arch, output_filename=None):
|
||||
"""
|
||||
Convert the given fat binary into thin one with the specified target architecture.
|
||||
"""
|
||||
output_filename = output_filename or filename
|
||||
cmd_args = ['lipo', '-thin', thin_arch, filename, '-output', output_filename]
|
||||
p = subprocess.run(cmd_args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, encoding='utf-8')
|
||||
if p.returncode:
|
||||
raise SystemError(f"lipo command ({cmd_args}) failed with error code {p.returncode}!\noutput: {p.stdout}")
|
||||
|
||||
|
||||
def merge_into_fat_binary(output_filename, *slice_filenames):
|
||||
"""
|
||||
Merge the given single-arch thin binary files into a fat binary.
|
||||
"""
|
||||
cmd_args = ['lipo', '-create', '-output', output_filename, *slice_filenames]
|
||||
p = subprocess.run(cmd_args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, encoding='utf-8')
|
||||
if p.returncode:
|
||||
raise SystemError(f"lipo command ({cmd_args}) failed with error code {p.returncode}!\noutput: {p.stdout}")
|
||||
|
||||
|
||||
def binary_to_target_arch(filename, target_arch, display_name=None):
|
||||
"""
|
||||
Check that the given binary contains required architecture slice(s) and convert the fat binary into thin one,
|
||||
if necessary.
|
||||
"""
|
||||
if not display_name:
|
||||
display_name = filename # Same as input file
|
||||
# Check the binary
|
||||
is_fat, archs = get_binary_architectures(filename)
|
||||
if target_arch == 'universal2':
|
||||
if not is_fat:
|
||||
raise IncompatibleBinaryArchError(f"{display_name} is not a fat binary!")
|
||||
# Assume fat binary is universal2; nothing to do
|
||||
else:
|
||||
if is_fat:
|
||||
if target_arch not in archs:
|
||||
raise IncompatibleBinaryArchError(f"{display_name} does not contain slice for {target_arch}!")
|
||||
# Convert to thin arch
|
||||
logger.debug("Converting fat binary %s (%s) to thin binary (%s)", filename, display_name, target_arch)
|
||||
convert_binary_to_thin_arch(filename, target_arch)
|
||||
else:
|
||||
if target_arch not in archs:
|
||||
raise IncompatibleBinaryArchError(
|
||||
f"{display_name} is incompatible with target arch {target_arch} (has arch: {archs[0]})!"
|
||||
)
|
||||
# Binary has correct arch; nothing to do
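# Illustrative usage (paths are made up): ensure a collected binary matches the requested
# target architecture, thinning it in-place when necessary:
#
#   binary_to_target_arch("build/libfoo.dylib", "arm64", display_name="libfoo.dylib")
#
# This raises IncompatibleBinaryArchError if libfoo.dylib has no arm64 slice, and thins a
# fat binary in-place if it contains one.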
|
||||
|
||||
|
||||
def remove_signature_from_binary(filename):
|
||||
"""
|
||||
Remove the signature from all architecture slices of the given binary file using the codesign utility.
|
||||
"""
|
||||
logger.debug("Removing signature from file %r", filename)
|
||||
cmd_args = ['/usr/bin/codesign', '--remove', '--all-architectures', filename]
|
||||
p = subprocess.run(cmd_args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, encoding='utf-8')
|
||||
if p.returncode:
|
||||
raise SystemError(f"codesign command ({cmd_args}) failed with error code {p.returncode}!\noutput: {p.stdout}")
|
||||
|
||||
|
||||
def sign_binary(filename, identity=None, entitlements_file=None, deep=False):
|
||||
"""
|
||||
Sign the binary using codesign utility. If no identity is provided, ad-hoc signing is performed.
|
||||
"""
|
||||
extra_args = []
|
||||
if not identity:
|
||||
identity = '-' # ad-hoc signing
|
||||
else:
|
||||
extra_args.append('--options=runtime') # hardened runtime
|
||||
if entitlements_file:
|
||||
extra_args.append('--entitlements')
|
||||
extra_args.append(entitlements_file)
|
||||
if deep:
|
||||
extra_args.append('--deep')
|
||||
|
||||
logger.debug("Signing file %r", filename)
|
||||
cmd_args = [
|
||||
'/usr/bin/codesign', '-s', identity, '--force', '--all-architectures', '--timestamp', *extra_args, filename
|
||||
]
|
||||
p = subprocess.run(cmd_args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, encoding='utf-8')
|
||||
if p.returncode:
|
||||
raise SystemError(f"codesign command ({cmd_args}) failed with error code {p.returncode}!\noutput: {p.stdout}")
|
||||
|
||||
|
||||
def set_dylib_dependency_paths(filename, target_rpath):
|
||||
"""
|
||||
Modify the given dylib's identity (in LC_ID_DYLIB command) and the paths to dependent dylibs (in LC_LOAD_DYLIB)
|
||||
commands into `@rpath/<basename>` format, remove any existing rpaths (LC_RPATH commands), and add a new rpath
|
||||
(LC_RPATH command) with the specified path.
|
||||
|
||||
Uses the `install_name_tool` utility to make the changes.
|
||||
|
||||
The system libraries (e.g., the ones found in /usr/lib) are exempted from path rewrite.
|
||||
|
||||
For multi-arch fat binaries, this function extracts each slice into a temporary file, processes it separately,
|
||||
and then merges all processed slices back into a fat binary. This is necessary because `install_name_tool` cannot
|
||||
modify rpaths in cases when an existing rpath is present only in one slice.
|
||||
"""
|
||||
|
||||
# Check if we are dealing with a fat binary; the `install-name-tool` seems to be unable to remove an rpath that is
|
||||
# present only in one slice, so we need to extract each slice, process it separately, and then stitch the processed
|
||||
# slices back into a fat binary.
|
||||
is_fat, archs = get_binary_architectures(filename)
|
||||
|
||||
if is_fat:
|
||||
with tempfile.TemporaryDirectory() as tmpdir:
|
||||
slice_filenames = []
|
||||
for arch in archs:
|
||||
slice_filename = os.path.join(tmpdir, arch)
|
||||
convert_binary_to_thin_arch(filename, arch, output_filename=slice_filename)
|
||||
_set_dylib_dependency_paths(slice_filename, target_rpath)
|
||||
slice_filenames.append(slice_filename)
|
||||
merge_into_fat_binary(filename, *slice_filenames)
|
||||
else:
|
||||
# Thin binary - we can process it directly
|
||||
_set_dylib_dependency_paths(filename, target_rpath)
|
||||
|
||||
|
||||
def _set_dylib_dependency_paths(filename, target_rpath):
|
||||
"""
|
||||
The actual implementation of set_dylib_dependency_paths functionality.
|
||||
|
||||
Implicitly assumes that a single-arch thin binary is given.
|
||||
"""
|
||||
|
||||
    # Relocatable commands that we should overwrite - same list as used by `macholib`.
    _RELOCATABLE = {
        LC_LOAD_DYLIB,
        LC_LOAD_UPWARD_DYLIB,
        LC_LOAD_WEAK_DYLIB,
        LC_PREBOUND_DYLIB,
        LC_REEXPORT_DYLIB,
    }

    # Parse the dylib's header to extract the following commands:
    #  - LC_LOAD_DYLIB (or any member of the _RELOCATABLE set): dylib load commands (dependent libraries)
    #  - LC_RPATH: rpath definitions
    #  - LC_ID_DYLIB: dylib's identity
    binary = MachO(filename)

    dylib_id = None
    rpaths = set()
    linked_libs = set()

    for header in binary.headers:
        for cmd in header.commands:
            lc_type = cmd[0].cmd
            if lc_type not in _RELOCATABLE and lc_type not in {LC_RPATH, LC_ID_DYLIB}:
                continue

            # Decode the path, strip trailing NULL characters
            path = cmd[2].decode('utf-8').rstrip('\x00')

            if lc_type in _RELOCATABLE:
                linked_libs.add(path)
            elif lc_type == LC_RPATH:
                rpaths.add(path)
            elif lc_type == LC_ID_DYLIB:
                dylib_id = path

    del binary

    # If the dylib has an identifier set, compute the normalized version, in the form of `@rpath/basename`.
    normalized_dylib_id = None
    if dylib_id:
        normalized_dylib_id = str(pathlib.PurePath('@rpath') / pathlib.PurePath(dylib_id).name)

    # Find dependent libraries that should have their prefix path changed to `@rpath`. If any dependent libraries
    # end up using `@rpath` (originally or due to rewrite), set the `rpath_required` boolean to True, so we know
    # that we need to add our rpath.
    changed_lib_paths = []
    rpath_required = False
    for linked_lib in linked_libs:
        # Leave system dynamic libraries unchanged.
        if macholib.util.in_system_path(linked_lib):
            continue

        # The older python.org builds that use the system Tcl/Tk framework have their _tkinter.cpython-*-darwin.so
        # library linked against /Library/Frameworks/Tcl.framework/Versions/8.5/Tcl and
        # /Library/Frameworks/Tk.framework/Versions/8.5/Tk, although the actual frameworks are located in
        # /System/Library/Frameworks. Therefore, they slip through the above in_system_path() check, and we need to
        # exempt them manually.
        _exemptions = [
            '/Library/Frameworks/Tcl.framework/',
            '/Library/Frameworks/Tk.framework/',
        ]
        if any(x in linked_lib for x in _exemptions):
            continue

        # This linked library will end up using `@rpath`, whether modified or not...
        rpath_required = True

        new_path = str(pathlib.PurePath('@rpath') / pathlib.PurePath(linked_lib).name)
        if linked_lib == new_path:
            continue

        changed_lib_paths.append((linked_lib, new_path))

    # Gather arguments for `install-name-tool`
    install_name_tool_args = []

    # Modify the dylib identifier if necessary
    if normalized_dylib_id and normalized_dylib_id != dylib_id:
        install_name_tool_args += ["-id", normalized_dylib_id]

    # Changed libs
    for original_path, new_path in changed_lib_paths:
        install_name_tool_args += ["-change", original_path, new_path]

    # Remove all existing rpaths except for the target rpath (if it already exists). `install_name_tool` disallows
    # using `-delete_rpath` and `-add_rpath` with the same argument.
    for rpath in rpaths:
        if rpath == target_rpath:
            continue
        install_name_tool_args += [
            "-delete_rpath",
            rpath,
        ]

    # If any of the linked libraries use @rpath now and our target rpath is not already added, add it.
    # NOTE: @rpath in the dylib identifier does not actually require the rpath to be set on the binary...
    if rpath_required and target_rpath not in rpaths:
        install_name_tool_args += [
            "-add_rpath",
            target_rpath,
        ]

    # If we have no arguments, finish immediately.
    if not install_name_tool_args:
        return

    # Run `install_name_tool`
    cmd_args = ["install_name_tool", *install_name_tool_args, filename]
    p = subprocess.run(cmd_args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, encoding='utf-8')
    if p.returncode:
        raise SystemError(
            f"install_name_tool command ({cmd_args}) failed with error code {p.returncode}!\noutput: {p.stdout}"
        )
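

# Since the function above shells out to `install_name_tool`, it can be handy to spot-check the result by hand.
# A minimal verification sketch, assuming macOS with the Xcode command-line tools installed (so `otool` is
# available); the helper name and the output-parsing heuristic are ours, not part of PyInstaller's API. It relies
# on the module's existing `subprocess` import.
def _list_rpaths(binary_path):
    """Return the LC_RPATH paths recorded in the given Mach-O binary (illustrative helper)."""
    output = subprocess.run(
        ["otool", "-l", binary_path],
        stdout=subprocess.PIPE, encoding="utf-8", check=True,
    ).stdout
    rpaths = []
    lines = output.splitlines()
    for i, line in enumerate(lines):
        # Each `cmd LC_RPATH` load command is followed within a few lines by a `path <value> (offset N)` line.
        if "LC_RPATH" in line:
            for follow in lines[i:i + 4]:
                follow = follow.strip()
                if follow.startswith("path "):
                    rpaths.append(follow.split()[1])
    return rpaths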


def is_framework_bundle_lib(lib_path):
    """
    Check if the given shared library is part of a .framework bundle.
    """

    lib_path = pathlib.PurePath(lib_path)

    # For now, focus only on the versioned layout, such as `QtCore.framework/Versions/5/QtCore`.
    if lib_path.parent.parent.name != "Versions":
        return False
    if lib_path.parent.parent.parent.name != lib_path.name + ".framework":
        return False

    return True
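

# The layout rule encoded above is purely lexical, so it can be exercised directly; the paths below are
# illustrative and do not need to exist on disk.
def _demo_framework_layout_check():
    assert is_framework_bundle_lib("dist/app/QtCore.framework/Versions/5/QtCore") is True
    assert is_framework_bundle_lib("dist/app/libfoo.5.dylib") is False
    # A mismatch between the framework name and the library name also fails the check:
    assert is_framework_bundle_lib("dist/app/QtGui.framework/Versions/5/QtCore") is False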


def collect_files_from_framework_bundles(collected_files):
    """
    Scan the given TOC list of collected files for shared libraries that are collected from macOS .framework bundles,
    and collect the bundles' Info.plist files. Additionally, generate the following symbolic links:
      - `Versions/Current`, pointing to the `Versions/<version>` directory containing the binary
      - `<name>` in the top-level .framework directory, pointing to `Versions/Current/<name>`
      - `Resources` in the top-level .framework directory, pointing to `Versions/Current/Resources`
      - additional directories in the top-level .framework directory, pointing to their counterparts in the
        `Versions/Current` directory.

    Returns a TOC list for the discovered Info.plist files and generated symbolic links. The list does not contain
    duplicated entries.
    """
    invalid_framework_found = False

    framework_files = set()  # Additional entries for collected files. Use a set for de-duplication.
    framework_paths = set()  # Registered framework paths for the 2nd pass.

    # 1st pass: discover binaries from .framework bundles, and for each such binary:
    #  - collect `Info.plist`
    #  - create the `Current` -> `<version>` symlink in the `<name>.framework/Versions` directory
    #  - create the `<name>.framework/<name>` -> `<name>.framework/Versions/Current/<name>` symlink
    #  - create the `<name>.framework/Resources` -> `<name>.framework/Versions/Current/Resources` symlink
    for dest_name, src_name, typecode in collected_files:
        if typecode != 'BINARY':
            continue

        src_path = pathlib.Path(src_name)  # /src/path/to/<name>.framework/Versions/<version>/<name>
        dest_path = pathlib.PurePath(dest_name)  # /dest/path/to/<name>.framework/Versions/<version>/<name>

        # Check whether the binary originates from a .framework bundle
        if not is_framework_bundle_lib(src_path):
            continue

        # Check whether the binary is also collected into a .framework bundle (i.e., the original layout is preserved)
        if not is_framework_bundle_lib(dest_path):
            continue

        # Assuming versioned layout, Info.plist should exist in the Resources directory located next to the binary.
        info_plist_src = src_path.parent / "Resources" / "Info.plist"
        if not info_plist_src.is_file():
            # Alas, the .framework bundles shipped with PySide/PyQt might have Info.plist available only in the
            # top-level Resources directory. So accommodate this scenario as well, but collect the file into the
            # versioned directory to appease the code-signing gods...
            info_plist_src_top = src_path.parent.parent.parent / "Resources" / "Info.plist"
            if not info_plist_src_top.is_file():
                # Strictly speaking, a .framework bundle without Info.plist is invalid. However, that did not prevent
                # PyQt from shipping such Qt .framework bundles up until v5.14.1. So by default, we just complain via
                # a warning message; if such binaries work in unfrozen python, they should also work in a frozen
                # application. The codesign will refuse to sign the .app bundle (if we are generating one), but there
                # is nothing we can do about that.
                invalid_framework_found = True
                framework_dir = src_path.parent.parent.parent
                if compat.strict_collect_mode:
                    raise SystemError(f"Could not find Info.plist in {framework_dir}!")
                else:
                    logger.warning("Could not find Info.plist in %s!", framework_dir)
                continue
            info_plist_src = info_plist_src_top
        info_plist_dest = dest_path.parent / "Resources" / "Info.plist"
        framework_files.add((str(info_plist_dest), str(info_plist_src), "DATA"))

        # Reconstruct the symlink Versions/Current -> Versions/<version>.
        # This one seems to be necessary for code signing, but might be absent from .framework bundles shipped with
        # python packages. So we always create it ourselves.
        framework_files.add((str(dest_path.parent.parent / "Current"), str(dest_path.parent.name), "SYMLINK"))

        dest_framework_path = dest_path.parent.parent.parent  # Top-level .framework directory path.

        # Symlink the binary in the `Current` directory to the top-level .framework directory.
        framework_files.add((
            str(dest_framework_path / dest_path.name),
            str(pathlib.PurePath("Versions/Current") / dest_path.name),
            "SYMLINK",
        ))

        # Ditto for the `Resources` directory.
        framework_files.add((
            str(dest_framework_path / "Resources"),
            "Versions/Current/Resources",
            "SYMLINK",
        ))

        # Register the framework parent path, to be used in the additional-directories scan in the subsequent pass.
        framework_paths.add(dest_framework_path)

    # 2nd pass: scan for additional collected directories from .framework bundles, and create symlinks to the
    # top-level application directory. Make the outer loop go over the registered framework paths, so it becomes a
    # no-op if no framework paths are registered.
    VALID_SUBDIRS = {'Helpers', 'Resources'}

    for dest_framework_path in framework_paths:
        for dest_name, src_name, typecode in collected_files:
            dest_path = pathlib.PurePath(dest_name)

            # Try matching against the framework path
            try:
                remaining_path = dest_path.relative_to(dest_framework_path)
            except ValueError:  # dest_path is not a subpath of dest_framework_path
                continue

            remaining_path_parts = remaining_path.parts

            # We are interested only in entries under the Versions directory.
            if remaining_path_parts[0] != 'Versions':
                continue

            # If the entry name is among the valid sub-directory names, create a symlink.
            dir_name = remaining_path_parts[2]
            if dir_name not in VALID_SUBDIRS:
                continue

            framework_files.add((
                str(dest_framework_path / dir_name),
                str(pathlib.PurePath("Versions/Current") / dir_name),
                "SYMLINK",
            ))

    # If we encountered an invalid .framework bundle without Info.plist, warn the user that code-signing will most
    # likely fail.
    if invalid_framework_found:
        logger.warning(
            "One or more collected .framework bundles have a missing Info.plist file. If you are building an .app "
            "bundle, you will most likely not be able to code-sign it."
        )

    return sorted(framework_files)
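

# As a rough illustration of the contract (TOC triples in, TOC triples out), the following sketch feeds the
# function a single hypothetical Qt binary entry; the paths are made up for the example.
def _demo_collect_from_framework_bundles():
    # Hypothetical TOC entry: (dest_name, src_name, typecode). Both paths follow the versioned .framework layout,
    # so the first pass will pick the entry up.
    toc = [(
        "PyQt5/Qt5/lib/QtCore.framework/Versions/5/QtCore",
        "/usr/local/lib/python3.11/site-packages/PyQt5/Qt5/lib/QtCore.framework/Versions/5/QtCore",
        "BINARY",
    )]
    extra = collect_files_from_framework_bundles(toc)
    # If the bundle's Info.plist is found on disk, `extra` contains its DATA entry plus the `Versions/Current`,
    # `QtCore`, and `Resources` SYMLINK entries; otherwise the bundle is flagged as invalid and skipped.
    for dest, src, typecode in extra:
        print(typecode, dest, "->", src)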
@@ -0,0 +1,70 @@
# -----------------------------------------------------------------------------
# Copyright (c) 2005-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
# -----------------------------------------------------------------------------

import argparse
import sys

import pytest

from PyInstaller.compat import importlib_metadata


def paths_to_test(include_only=None):
    """
    If ``include_only`` is falsey, this function returns paths from all entry points. Otherwise, this parameter
    must be a string or a sequence of strings. In that case, this function returns *only* paths from entry points
    whose ``module`` begins with the provided string(s).
    """
    # Convert a string to a list.
    if isinstance(include_only, str):
        include_only = [include_only]

    # Walk through all entry points.
    test_path_list = []
    for entry_point in importlib_metadata.entry_points(group="pyinstaller40", name="tests"):
        # Implement ``include_only``.
        if (
            not include_only  # If falsey, include everything,
            # Otherwise, include only the specified modules.
            or any(entry_point.module.startswith(name) for name in include_only)
        ):
            test_path_list += list(entry_point.load()())
    return test_path_list
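

# The `pyinstaller40`/`tests` entry point consumed above is registered by third-party packages. A hedged sketch of
# what such a registration might look like in a hypothetical package's own setup.py (all names are illustrative,
# not an existing package):
#
#     from setuptools import setup
#
#     setup(
#         name="mypkg",
#         version="1.0",
#         entry_points={
#             "pyinstaller40": [
#                 # The referenced callable must return an iterable of test paths when invoked,
#                 # matching the `entry_point.load()()` call in paths_to_test().
#                 "tests = mypkg._pyinstaller:get_test_dirs",
#             ],
#         },
#     )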


# Run pytest on all tests registered by the PyInstaller setuptools testing entry point. If provided,
# the ``include_only`` argument is passed to ``paths_to_test``.
def run_pytest(*args, **kwargs):
    paths = paths_to_test(include_only=kwargs.pop("include_only", None))
    # Return an error code if no tests were discovered.
    if not paths:
        print("Error: no tests discovered.", file=sys.stderr)
        # This indicates that no tests were discovered; see
        # https://docs.pytest.org/en/latest/usage.html#possible-exit-codes.
        return 5
    else:
        # See https://docs.pytest.org/en/latest/usage.html#calling-pytest-from-python-code.
        # Omit ``args[0]``, which is the name of this script.
        print("pytest " + " ".join([*paths, *args[1:]]))
        return pytest.main([*paths, *args[1:]], **kwargs)


if __name__ == "__main__":
    # Look only for the ``--include_only`` argument.
    parser = argparse.ArgumentParser(description='Run PyInstaller packaging tests.')
    parser.add_argument(
        "--include_only",
        action="append",
        help="Only run tests from the specified package.",
    )
    args, unknown = parser.parse_known_args(sys.argv)
    # Convert the parsed args into a dict using ``vars(args)``.
    sys.exit(run_pytest(*unknown, **vars(args)))
Utils/PythonNew32/Lib/site-packages/PyInstaller/utils/tests.py (new file)
@@ -0,0 +1,112 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2005-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------
"""
Decorators for skipping PyInstaller tests when specific requirements are not met.
"""

import inspect
import sys
import textwrap

import pytest

from PyInstaller.utils.hooks import check_requirement

# Wrap some pytest decorators to be consistent in tests.
parametrize = pytest.mark.parametrize
skipif = pytest.mark.skipif
xfail = pytest.mark.xfail
skip = pytest.mark.skip

# Use these decorators to run the `pyi_builder` fixture only in onedir or only in onefile mode instead of both.
onedir_only = pytest.mark.parametrize('pyi_builder', ['onedir'], indirect=True)
onefile_only = pytest.mark.parametrize('pyi_builder', ['onefile'], indirect=True)


def importorskip(package: str):
    """
    Skip a decorated test if **package** is not importable.

    Arguments:
        package:
            The name of the module. May be anything that is allowed after the ``import`` keyword, e.g. 'numpy' or
            'PIL.Image'.
    Returns:
        A pytest marker which either skips the test or does nothing.

    This function intentionally does not import the module. Doing so can lead to ``sys.path`` and ``PATH`` being
    polluted, which then breaks later builds.
    """
    if not importable(package):
        return pytest.mark.skip(f"Can't import '{package}'.")
    return pytest.mark.skipif(False, reason=f"Don't skip: '{package}' is importable.")


def importable(package: str):
    from importlib.util import find_spec

    # The find_spec() function is used by the importlib machinery to locate a module to import. Using it finds the
    # module but does not run it. Unfortunately, it does import parent modules in order to check submodules.
    if "." in package:
        # Using subprocesses is slow. If the top-level module doesn't exist, we can skip the expensive check.
        if not importable(package.split(".")[0]):
            return False
        # This is a submodule; import it in isolation.
        from subprocess import DEVNULL, run
        return run([sys.executable, "-c", "import " + package], stdout=DEVNULL, stderr=DEVNULL).returncode == 0

    return find_spec(package) is not None
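

# A short usage sketch of the two helpers above. `importable` can be queried directly, while `importorskip` is
# meant to be applied as a test decorator; the test body is illustrative.
@importorskip("PIL.Image")
def _example_test_pillow_image(pyi_builder):
    pyi_builder.test_source("from PIL import Image; print(Image)")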


def requires(requirement: str):
    """
    Mark a test to be skipped if **requirement** is not satisfied.

    Args:
        requirement:
            A distribution name and optional version specifier(s). See
            :func:`PyInstaller.utils.hooks.check_requirement`, to which this argument is forwarded.
    Returns:
        Either a skip marker or a dummy marker.

    This function operates on distribution metadata, and does not import any modules.
    """
    if check_requirement(requirement):
        return pytest.mark.skipif(False, reason=f"Don't skip: '{requirement}' is satisfied.")
    else:
        return pytest.mark.skip(f"Requires {requirement}.")
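

# The distinction between the two decorators matters in practice: `requires` checks distribution metadata (name and
# version), while `importorskip` checks importability. A hedged illustration, with an arbitrary requirement string:
@requires("sqlalchemy >= 2.0")
def _example_test_sqlalchemy_v2(pyi_builder):
    pyi_builder.test_source("import sqlalchemy; print(sqlalchemy.__version__)")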


def gen_sourcefile(tmp_path, source, test_id=None):
    """
    Generate a source file for testing.

    The source will be written into a file named like the test function. This file will then be passed to
    `test_script`. If you need other related files, e.g. a `.toc` file for testing the content, put them at the
    normal place. Just mind to take the basename from the test function's name.

    :param source: Source code to create the executable from. This will be saved into a temporary file which is then
                   passed on to `test_script`.

    :param test_id: Test-id for parametrized tests. If given, it will be appended to the script filename,
                    separated by two underscores.
    """
    testname = inspect.stack()[1][3]
    if test_id:
        # For parametrized tests, append the test-id.
        testname = testname + '__' + test_id

    # Periods are not allowed in Python module names.
    testname = testname.replace('.', '_')
    scriptfile = tmp_path / (testname + '.py')
    source = textwrap.dedent(source)
    scriptfile.write_text(source, encoding='utf-8')
    return scriptfile
@@ -0,0 +1 @@
__author__ = 'martin'
@@ -0,0 +1,251 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2013-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------
"""
The code in this module supports the --icon parameter on Windows.
(For --icon support under macOS, see building/osx.py.)

The only entry point, called from api.py, is CopyIcons(), below. All the elaborate structure of classes that follows
is used to support the operation of CopyIcons_FromIco(). None of these classes and globals are referenced outside
this module.
"""

import os
import os.path
import struct

import PyInstaller.log as logging
from PyInstaller import config
from PyInstaller.compat import pywintypes, win32api
from PyInstaller.building.icon import normalize_icon_type

logger = logging.getLogger(__name__)

RT_ICON = 3
RT_GROUP_ICON = 14
LOAD_LIBRARY_AS_DATAFILE = 2


class Structure:
    def __init__(self):
        size = self._sizeInBytes = struct.calcsize(self._format_)
        self._fields_ = list(struct.unpack(self._format_, b'\000' * size))
        indexes = self._indexes_ = {}
        for i, nm in enumerate(self._names_):
            indexes[nm] = i

    def dump(self):
        logger.info("DUMP of %s", self)
        for name in self._names_:
            if not name.startswith('_'):
                logger.info("%20s = %s", name, getattr(self, name))
        logger.info("")

    def __getattr__(self, name):
        if name in self._names_:
            index = self._indexes_[name]
            return self._fields_[index]
        try:
            return self.__dict__[name]
        except KeyError as e:
            raise AttributeError(name) from e

    def __setattr__(self, name, value):
        if name in self._names_:
            index = self._indexes_[name]
            self._fields_[index] = value
        else:
            self.__dict__[name] = value

    def tostring(self):
        return struct.pack(self._format_, *self._fields_)

    def fromfile(self, file):
        data = file.read(self._sizeInBytes)
        self._fields_ = list(struct.unpack(self._format_, data))


class ICONDIRHEADER(Structure):
    _names_ = "idReserved", "idType", "idCount"
    _format_ = "hhh"


class ICONDIRENTRY(Structure):
    _names_ = ("bWidth", "bHeight", "bColorCount", "bReserved", "wPlanes", "wBitCount", "dwBytesInRes", "dwImageOffset")
    _format_ = "bbbbhhii"


class GRPICONDIR(Structure):
    _names_ = "idReserved", "idType", "idCount"
    _format_ = "hhh"


class GRPICONDIRENTRY(Structure):
    _names_ = ("bWidth", "bHeight", "bColorCount", "bReserved", "wPlanes", "wBitCount", "dwBytesInRes", "nID")
    _format_ = "bbbbhhih"
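

# The only difference between an on-disk ICONDIRENTRY and the GRPICONDIRENTRY stored in the PE resource is the last
# field: the 4-byte image offset (`i`) is replaced by a 2-byte resource ID (`h`). A quick size check, as a sketch
# (sizes assume the usual native alignment of 2-byte shorts and 4-byte ints):
def _demo_entry_sizes():
    # ICONDIRENTRY: 4 bytes + 2*2 bytes + 2*4 bytes = 16 bytes on disk.
    assert struct.calcsize("bbbbhhii") == 16
    # GRPICONDIRENTRY: the trailing dwImageOffset (i) becomes nID (h), dropping the packed size to 14 bytes.
    assert struct.calcsize("bbbbhhih") == 14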


# An IconFile instance is created for each .ico file given.
class IconFile:
    def __init__(self, path):
        self.path = path
        try:
            # The path comes from a user parameter; don't trust it.
            file = open(self.path, "rb")
        except OSError:
            # The icon file can't be opened for some reason. Stop the program with an informative message.
            raise SystemExit(f'ERROR: Unable to open icon file {self.path}!')
        with file:
            self.entries = []
            self.images = []
            header = self.header = ICONDIRHEADER()
            header.fromfile(file)
            for i in range(header.idCount):
                entry = ICONDIRENTRY()
                entry.fromfile(file)
                self.entries.append(entry)
            for e in self.entries:
                file.seek(e.dwImageOffset, 0)
                self.images.append(file.read(e.dwBytesInRes))

    def grp_icon_dir(self):
        return self.header.tostring()

    def grp_icondir_entries(self, id=1):
        data = b''
        for entry in self.entries:
            e = GRPICONDIRENTRY()
            for n in e._names_[:-1]:
                setattr(e, n, getattr(entry, n))
            e.nID = id
            id = id + 1
            data = data + e.tostring()
        return data


def CopyIcons_FromIco(dstpath, srcpath, id=1):
    """
    Use the Win API UpdateResource facility to apply the icon resource(s) to the .exe file.

    :param str dstpath: absolute path of the .exe file being built.
    :param list srcpath: list of one or more .ico file paths.
    """
    icons = map(IconFile, srcpath)
    logger.debug("Copying icons from %s", srcpath)

    hdst = win32api.BeginUpdateResource(dstpath, 0)

    iconid = 1
    # Each step in the following enumerate() will instantiate an IconFile object, as a result of the deferred
    # execution of the map() above.
    for i, f in enumerate(icons):
        data = f.grp_icon_dir()
        data = data + f.grp_icondir_entries(iconid)
        win32api.UpdateResource(hdst, RT_GROUP_ICON, i + 1, data)
        logger.debug("Writing RT_GROUP_ICON %d resource with %d bytes", i + 1, len(data))
        for data in f.images:
            win32api.UpdateResource(hdst, RT_ICON, iconid, data)
            logger.debug("Writing RT_ICON %d resource with %d bytes", iconid, len(data))
            iconid = iconid + 1

    win32api.EndUpdateResource(hdst, 0)


def CopyIcons(dstpath, srcpath):
    """
    Called from building/api.py to handle icons. If the input was given via --icon on the command line, srcpath is
    a single string. However, it is possible to modify the spec file, adding icon=['foo.ico','bar.ico'] to the EXE()
    statement. In that case, srcpath is a list of strings.

    The string format is either path-to-.ico or path-to-.exe,n where n is an integer resource index in the .exe. In
    either case, the path can be relative or absolute.
    """

    if isinstance(srcpath, (str, os.PathLike)):
        # Just a single string; make it a one-element list.
        srcpath = [srcpath]
    # Convert possible PathLike elements to strings to allow the splitter function to work.
    srcpath = [str(path) for path in srcpath]

    def splitter(s):
        """
        Convert "pathname" to the tuple ("pathname", None).
        Convert "pathname,n" to the tuple ("pathname", n).
        """
        try:
            srcpath, index = s.split(',')
            return srcpath.strip(), int(index)
        except ValueError:
            return s, None
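
    # For illustration, the splitter maps the two accepted string forms as follows (shown as comments, since the
    # helper is local to CopyIcons):
    #   splitter("app.ico")        -> ("app.ico", None)
    #   splitter("shell32.dll,3")  -> ("shell32.dll", 3)
    #   splitter("a,b,c")          -> ("a,b,c", None)   # the 3-way split fails the 2-tuple unpack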

    # Split all the items in the list into tuples as above.
    srcpath = list(map(splitter, srcpath))

    if len(srcpath) > 1:
        # More than one icon source given. We currently handle multiple icons by calling CopyIcons_FromIco(), which
        # only accepts .ico; normalize_icon_type() converts other formats to .ico as needed.
        #
        # Note that a ",index" on a .ico is just ignored, in both the single and the multiple case.
        srcs = []
        for s in srcpath:
            srcs.append(normalize_icon_type(s[0], ("ico",), "ico", config.CONF["workpath"]))
        return CopyIcons_FromIco(dstpath, srcs)

    # Just one source given.
    srcpath, index = srcpath[0]

    # Make sure the icon exists, and attempt to convert it to the proper format if applicable.
    srcpath = normalize_icon_type(srcpath, ("exe", "ico"), "ico", config.CONF["workpath"])

    srcext = os.path.splitext(srcpath)[1]

    # Handle the simple case of foo.ico, ignoring any index.
    if srcext.lower() == '.ico':
        return CopyIcons_FromIco(dstpath, [srcpath])

    # The single source is not .ico; presumably it is .exe (and if not, some error will occur).
    if index is not None:
        logger.debug("Copying icon from %s, %d", srcpath, index)
    else:
        logger.debug("Copying icons from %s", srcpath)

    try:
        # Attempt to load the .ico or .exe containing the icon into memory using the same mechanism as if it were a
        # DLL. If this fails for any reason (for example, if the file does not exist or is not a .ico/.exe), then
        # LoadLibraryEx returns a null handle and win32api raises a unique exception with a win error code and a
        # string.
        hsrc = win32api.LoadLibraryEx(srcpath, 0, LOAD_LIBRARY_AS_DATAFILE)
    except pywintypes.error as W32E:
        # We could continue with no icon (i.e., just return), but it seems best to terminate the build with a message.
        raise SystemExit(
            "ERROR: Unable to load icon file {}\n    {} (Error code {})".format(srcpath, W32E.strerror, W32E.winerror)
        )
    hdst = win32api.BeginUpdateResource(dstpath, 0)
    if index is None:
        grpname = win32api.EnumResourceNames(hsrc, RT_GROUP_ICON)[0]
    elif index >= 0:
        grpname = win32api.EnumResourceNames(hsrc, RT_GROUP_ICON)[index]
    else:
        grpname = -index
    data = win32api.LoadResource(hsrc, RT_GROUP_ICON, grpname)
    win32api.UpdateResource(hdst, RT_GROUP_ICON, grpname, data)
    for iconname in win32api.EnumResourceNames(hsrc, RT_ICON):
        data = win32api.LoadResource(hsrc, RT_ICON, iconname)
        win32api.UpdateResource(hdst, RT_ICON, iconname, data)
    win32api.FreeLibrary(hsrc)
    win32api.EndUpdateResource(hdst, 0)


if __name__ == "__main__":
    import sys

    dstpath = sys.argv[1]
    srcpath = sys.argv[2:]
    CopyIcons(dstpath, srcpath)
@@ -0,0 +1,604 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2013-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------

import struct

import pefile

from PyInstaller.compat import win32api


def pefile_check_control_flow_guard(filename):
    """
    Check if the specified PE file has CFG (Control Flow Guard) enabled.

    Parameters
    ----------
    filename : str
        Path to the PE file to inspect.

    Returns
    -------
    bool
        True if the file is a PE file with CFG enabled. False if CFG is not enabled, or if the file could not be
        processed using the pefile library.
    """
    try:
        pe = pefile.PE(filename, fast_load=True)
        # https://docs.microsoft.com/en-us/windows/win32/debug/pe-format
        # IMAGE_DLLCHARACTERISTICS_GUARD_CF = 0x4000
        return bool(pe.OPTIONAL_HEADER.DllCharacteristics & 0x4000)
    except Exception:
        return False
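

# On a Windows machine, the check can be pointed at any system DLL; the paths below are illustrative.
def _demo_cfg_check():
    # Most modern system binaries ship with CFG enabled; a missing or non-PE file simply yields False.
    print(pefile_check_control_flow_guard(r"C:\Windows\System32\kernel32.dll"))  # typically True
    print(pefile_check_control_flow_guard("not_a_pe_file.txt"))                  # False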


# Ensures no code from the executable is executed.
LOAD_LIBRARY_AS_DATAFILE = 2


def getRaw(text):
    """
    Encode text as UTF-16LE (Microsoft 'Unicode') for use in structs.
    """
    return text.encode('UTF-16LE')


def read_version_info_from_executable(exe_filename):
    """
    Read the version information structure from the given executable's resources, and return it as an instance of
    the `VSVersionInfo` structure.
    """
    h = win32api.LoadLibraryEx(exe_filename, 0, LOAD_LIBRARY_AS_DATAFILE)
    res = win32api.EnumResourceNames(h, pefile.RESOURCE_TYPE['RT_VERSION'])
    if not len(res):
        return None
    data = win32api.LoadResource(h, pefile.RESOURCE_TYPE['RT_VERSION'], res[0])
    info = VSVersionInfo()
    info.fromRaw(data)
    win32api.FreeLibrary(h)
    return info


def nextDWord(offset):
    """
    Align `offset` to the next 4-byte boundary.
    """
    return ((offset + 3) >> 2) << 2
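

# The bit trick in nextDWord() rounds up to a multiple of four by adding 3 and clearing the two low-order bits.
# A few sample values, as a quick sketch:
#
#   nextDWord(0) == 0
#   nextDWord(5) == 8    # 5 + 3 = 8; clearing the low bits keeps 8
#   nextDWord(8) == 8    # already-aligned offsets are unchanged
#   nextDWord(9) == 12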


class VSVersionInfo:
    """
    WORD  wLength;      // length of the VS_VERSION_INFO structure
    WORD  wValueLength; // length of the Value member
    WORD  wType;        // 1 means text, 0 means binary
    WCHAR szKey[];      // Contains the Unicode string "VS_VERSION_INFO".
    WORD  Padding1[];
    VS_FIXEDFILEINFO Value;
    WORD  Padding2[];
    WORD  Children[];   // zero or more StringFileInfo or VarFileInfo
                        // structures (or both) that are children of the
                        // current version structure.
    """
    def __init__(self, ffi=None, kids=None):
        self.ffi = ffi
        self.kids = kids or []

    def fromRaw(self, data):
        i, (sublen, vallen, wType, nm) = parseCommon(data)
        # vallen is the length of the ffi, typ is 0, nm is 'VS_VERSION_INFO'.
        i = nextDWord(i)
        # Now a VS_FIXEDFILEINFO
        self.ffi = FixedFileInfo()
        j = self.ffi.fromRaw(data, i)
        i = j
        while i < sublen:
            j = i
            i, (csublen, cvallen, ctyp, nm) = parseCommon(data, i)
            if nm.strip() == 'StringFileInfo':
                sfi = StringFileInfo()
                k = sfi.fromRaw(csublen, cvallen, nm, data, i, j + csublen)
                self.kids.append(sfi)
                i = k
            else:
                vfi = VarFileInfo()
                k = vfi.fromRaw(csublen, cvallen, nm, data, i, j + csublen)
                self.kids.append(vfi)
                i = k
            i = j + csublen
            i = nextDWord(i)
        return i

    def toRaw(self):
        raw_name = getRaw('VS_VERSION_INFO')
        rawffi = self.ffi.toRaw()
        vallen = len(rawffi)
        typ = 0
        sublen = 6 + len(raw_name) + 2
        pad = b''
        if sublen % 4:
            pad = b'\000\000'
        sublen = sublen + len(pad) + vallen
        pad2 = b''
        if sublen % 4:
            pad2 = b'\000\000'
        tmp = b''.join([kid.toRaw() for kid in self.kids])
        sublen = sublen + len(pad2) + len(tmp)
        return struct.pack('HHH', sublen, vallen, typ) + raw_name + b'\000\000' + pad + rawffi + pad2 + tmp

    def __eq__(self, other):
        return self.toRaw() == other

    def __str__(self, indent=''):
        indent = indent + '  '
        tmp = [kid.__str__(indent + '  ') for kid in self.kids]
        tmp = ', \n'.join(tmp)
        return '\n'.join([
            "# UTF-8",
            "#",
            "# For more details about fixed file info 'ffi' see:",
            "# http://msdn.microsoft.com/en-us/library/ms646997.aspx",
            "VSVersionInfo(",
            indent + f"ffi={self.ffi.__str__(indent)},",
            indent + "kids=[",
            tmp,
            indent + "]",
            ")",
        ])

    def __repr__(self):
        return "versioninfo.VSVersionInfo(ffi=%r, kids=%r)" % (self.ffi, self.kids)


def parseCommon(data, start=0):
    i = start + 6
    (wLength, wValueLength, wType) = struct.unpack('3H', data[start:i])
    i, text = parseUString(data, i, i + wLength)
    return i, (wLength, wValueLength, wType, text)


def parseUString(data, start, limit):
    i = start
    while i < limit:
        if data[i:i + 2] == b'\000\000':
            break
        i += 2
    text = data[start:i].decode('UTF-16LE')
    i += 2
    return i, text


class FixedFileInfo:
    """
    DWORD dwSignature;        // Contains the value 0xFEEF04BD
    DWORD dwStrucVersion;     // binary version number of this structure.
                              // The high-order word of this member contains
                              // the major version number, and the low-order
                              // word contains the minor version number.
    DWORD dwFileVersionMS;    // most significant 32 bits of the file's binary
                              // version number
    DWORD dwFileVersionLS;    //
    DWORD dwProductVersionMS; // most significant 32 bits of the binary version
                              // number of the product with which this file was
                              // distributed
    DWORD dwProductVersionLS; //
    DWORD dwFileFlagsMask;    // bitmask that specifies the valid bits in
                              // dwFileFlags. A bit is valid only if it was
                              // defined when the file was created.
    DWORD dwFileFlags;        // VS_FF_DEBUG, VS_FF_PATCHED, etc.
    DWORD dwFileOS;           // VOS_NT, VOS_WINDOWS32, etc.
    DWORD dwFileType;         // VFT_APP, etc.
    DWORD dwFileSubtype;      // 0 unless VFT_DRV or VFT_FONT or VFT_VXD
    DWORD dwFileDateMS;
    DWORD dwFileDateLS;
    """
    def __init__(
        self,
        filevers=(0, 0, 0, 0),
        prodvers=(0, 0, 0, 0),
        mask=0x3f,
        flags=0x0,
        OS=0x40004,
        fileType=0x1,
        subtype=0x0,
        date=(0, 0)
    ):
        self.sig = 0xfeef04bd
        self.strucVersion = 0x10000
        self.fileVersionMS = (filevers[0] << 16) | (filevers[1] & 0xffff)
        self.fileVersionLS = (filevers[2] << 16) | (filevers[3] & 0xffff)
        self.productVersionMS = (prodvers[0] << 16) | (prodvers[1] & 0xffff)
        self.productVersionLS = (prodvers[2] << 16) | (prodvers[3] & 0xffff)
        self.fileFlagsMask = mask
        self.fileFlags = flags
        self.fileOS = OS
        self.fileType = fileType
        self.fileSubtype = subtype
        self.fileDateMS = date[0]
        self.fileDateLS = date[1]

    def fromRaw(self, data, i):
        (
            self.sig,
            self.strucVersion,
            self.fileVersionMS,
            self.fileVersionLS,
            self.productVersionMS,
            self.productVersionLS,
            self.fileFlagsMask,
            self.fileFlags,
            self.fileOS,
            self.fileType,
            self.fileSubtype,
            self.fileDateMS,
            self.fileDateLS,
        ) = struct.unpack('13L', data[i:i + 52])
        return i + 52

    def toRaw(self):
        return struct.pack(
            '13L',
            self.sig,
            self.strucVersion,
            self.fileVersionMS,
            self.fileVersionLS,
            self.productVersionMS,
            self.productVersionLS,
            self.fileFlagsMask,
            self.fileFlags,
            self.fileOS,
            self.fileType,
            self.fileSubtype,
            self.fileDateMS,
            self.fileDateLS,
        )

    def __eq__(self, other):
        return self.toRaw() == other

    def __str__(self, indent=''):
        fv = (
            self.fileVersionMS >> 16, self.fileVersionMS & 0xffff,
            self.fileVersionLS >> 16, self.fileVersionLS & 0xffff,
        )  # yapf: disable
        pv = (
            self.productVersionMS >> 16, self.productVersionMS & 0xffff,
            self.productVersionLS >> 16, self.productVersionLS & 0xffff,
        )  # yapf: disable
        fd = (self.fileDateMS, self.fileDateLS)
        tmp = [
            'FixedFileInfo(',
            '# filevers and prodvers should always be a tuple with four items: (1, 2, 3, 4)',
            '# Set not-needed items to zero.',
            'filevers=%s,' % (fv,),
            'prodvers=%s,' % (pv,),
            "# Contains a bitmask that specifies the valid bits in 'flags'.",
            'mask=%s,' % hex(self.fileFlagsMask),
            '# Contains a bitmask that specifies the Boolean attributes of the file.',
            'flags=%s,' % hex(self.fileFlags),
            '# The operating system for which this file was designed.',
            '# 0x4 - NT and there is no need to change it.',
            'OS=%s,' % hex(self.fileOS),
            '# The general type of file.',
            '# 0x1 - the file is an application.',
            'fileType=%s,' % hex(self.fileType),
            '# The function of the file.',
            '# 0x0 - the function is not defined for this fileType',
            'subtype=%s,' % hex(self.fileSubtype),
            '# Creation date and time stamp.',
            'date=%s' % (fd,),
            ')',
        ]
        return f'\n{indent}  '.join(tmp)

    def __repr__(self):
        fv = (
            self.fileVersionMS >> 16, self.fileVersionMS & 0xffff,
            self.fileVersionLS >> 16, self.fileVersionLS & 0xffff,
        )  # yapf: disable
        pv = (
            self.productVersionMS >> 16, self.productVersionMS & 0xffff,
            self.productVersionLS >> 16, self.productVersionLS & 0xffff,
        )  # yapf: disable
        fd = (self.fileDateMS, self.fileDateLS)
        return (
            'versioninfo.FixedFileInfo(filevers=%r, prodvers=%r, '
            'mask=0x%x, flags=0x%x, OS=0x%x, '
            'fileType=%r, subtype=0x%x, date=%r)' %
            (fv, pv, self.fileFlagsMask, self.fileFlags, self.fileOS, self.fileType, self.fileSubtype, fd)
        )
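

# The four-part version tuples are packed into two 32-bit words: the major/minor pair lands in the MS word and the
# patch/build pair in the LS word. The split is easy to verify, as a sketch:
def _demo_version_word_packing():
    ffi = FixedFileInfo(filevers=(1, 2, 3, 4))
    assert ffi.fileVersionMS == (1 << 16) | 2  # 0x00010002
    assert ffi.fileVersionLS == (3 << 16) | 4  # 0x00030004
    # __str__ reverses the packing, recovering (1, 2, 3, 4) for display.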


class StringFileInfo:
    """
    WORD        wLength;      // length of the version resource
    WORD        wValueLength; // length of the Value member in the current
                              // VS_VERSION_INFO structure
    WORD        wType;        // 1 means text, 0 means binary
    WCHAR       szKey[];      // Contains the Unicode string 'StringFileInfo'.
    WORD        Padding[];
    StringTable Children[];   // list of zero or more String structures
    """
    def __init__(self, kids=None):
        self.name = 'StringFileInfo'
        self.kids = kids or []

    def fromRaw(self, sublen, vallen, name, data, i, limit):
        self.name = name
        while i < limit:
            st = StringTable()
            j = st.fromRaw(data, i, limit)
            self.kids.append(st)
            i = j
        return i

    def toRaw(self):
        raw_name = getRaw(self.name)
        vallen = 0
        typ = 1
        sublen = 6 + len(raw_name) + 2
        pad = b''
        if sublen % 4:
            pad = b'\000\000'
        tmp = b''.join([kid.toRaw() for kid in self.kids])
        sublen = sublen + len(pad) + len(tmp)
        return struct.pack('HHH', sublen, vallen, typ) + raw_name + b'\000\000' + pad + tmp

    def __eq__(self, other):
        return self.toRaw() == other

    def __str__(self, indent=''):
        new_indent = indent + '  '
        tmp = ', \n'.join(kid.__str__(new_indent) for kid in self.kids)
        return f'{indent}StringFileInfo(\n{new_indent}[\n{tmp}\n{new_indent}])'

    def __repr__(self):
        return 'versioninfo.StringFileInfo(%r)' % self.kids


class StringTable:
    """
    WORD   wLength;
    WORD   wValueLength;
    WORD   wType;
    WCHAR  szKey[];
    String Children[];  // list of zero or more String structures.
    """
    def __init__(self, name=None, kids=None):
        self.name = name or ''
        self.kids = kids or []

    def fromRaw(self, data, i, limit):
        i, (cpsublen, cpwValueLength, cpwType, self.name) = parseCodePage(data, i, limit)  # should be code page junk
        i = nextDWord(i)
        while i < limit:
            ss = StringStruct()
            j = ss.fromRaw(data, i, limit)
            i = j
            self.kids.append(ss)
            i = nextDWord(i)
        return i

    def toRaw(self):
        raw_name = getRaw(self.name)
        vallen = 0
        typ = 1
        sublen = 6 + len(raw_name) + 2
        tmp = []
        for kid in self.kids:
            raw = kid.toRaw()
            if len(raw) % 4:
                raw = raw + b'\000\000'
            tmp.append(raw)
        tmp = b''.join(tmp)
        sublen += len(tmp)
        return struct.pack('HHH', sublen, vallen, typ) + raw_name + b'\000\000' + tmp

    def __eq__(self, other):
        return self.toRaw() == other

    def __str__(self, indent=''):
        new_indent = indent + '  '
        tmp = (',\n' + new_indent).join(str(kid) for kid in self.kids)
        return f"{indent}StringTable(\n{new_indent}'{self.name}',\n{new_indent}[{tmp}])"

    def __repr__(self):
        return 'versioninfo.StringTable(%r, %r)' % (self.name, self.kids)


class StringStruct:
    """
    WORD   wLength;
    WORD   wValueLength;
    WORD   wType;
    WCHAR  szKey[];
    WORD   Padding[];
    String Value[];
    """
    def __init__(self, name=None, val=None):
        self.name = name or ''
        self.val = val or ''

    def fromRaw(self, data, i, limit):
        i, (sublen, vallen, typ, self.name) = parseCommon(data, i)
        limit = i + sublen
        i = nextDWord(i)
        i, self.val = parseUString(data, i, limit)
        return i

    def toRaw(self):
        raw_name = getRaw(self.name)
        raw_val = getRaw(self.val)
        # TODO: document the size of vallen and sublen.
        vallen = len(self.val) + 1  # Number of (wide) characters, not bytes!
        typ = 1
        sublen = 6 + len(raw_name) + 2
        pad = b''
        if sublen % 4:
            pad = b'\000\000'
        sublen = sublen + len(pad) + (vallen * 2)
        return struct.pack('HHH', sublen, vallen, typ) + raw_name + b'\000\000' + pad + raw_val + b'\000\000'

    def __eq__(self, other):
        return self.toRaw() == other

    def __str__(self, indent=''):
        return "StringStruct(%r, %r)" % (self.name, self.val)

    def __repr__(self):
        return 'versioninfo.StringStruct(%r, %r)' % (self.name, self.val)


def parseCodePage(data, i, limit):
    i, (sublen, wValueLength, wType, nm) = parseCommon(data, i)
    return i, (sublen, wValueLength, wType, nm)


class VarFileInfo:
    """
    WORD  wLength;      // length of the version resource
    WORD  wValueLength; // length of the Value member in the current
                        // VS_VERSION_INFO structure
    WORD  wType;        // 1 means text, 0 means binary
    WCHAR szKey[];      // Contains the Unicode string 'VarFileInfo'.
    WORD  Padding[];
    Var   Children[];   // list of zero or more Var structures
    """
    def __init__(self, kids=None):
        self.kids = kids or []

    def fromRaw(self, sublen, vallen, name, data, i, limit):
        self.sublen = sublen
        self.vallen = vallen
        self.name = name
        i = nextDWord(i)
        while i < limit:
            vs = VarStruct()
            j = vs.fromRaw(data, i, limit)
            self.kids.append(vs)
            i = j
        return i

    def toRaw(self):
        self.vallen = 0
        self.wType = 1
        self.name = 'VarFileInfo'
        raw_name = getRaw(self.name)
        sublen = 6 + len(raw_name) + 2
        pad = b''
        if sublen % 4:
            pad = b'\000\000'
        tmp = b''.join([kid.toRaw() for kid in self.kids])
        self.sublen = sublen + len(pad) + len(tmp)
        return struct.pack('HHH', self.sublen, self.vallen, self.wType) + raw_name + b'\000\000' + pad + tmp

    def __eq__(self, other):
        return self.toRaw() == other

    def __str__(self, indent=''):
        return indent + "VarFileInfo([%s])" % ', '.join(str(kid) for kid in self.kids)

    def __repr__(self):
        return 'versioninfo.VarFileInfo(%r)' % self.kids


class VarStruct:
    """
    WORD  wLength;      // length of the version resource
    WORD  wValueLength; // length of the Value member in the current
                        // VS_VERSION_INFO structure
    WORD  wType;        // 1 means text, 0 means binary
    WCHAR szKey[];      // Contains the Unicode string 'Translation'
                        // or a user-defined key string value
    WORD  Padding[];    //
    WORD  Value[];      // list of one or more values that are language
                        // and code-page identifiers
    """
    def __init__(self, name=None, kids=None):
        self.name = name or ''
        self.kids = kids or []

    def fromRaw(self, data, i, limit):
        i, (self.sublen, self.wValueLength, self.wType, self.name) = parseCommon(data, i)
        i = nextDWord(i)
        for j in range(0, self.wValueLength, 2):
            kid = struct.unpack('H', data[i:i + 2])[0]
            self.kids.append(kid)
            i += 2
        return i

    def toRaw(self):
        self.wValueLength = len(self.kids) * 2
        self.wType = 0
        raw_name = getRaw(self.name)
        sublen = 6 + len(raw_name) + 2
        pad = b''
        if sublen % 4:
            pad = b'\000\000'
        self.sublen = sublen + len(pad) + self.wValueLength
        tmp = b''.join([struct.pack('H', kid) for kid in self.kids])
        return struct.pack('HHH', self.sublen, self.wValueLength, self.wType) + raw_name + b'\000\000' + pad + tmp

    def __eq__(self, other):
        return self.toRaw() == other

    def __str__(self, indent=''):
        return "VarStruct('%s', %r)" % (self.name, self.kids)

    def __repr__(self):
        return 'versioninfo.VarStruct(%r, %r)' % (self.name, self.kids)


def load_version_info_from_text_file(filename):
    """
    Load the `VSVersionInfo` structure from its string-based (`VSVersionInfo.__str__`) serialization, by reading the
    text from the file and running it through `eval()`.
    """

    # Read and parse the version file. It may have a byte order marker or an encoding cookie - respect it if it does.
    import PyInstaller.utils.misc as miscutils
    with open(filename, 'rb') as fp:
        text = miscutils.decode(fp.read())

    # Deserialize via eval()
    try:
        info = eval(text)
    except Exception as e:
        raise ValueError("Failed to deserialize VSVersionInfo from text-based representation!") from e

    # Sanity check
    assert isinstance(info, VSVersionInfo), \
        f"Loaded incompatible structure type! Expected VSVersionInfo, got: {type(info)!r}"

    return info
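

# Because the loader eval()s the file, a version file is literally a Python expression built from the classes above.
# A minimal, illustrative example of what such a file might contain (all values are made up):
#
#     VSVersionInfo(
#         ffi=FixedFileInfo(filevers=(1, 0, 0, 0), prodvers=(1, 0, 0, 0)),
#         kids=[
#             StringFileInfo([
#                 StringTable('040904B0', [
#                     StringStruct('ProductName', 'Example App'),
#                     StringStruct('FileVersion', '1.0.0.0'),
#                 ]),
#             ]),
#             VarFileInfo([VarStruct('Translation', [1033, 1200])]),
#         ],
#     )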


def write_version_info_to_executable(exe_filename, info):
    assert isinstance(info, VSVersionInfo)

    # Remember the overlay
    pe = pefile.PE(exe_filename, fast_load=True)
    overlay_before = pe.get_overlay()
    pe.close()

    hdst = win32api.BeginUpdateResource(exe_filename, 0)
    win32api.UpdateResource(hdst, pefile.RESOURCE_TYPE['RT_VERSION'], 1, info.toRaw())
    win32api.EndUpdateResource(hdst, 0)

    if overlay_before:
        # Check if the overlay is still present
        pe = pefile.PE(exe_filename, fast_load=True)
        overlay_after = pe.get_overlay()
        pe.close()

        # If the update removed the overlay data, re-append it
        if not overlay_after:
            with open(exe_filename, 'ab') as exef:
                exef.write(overlay_before)
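

# The read/write pair enables a simple round trip, e.g. bumping the version of an already-built executable.
# A sketch for a Windows environment; "app.exe" is an illustrative path:
def _demo_bump_version():
    info = read_version_info_from_executable("app.exe")
    if info is not None:
        # Set the file version to 2.0.0.0: major/minor in the MS word, patch/build in the LS word.
        info.ffi.fileVersionMS = (2 << 16) | 0
        info.ffi.fileVersionLS = (0 << 16) | 0
        write_version_info_to_executable("app.exe", info)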
@@ -0,0 +1,244 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2013-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------
import xml.dom
import xml.dom.minidom

#- Relevant constants from Windows headers
# Manifest resource code
RT_MANIFEST = 24

# Resource IDs (names) for manifests.
# See: https://www.gamedev.net/blogs/entry/2154553-manifest-embedding-and-activation
CREATEPROCESS_MANIFEST_RESOURCE_ID = 1
ISOLATIONAWARE_MANIFEST_RESOURCE_ID = 2

LANG_NEUTRAL = 0

#- Default application manifest template, based on the one found in the python executable.

_DEFAULT_MANIFEST_XML = \
b"""<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <requestedExecutionLevel level="asInvoker" uiAccess="false"></requestedExecutionLevel>
      </requestedPrivileges>
    </security>
  </trustInfo>
  <compatibility xmlns="urn:schemas-microsoft-com:compatibility.v1">
    <application>
      <supportedOS Id="{e2011457-1546-43c5-a5fe-008deee3d3f0}"></supportedOS>
      <supportedOS Id="{35138b9a-5d96-4fbd-8e2d-a2440225f93a}"></supportedOS>
      <supportedOS Id="{4a2f28e3-53b9-4441-ba9c-d69d4a4a6e38}"></supportedOS>
      <supportedOS Id="{1f676c76-80e1-4239-95bb-83d0f6d0da78}"></supportedOS>
      <supportedOS Id="{8e0f7a12-bfb3-4fe8-b9a5-48fd50a15a9a}"></supportedOS>
    </application>
  </compatibility>
  <application xmlns="urn:schemas-microsoft-com:asm.v3">
    <windowsSettings>
      <longPathAware xmlns="http://schemas.microsoft.com/SMI/2016/WindowsSettings">true</longPathAware>
    </windowsSettings>
  </application>
  <dependency>
    <dependentAssembly>
      <assemblyIdentity type="win32" name="Microsoft.Windows.Common-Controls" version="6.0.0.0" processorArchitecture="*" publicKeyToken="6595b64144ccf1df" language="*"></assemblyIdentity>
    </dependentAssembly>
  </dependency>
</assembly>
"""  # noqa: E122,E501

#- DOM navigation helpers


def _find_elements_by_tag(root, tag):
    """
    Find all elements with the given tag under the given root element.
    """
    return [node for node in root.childNodes if node.nodeType == xml.dom.Node.ELEMENT_NODE and node.tagName == tag]


def _find_element_by_tag(root, tag):
    """
    Attempt to find a single element with the given tag under the given root element, returning None if no such
    element is found. Raises an error if multiple elements are found.
    """
    elements = _find_elements_by_tag(root, tag)
    if len(elements) > 1:
        raise ValueError(f"Expected a single {tag!r} element, found {len(elements)} element(s)!")
    if not elements:
        return None
    return elements[0]


#- Application manifest modification helpers


def _set_execution_level(manifest_dom, root_element, uac_admin=False, uac_uiaccess=False):
    """
    Find the <security> -> <requestedPrivileges> -> <requestedExecutionLevel> element, and set its `level` and
    `uiAccess` attributes based on the supplied arguments. Create the XML elements if necessary, as they are optional.
    """

    # <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    trust_info_element = _find_element_by_tag(root_element, "trustInfo")
    if not trust_info_element:
        trust_info_element = manifest_dom.createElement("trustInfo")
        trust_info_element.setAttribute("xmlns", "urn:schemas-microsoft-com:asm.v3")
        root_element.appendChild(trust_info_element)

    # <security>
    security_element = _find_element_by_tag(trust_info_element, "security")
    if not security_element:
        security_element = manifest_dom.createElement("security")
        trust_info_element.appendChild(security_element)

    # <requestedPrivileges>
    requested_privileges_element = _find_element_by_tag(security_element, "requestedPrivileges")
    if not requested_privileges_element:
        requested_privileges_element = manifest_dom.createElement("requestedPrivileges")
        security_element.appendChild(requested_privileges_element)

    # <requestedExecutionLevel>
    requested_execution_level_element = _find_element_by_tag(requested_privileges_element, "requestedExecutionLevel")
    if not requested_execution_level_element:
        requested_execution_level_element = manifest_dom.createElement("requestedExecutionLevel")
        requested_privileges_element.appendChild(requested_execution_level_element)

    requested_execution_level_element.setAttribute("level", "requireAdministrator" if uac_admin else "asInvoker")
    requested_execution_level_element.setAttribute("uiAccess", "true" if uac_uiaccess else "false")


def _ensure_common_controls_dependency(manifest_dom, root_element):
    """
    Scan <dependency> elements for the one whose <dependentAssembly> -> <assemblyIdentity> corresponds to
    `Microsoft.Windows.Common-Controls`. If found, overwrite its properties. If not, create a new <dependency>
    element with the corresponding sub-elements and attributes.
    """

    # <dependency>
    dependency_elements = _find_elements_by_tag(root_element, "dependency")
    for dependency_element in dependency_elements:
        # <dependentAssembly>
        dependent_assembly_element = _find_element_by_tag(dependency_element, "dependentAssembly")
        # <assemblyIdentity>
        assembly_identity_element = _find_element_by_tag(dependent_assembly_element, "assemblyIdentity")
        # Check the name attribute
        if assembly_identity_element.attributes["name"].value == "Microsoft.Windows.Common-Controls":
            common_controls_element = assembly_identity_element
            break
    else:
        # Create <dependency>
        dependency_element = manifest_dom.createElement("dependency")
        root_element.appendChild(dependency_element)
        # Create <dependentAssembly>
        dependent_assembly_element = manifest_dom.createElement("dependentAssembly")
        dependency_element.appendChild(dependent_assembly_element)
        # Create <assemblyIdentity>
        common_controls_element = manifest_dom.createElement("assemblyIdentity")
        dependent_assembly_element.appendChild(common_controls_element)

    common_controls_element.setAttribute("type", "win32")
    common_controls_element.setAttribute("name", "Microsoft.Windows.Common-Controls")
    common_controls_element.setAttribute("version", "6.0.0.0")
    common_controls_element.setAttribute("processorArchitecture", "*")
    common_controls_element.setAttribute("publicKeyToken", "6595b64144ccf1df")
    common_controls_element.setAttribute("language", "*")
def create_application_manifest(manifest_xml=None, uac_admin=False, uac_uiaccess=False):
    """
    Create an application manifest from the built-in or a custom manifest XML template. If provided, `manifest_xml`
    must be a string or byte string containing XML source. The returned manifest is a byte string, encoded in UTF-8.

    This function sets the attributes of `requestedExecutionLevel` based on the provided `uac_admin` and
    `uac_uiaccess` arguments (creating the parent elements in the XML, if necessary). It also scans the `dependency`
    elements for the entry corresponding to `Microsoft.Windows.Common-Controls`, and creates or modifies it as
    necessary.
    """

    if manifest_xml is None:
        manifest_xml = _DEFAULT_MANIFEST_XML

    with xml.dom.minidom.parseString(manifest_xml) as manifest_dom:
        root_element = manifest_dom.documentElement

        # Validate the root element - it must be <assembly>
        assert root_element.tagName == "assembly"
        assert root_element.namespaceURI == "urn:schemas-microsoft-com:asm.v1"
        assert root_element.attributes["manifestVersion"].value == "1.0"

        # Modify the manifest
        _set_execution_level(manifest_dom, root_element, uac_admin, uac_uiaccess)
        _ensure_common_controls_dependency(manifest_dom, root_element)

        # Create the output XML
        output = manifest_dom.toprettyxml(indent="  ", encoding="UTF-8")

    # Strip extra newlines
    output = [line for line in output.splitlines() if line.strip()]

    # Replace `<?xml version="1.0" encoding="UTF-8"?>` with
    # `<?xml version="1.0" encoding="UTF-8" standalone="yes"?>`.
    # Support for `standalone` was added to `toprettyxml` in python 3.9, so do a manual work-around.
    output[0] = b"""<?xml version="1.0" encoding="UTF-8" standalone="yes"?>"""

    output = b"\n".join(output)

    return output
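

# A short usage sketch: generating a UAC-elevated manifest from the built-in template and inspecting the result.
def _demo_create_manifest():
    manifest = create_application_manifest(uac_admin=True)
    # The returned value is UTF-8 encoded bytes; the requestedExecutionLevel element now carries
    # level="requireAdministrator".
    print(manifest.decode("utf-8"))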


def write_manifest_to_executable(filename, manifest_xml):
    """
    Write the given manifest XML to the given executable's RT_MANIFEST resource.
    """
    from PyInstaller.utils.win32 import winresource

    # CREATEPROCESS_MANIFEST_RESOURCE_ID is used for the manifest resource in executables.
    # ISOLATIONAWARE_MANIFEST_RESOURCE_ID is used for manifest resources in DLLs.
    names = [CREATEPROCESS_MANIFEST_RESOURCE_ID]

    # Ensure LANG_NEUTRAL is updated, and also update any other languages that are present.
    languages = [LANG_NEUTRAL, "*"]

    winresource.add_or_update_resource(filename, manifest_xml, RT_MANIFEST, names, languages)


def read_manifest_from_executable(filename):
    """
    Read the manifest from the given executable.
    """
    from PyInstaller.utils.win32 import winresource

    resources = winresource.get_resources(filename, [RT_MANIFEST])

    # `resources` is a three-level dictionary:
    #  - level 1: resource type (RT_MANIFEST)
    #  - level 2: resource name (CREATEPROCESS_MANIFEST_RESOURCE_ID)
    #  - level 3: resource language (LANG_NEUTRAL)

    # Level 1
    if RT_MANIFEST not in resources:
        raise ValueError(f"No RT_MANIFEST resources found in {filename!r}.")
    resources = resources[RT_MANIFEST]

    # Level 2
    if CREATEPROCESS_MANIFEST_RESOURCE_ID not in resources:
        raise ValueError(f"No RT_MANIFEST resource named CREATEPROCESS_MANIFEST_RESOURCE_ID found in {filename!r}.")
    resources = resources[CREATEPROCESS_MANIFEST_RESOURCE_ID]

    # Level 3
    # We prefer LANG_NEUTRAL, but allow falling back to the first available entry.
    if LANG_NEUTRAL in resources:
        resources = resources[LANG_NEUTRAL]
    else:
        # Take the first entry's data; the dictionary values are the raw manifest bytes.
        resources = next(iter(resources.values()))

    manifest_xml = resources
    return manifest_xml
|
||||
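# Usage sketch (illustrative only; hypothetical path): read the embedded
# manifest back for inspection:
#
#   manifest_xml = read_manifest_from_executable("dist/app.exe")
#   print(manifest_xml.decode("utf-8"))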
@@ -0,0 +1,189 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2013-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------
"""
Read and write resources from/to Win32 PE files.
"""

import PyInstaller.log as logging
from PyInstaller.compat import pywintypes, win32api

logger = logging.getLogger(__name__)

LOAD_LIBRARY_AS_DATAFILE = 2
ERROR_BAD_EXE_FORMAT = 193
ERROR_RESOURCE_DATA_NOT_FOUND = 1812
ERROR_RESOURCE_TYPE_NOT_FOUND = 1813
ERROR_RESOURCE_NAME_NOT_FOUND = 1814
ERROR_RESOURCE_LANG_NOT_FOUND = 1815


def get_resources(filename, types=None, names=None, languages=None):
    """
    Retrieve resources from the given PE file.

    filename: path to the PE file.
    types: a list of resource types (integers or strings) to search for (None = all).
    names: a list of resource names (integers or strings) to search for (None = all).
    languages: a list of resource languages (integers) to search for (None = all).

    Returns a dictionary of the form {type: {name: {language: data}}}, which might also be empty if no matching
    resources were found.
    """
    types = set(types) if types is not None else {"*"}
    names = set(names) if names is not None else {"*"}
    languages = set(languages) if languages is not None else {"*"}

    output = {}

    # Error codes for which we swallow exceptions
    _IGNORE_EXCEPTIONS = {
        ERROR_RESOURCE_DATA_NOT_FOUND,
        ERROR_RESOURCE_TYPE_NOT_FOUND,
        ERROR_RESOURCE_NAME_NOT_FOUND,
        ERROR_RESOURCE_LANG_NOT_FOUND,
    }

    # Open file
    module_handle = win32api.LoadLibraryEx(filename, 0, LOAD_LIBRARY_AS_DATAFILE)

    # Enumerate available resource types
    try:
        available_types = win32api.EnumResourceTypes(module_handle)
    except pywintypes.error as e:
        if e.args[0] not in _IGNORE_EXCEPTIONS:
            raise
        available_types = []

    if "*" not in types:
        available_types = [res_type for res_type in available_types if res_type in types]

    for res_type in available_types:
        # Enumerate available names for the resource type.
        try:
            available_names = win32api.EnumResourceNames(module_handle, res_type)
        except pywintypes.error as e:
            if e.args[0] not in _IGNORE_EXCEPTIONS:
                raise
            continue

        if "*" not in names:
            available_names = [res_name for res_name in available_names if res_name in names]

        for res_name in available_names:
            # Enumerate available languages for the resource type and name combination.
            try:
                available_languages = win32api.EnumResourceLanguages(module_handle, res_type, res_name)
            except pywintypes.error as e:
                if e.args[0] not in _IGNORE_EXCEPTIONS:
                    raise
                continue

            if "*" not in languages:
                available_languages = [res_lang for res_lang in available_languages if res_lang in languages]

            for res_lang in available_languages:
                # Read data
                try:
                    data = win32api.LoadResource(module_handle, res_type, res_name, res_lang)
                except pywintypes.error as e:
                    if e.args[0] not in _IGNORE_EXCEPTIONS:
                        raise
                    continue

                if res_type not in output:
                    output[res_type] = {}
                if res_name not in output[res_type]:
                    output[res_type][res_name] = {}
                output[res_type][res_name][res_lang] = data

    # Close file
    win32api.FreeLibrary(module_handle)

    return output


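# Usage sketch (illustrative only; hypothetical DLL path): dump all version-info
# resources. Resource type 16 is the well-known Win32 RT_VERSION constant.
#
#   resources = get_resources("example.dll", types=[16])
#   for res_name, entries in resources.get(16, {}).items():
#       for res_lang, data in entries.items():
#           print(res_name, res_lang, len(data))
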
def add_or_update_resource(filename, data, res_type, names=None, languages=None):
    """
    Update or add a single resource in the PE file with the given binary data.

    filename: path to the PE file.
    data: binary data to write to the resource.
    res_type: resource type to add/update (integer or string).
    names: a list of resource names (integers or strings) to update (None = all).
    languages: a list of resource languages (integers) to update (None = all).
    """
    if res_type == "*":
        raise ValueError("res_type cannot be a wildcard (*)!")

    names = set(names) if names is not None else {"*"}
    languages = set(languages) if languages is not None else {"*"}

    # Retrieve existing resources, filtered by the given resource type and given resource names and languages.
    resources = get_resources(filename, [res_type], names, languages)

    # Add res_type, name, language combinations that are not already present
    resources = resources.get(res_type, {})  # This is now a {name: {language: data}} dictionary

    for res_name in names:
        if res_name == "*":
            continue
        if res_name not in resources:
            resources[res_name] = {}

        for res_lang in languages:
            if res_lang == "*":
                continue
            if res_lang not in resources[res_name]:
                resources[res_name][res_lang] = None  # Just an indicator

    # Add resource to the target file, overwriting the existing resources with same type, name, language combinations.
    module_handle = win32api.BeginUpdateResource(filename, 0)
    for res_name in resources.keys():
        for res_lang in resources[res_name].keys():
            win32api.UpdateResource(module_handle, res_type, res_name, data, res_lang)
    win32api.EndUpdateResource(module_handle, 0)


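# Usage sketch (illustrative only; path and data are hypothetical): overwrite the
# application manifest. Type 24 is the well-known Win32 RT_MANIFEST constant, name 1
# is CREATEPROCESS_MANIFEST_RESOURCE_ID, and language 0 is LANG_NEUTRAL.
#
#   add_or_update_resource("dist/app.exe", manifest_bytes, 24, names=[1], languages=[0])
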
def copy_resources_from_pe_file(filename, src_filename, types=None, names=None, languages=None):
    """
    Update or add resources in the given PE file by copying them over from the specified source PE file.

    filename: path to the PE file.
    src_filename: path to the source PE file.
    types: a list of resource types (integers or strings) to add/update via copy (None = all).
    names: a list of resource names (integers or strings) to add/update via copy (None = all).
    languages: a list of resource languages (integers) to add/update via copy (None = all).
    """
    types = set(types) if types is not None else {"*"}
    names = set(names) if names is not None else {"*"}
    languages = set(languages) if languages is not None else {"*"}

    # Retrieve existing resources, filtered by the given resource types and given resource names and languages.
    resources = get_resources(src_filename, types, names, languages)

    for res_type, resources_for_type in resources.items():
        if "*" not in types and res_type not in types:
            continue
        for res_name, resources_for_type_name in resources_for_type.items():
            if "*" not in names and res_name not in names:
                continue
            for res_lang, data in resources_for_type_name.items():
                if "*" not in languages and res_lang not in languages:
                    continue
                add_or_update_resource(filename, data, res_type, [res_name], [res_lang])


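# Usage sketch (illustrative only; paths are hypothetical): copy the version-info
# resource (type 16, RT_VERSION) from a template binary onto a built executable:
#
#   copy_resources_from_pe_file("dist/app.exe", "template.exe", types=[16])
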
def remove_all_resources(filename):
    """
    Remove all resources from the given PE file.
    """
    module_handle = win32api.BeginUpdateResource(filename, True)  # bDeleteExistingResources=True
    win32api.EndUpdateResource(module_handle, False)
@@ -0,0 +1,257 @@
#-----------------------------------------------------------------------------
# Copyright (c) 2013-2023, PyInstaller Development Team.
#
# Distributed under the terms of the GNU General Public License (version 2
# or later) with exception for distributing the bootloader.
#
# The full license is in the file COPYING.txt, distributed with this software.
#
# SPDX-License-Identifier: (GPL-2.0-or-later WITH Bootloader-exception)
#-----------------------------------------------------------------------------
"""
Utilities for Windows platform.
"""

from PyInstaller import compat


def get_windows_dir():
    """
    Return the Windows directory, e.g., C:\\Windows.
    """
    windir = compat.win32api.GetWindowsDirectory()
    if not windir:
        raise SystemExit("ERROR: Cannot determine Windows directory!")
    return windir


def get_system_path():
    """
    Return the required Windows system paths.
    """
    sys_dir = compat.win32api.GetSystemDirectory()
    # Ensure C:\Windows\system32 and C:\Windows directories are always present in the PATH variable.
    # C:\Windows\system32 is valid even for 64-bit Windows. Access to DLLs is transparently redirected to
    # C:\Windows\syswow64 for 32-bit applications (WOW64 file-system redirection).
    # See http://msdn.microsoft.com/en-us/library/aa384187(v=vs.85).aspx
    return [sys_dir, get_windows_dir()]


def get_pe_file_machine_type(filename):
    """
    Return the machine type code from the header of the given PE file.
    """
    import pefile

    with pefile.PE(filename, fast_load=True) as pe:
        return pe.FILE_HEADER.Machine


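# Usage sketch (illustrative only; hypothetical path): check for an x86-64 binary.
# 0x8664 is IMAGE_FILE_MACHINE_AMD64 from the PE specification.
#
#   if get_pe_file_machine_type("dist/app.exe") == 0x8664:
#       print("x86-64 executable")
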
def set_exe_build_timestamp(exe_path, timestamp):
    """
    Modify the executable's build timestamp by updating values in the corresponding PE headers.
    """
    import pefile

    with pefile.PE(exe_path, fast_load=True) as pe:
        # Manually perform a full load. We need it to load all headers, but specifying it in the constructor triggers
        # byte statistics gathering that takes forever with large files. So we try to work around that...
        pe.full_load()

        # Set build timestamp.
        # See: https://0xc0decafe.com/malware-analyst-guide-to-pe-timestamps
        timestamp = int(timestamp)
        # Set timestamp field in FILE_HEADER
        pe.FILE_HEADER.TimeDateStamp = timestamp
        # MSVC-compiled executables contain (at least?) one DIRECTORY_ENTRY_DEBUG entry that also contains a timestamp
        # with the same value as set in FILE_HEADER. So modify that as well, as long as it is set.
        debug_entries = getattr(pe, 'DIRECTORY_ENTRY_DEBUG', [])
        for debug_entry in debug_entries:
            if debug_entry.struct.TimeDateStamp:
                debug_entry.struct.TimeDateStamp = timestamp

        # Generate updated EXE data
        data = pe.write()

    # Rewrite the exe
    with open(exe_path, 'wb') as fp:
        fp.write(data)


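# Usage sketch (illustrative only; hypothetical path): pin the build timestamp for
# reproducible builds, e.g., following the SOURCE_DATE_EPOCH convention:
#
#   import os
#   set_exe_build_timestamp("dist/app.exe", int(os.environ.get("SOURCE_DATE_EPOCH", "0")))
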
def update_exe_pe_checksum(exe_path):
    """
    Compute the executable's PE checksum, and write it to the PE headers.

    This optional checksum is supposed to protect the executable against corruption, but some anti-virus software
    has taken to flagging anything without it set correctly as malware. See issue #5579.
    """
    import pefile

    # Compute checksum using our equivalent of MapFileAndCheckSumW - for large files, it is significantly faster
    # than the pure-python pefile.PE.generate_checksum(). However, it requires the file to be on disk (i.e., it
    # cannot operate on a memory buffer).
    try:
        checksum = compute_exe_pe_checksum(exe_path)
    except Exception as e:
        raise RuntimeError("Failed to compute PE checksum!") from e

    # Update the checksum
    with pefile.PE(exe_path, fast_load=True) as pe:
        pe.OPTIONAL_HEADER.CheckSum = checksum

        # Generate updated EXE data
        data = pe.write()

    # Rewrite the exe
    with open(exe_path, 'wb') as fp:
        fp.write(data)


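# Usage sketch (illustrative only; hypothetical path): refresh the checksum after
# any modification of the executable's headers or appended data:
#
#   update_exe_pe_checksum("dist/app.exe")
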
def compute_exe_pe_checksum(exe_path):
    """
    This is a replacement for the MapFileAndCheckSumW function. As noted in the MSDN documentation, Microsoft's
    implementation of MapFileAndCheckSumW internally calls its ASCII variant (MapFileAndCheckSumA), and therefore
    cannot handle paths that contain characters that are not representable in the current code page.
    See: https://docs.microsoft.com/en-us/windows/win32/api/imagehlp/nf-imagehlp-mapfileandchecksumw

    This function is based on Wine's implementation of MapFileAndCheckSumW, and due to being based entirely on
    the pure widechar-API functions, it is not limited by the current code page.
    """
    # ctypes bindings for relevant win32 API functions
    import ctypes
    from ctypes import windll, wintypes

    INVALID_HANDLE = wintypes.HANDLE(-1).value

    GetLastError = ctypes.windll.kernel32.GetLastError
    GetLastError.argtypes = ()
    GetLastError.restype = wintypes.DWORD

    CloseHandle = windll.kernel32.CloseHandle
    CloseHandle.argtypes = (
        wintypes.HANDLE,  # hObject
    )
    CloseHandle.restype = wintypes.BOOL

    CreateFileW = windll.kernel32.CreateFileW
    CreateFileW.argtypes = (
        wintypes.LPCWSTR,  # lpFileName
        wintypes.DWORD,  # dwDesiredAccess
        wintypes.DWORD,  # dwShareMode
        wintypes.LPVOID,  # lpSecurityAttributes
        wintypes.DWORD,  # dwCreationDisposition
        wintypes.DWORD,  # dwFlagsAndAttributes
        wintypes.HANDLE,  # hTemplateFile
    )
    CreateFileW.restype = wintypes.HANDLE

    CreateFileMappingW = windll.kernel32.CreateFileMappingW
    CreateFileMappingW.argtypes = (
        wintypes.HANDLE,  # hFile
        wintypes.LPVOID,  # lpFileMappingAttributes
        wintypes.DWORD,  # flProtect
        wintypes.DWORD,  # dwMaximumSizeHigh
        wintypes.DWORD,  # dwMaximumSizeLow
        wintypes.LPCWSTR,  # lpName
    )
    CreateFileMappingW.restype = wintypes.HANDLE

    MapViewOfFile = windll.kernel32.MapViewOfFile
    MapViewOfFile.argtypes = (
        wintypes.HANDLE,  # hFileMappingObject
        wintypes.DWORD,  # dwDesiredAccess
        wintypes.DWORD,  # dwFileOffsetHigh
        wintypes.DWORD,  # dwFileOffsetLow
        wintypes.DWORD,  # dwNumberOfBytesToMap
    )
    MapViewOfFile.restype = wintypes.LPVOID

    UnmapViewOfFile = windll.kernel32.UnmapViewOfFile
    UnmapViewOfFile.argtypes = (
        wintypes.LPCVOID,  # lpBaseAddress
    )
    UnmapViewOfFile.restype = wintypes.BOOL

    GetFileSizeEx = windll.kernel32.GetFileSizeEx
    GetFileSizeEx.argtypes = (
        wintypes.HANDLE,  # hFile
        wintypes.PLARGE_INTEGER,  # lpFileSize
    )
    GetFileSizeEx.restype = wintypes.BOOL

    CheckSumMappedFile = windll.imagehlp.CheckSumMappedFile
    CheckSumMappedFile.argtypes = (
        wintypes.LPVOID,  # BaseAddress
        wintypes.DWORD,  # FileLength
        wintypes.PDWORD,  # HeaderSum
        wintypes.PDWORD,  # CheckSum
    )
    CheckSumMappedFile.restype = wintypes.LPVOID

    # Open file
    hFile = CreateFileW(
        ctypes.c_wchar_p(exe_path),
        0x80000000,  # dwDesiredAccess = GENERIC_READ
        0x00000001 | 0x00000002,  # dwShareMode = FILE_SHARE_READ | FILE_SHARE_WRITE
        None,  # lpSecurityAttributes = NULL
        3,  # dwCreationDisposition = OPEN_EXISTING
        0x80,  # dwFlagsAndAttributes = FILE_ATTRIBUTE_NORMAL
        None,  # hTemplateFile = NULL
    )
    if hFile == INVALID_HANDLE:
        err = GetLastError()
        raise RuntimeError(f"Failed to open file {exe_path}! Error code: {err}")

    # Query file size
    fileLength = wintypes.LARGE_INTEGER(0)
    if GetFileSizeEx(hFile, fileLength) == 0:
        err = GetLastError()
        CloseHandle(hFile)
        raise RuntimeError(f"Failed to query file size! Error code: {err}")
    fileLength = fileLength.value
    if fileLength > (2**32 - 1):
        raise RuntimeError("Executable size exceeds maximum allowed executable size on Windows (4 GiB)!")

    # Map the file
    hMapping = CreateFileMappingW(
        hFile,
        None,  # lpFileMappingAttributes = NULL
        0x02,  # flProtect = PAGE_READONLY
        0,  # dwMaximumSizeHigh = 0
        0,  # dwMaximumSizeLow = 0
        None,  # lpName = NULL
    )
    if not hMapping:
        err = GetLastError()
        CloseHandle(hFile)
        raise RuntimeError(f"Failed to map file! Error code: {err}")

    # Create map view
    baseAddress = MapViewOfFile(
        hMapping,
        4,  # dwDesiredAccess = FILE_MAP_READ
        0,  # dwFileOffsetHigh = 0
        0,  # dwFileOffsetLow = 0
        0,  # dwNumberOfBytesToMap = 0
    )
    # On failure, MapViewOfFile returns NULL, which ctypes maps to None rather than 0.
    if not baseAddress:
        err = GetLastError()
        CloseHandle(hMapping)
        CloseHandle(hFile)
        raise RuntimeError(f"Failed to create map view! Error code: {err}")

    # Finally, compute the checksum
    headerSum = wintypes.DWORD(0)
    checkSum = wintypes.DWORD(0)
    ret = CheckSumMappedFile(baseAddress, fileLength, ctypes.byref(headerSum), ctypes.byref(checkSum))
    if ret is None:
        err = GetLastError()

    # Cleanup
    UnmapViewOfFile(baseAddress)
    CloseHandle(hMapping)
    CloseHandle(hFile)

    if ret is None:
        raise RuntimeError(f"CheckSumMappedFile failed! Error code: {err}")

    return checkSum.value
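# Verification sketch (illustrative only; hypothetical path): the result should match
# pefile's pure-python checksum computation, which is slower but platform-independent:
#
#   import pefile
#   with pefile.PE("dist/app.exe", fast_load=True) as pe:
#       assert compute_exe_pe_checksum("dist/app.exe") == pe.generate_checksum()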