2024-09-18  undo adaptations for setuptools vendored packages  (feat/uv_pip_compile)  [toofar]
See 433074c6817d, adf39e9f72c2, db83a82fe118 & 78a74a2e2afe. Now that we are using pip-compile to build requirements lock files, instead of pip freeze, these things shouldn't be showing up in the results.

It looks like `typeguard` was dropped in a192a3d9c7d0e87 "initial compile based bump"; I guess that was being pulled in by pip freeze too.
2024-09-18  Enable delightful pip compile lineage annotations  [toofar]
I find these comments useful to show why packages are included in the final compiled requirements files. This required a small change to `recompile_requirements` to ignore these new comment lines (it was saying there was a new requirement with an empty name that needed a changelog entry).

The addition of the `os.path.realpath()` call is to clean up the paths of the requirements files in the annotations, e.g.:

     check-manifest==0.49
    -    # via -r scripts/dev/../../misc/requirements/requirements-check-manifest.txt-raw
    +    # via -r misc/requirements/requirements-check-manifest.txt-raw
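The realpath cleanup can be sketched like this (the repo path below is made up for illustration):

```python
import os.path

# Hypothetical annotated path as the compiler would emit it: the script's
# own relative location leaks into the "# via -r ..." annotation.
raw = "/repo/scripts/dev/../../misc/requirements/requirements-check-manifest.txt-raw"

# realpath() collapses the "scripts/dev/../.." hop (and resolves any
# symlinks), leaving the clean path for the annotation.
clean = os.path.realpath(raw)
print(clean)  # /repo/misc/requirements/requirements-check-manifest.txt-raw
```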
2024-09-18  Remove slow test with poor isolation  [toofar]
It would have been convenient to have an end2end test to make sure that the output of the two requirements file compilation methods had the same results. But I think there is a bit too much stuff going on in `recompile_requirements` for that atm.

Making the local repo in the temp path and installing fake packages there works fine, although setting up the virtualenv isn't quick. But the script currently installs pip, setuptools and wheel, which means we have to either 1) hit pypi.org in the tests to get them, or 2) download them at the start of the test suite and put them in the local repo. Not sure it's worth the effort to go down this rabbit hole when we already have a dozen real requirements files to verify the change with.

I'm leaving this in the commit history because it was fun to get the local repo working!
2024-09-18  Add pip and setuptools back to tox requirement file  [toofar]
For the pip freeze backend, pip is being passed `--all` when the tox requirements file is being processed so that pip and setuptools are included in the requirements file. This was added in 922dca039b8d88bbca6 for reasons I haven't fully grokked.

This commit adds similar behaviour for the pip compile backend via:

1. don't add the `--no-emit-package` args for the tox requirements file
2. add pip and setuptools to the tox requirements file

It seems that pip and setuptools aren't even requirements of tox, but they are being included in the compiled requirements file anyway. Why aren't they included in the raw requirements file? I don't know, but from what I can figure it's not going to harm anything to have them in there.
2024-09-18  Exclude setuptools from pip-compile output  [toofar]
This is to match the `pip freeze` requirements compilation method. It's not clear to me if we actually want this behaviour or not. It seems `pip freeze` will exclude dependencies of itself (https://pip.pypa.io/en/stable/cli/pip_freeze/#cmdoption-all), even if there are other packages installed that depend on those dependencies.

`uv pip compile`, and now the original `pip-compile`, have both decided to include setuptools in generated requirements files:

https://github.com/astral-sh/uv/issues/1353
https://github.com/jazzband/pip-tools/issues/989#issuecomment-1134985118

So I'm not sure if we have a reason for going against this here or if they were just being excluded because that's what pip freeze does. Hopefully we can drop this commit and use the default behaviour in the future. For now, while I'm trying to provide the new backend, it's here to make the diff of generated files more precise.

The message prefix used to identify a pip compile comment was taken from these examples:

    # The following packages were excluded from the output:
    # setuptools

    # The following packages are considered to be unsafe in a requirements file:
    # setuptools==41.4.0  # via protobuf
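A sketch (not the actual qutebrowser script) of using the comment prefixes quoted above to find which packages the compiler excluded or flagged as unsafe:

```python
# These two headers are the prefixes quoted in the commit message above.
EXCLUDE_HEADERS = (
    "# The following packages were excluded from the output:",
    "# The following packages are considered to be unsafe in a requirements file:",
)


def excluded_packages(text):
    """Collect package names listed under either exclusion header.

    Entries are assumed to look like "# setuptools" or
    "# setuptools==41.4.0  # via protobuf" (an assumption for this sketch).
    """
    names = set()
    grabbing = False
    for line in text.splitlines():
        line = line.strip()
        if line in EXCLUDE_HEADERS:
            grabbing = True
            continue
        if grabbing and line.startswith("#"):
            names.add(line.lstrip("# ").split("==")[0].split()[0])
        else:
            grabbing = False
    return names


sample = (
    "requests==2.32.3\n"
    "# The following packages are considered to be unsafe in a requirements file:\n"
    "# setuptools==41.4.0\n"
)
print(excluded_packages(sample))  # {'setuptools'}
```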
2024-09-18  initial compile based bump  [toofar]
This shows the naive difference between the pip install and the pip compile based requirements compilation methods. After this I'll add a few commits to reduce the diff to only non-functional changes (e.g. line order). Some of the later commits may undo beneficial behaviour in favour of compatibility with the existing pip install based method.
2024-09-18  baseline pip freeze based requirements update run  [toofar]
2024-09-18  normalize package names in pip freeze output too  [toofar]
This lets us more easily compare the output of runs of the different requirements compiling methods.
2024-09-18  Add .git-blame-ignore-revs file to ignore normalization commit  [toofar]
This should make git blame a bit more useful when you want to see when the version of a requirement last changed. Tip from https://www.stefanjudis.com/today-i-learned/how-to-exclude-commits-from-git-blame/#fun-fact%3A-github-picks-up-%60.git-blame-ignore-revs%60
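GitHub picks the file up automatically, but local git has to be told about it. A throwaway-repo sketch of the wiring (file names and commit messages invented):

```shell
set -e
cd "$(mktemp -d)"          # a throwaway repo, standing in for a real checkout
git init -q .
git config user.email you@example.com
git config user.name you

echo "PyYAML==6.0.1" > requirements.txt
git add requirements.txt
git commit -qm "pin PyYAML"

# A later commit that only normalizes the package name
echo "pyyaml==6.0.1" > requirements.txt
git commit -qam "normalize package names"

# List that commit's hash in the ignore file and point git blame at it
git rev-parse HEAD > .git-blame-ignore-revs
git config blame.ignoreRevsFile .git-blame-ignore-revs

# blame now attributes the pin to the commit that last meaningfully changed it
git blame --line-porcelain requirements.txt | grep "^summary"
```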
2024-09-18  normalize package names in requirements lock files  [toofar]
`pip freeze` writes out package names as specified by the packages, `pip compile` writes out normalized package names. Sticking to normalized names in robot-written files lets us more easily compare the output of the two different requirement compiling methods and also means we can stop worrying about packages changing between `_` and `-` in dependency updates.

Script used to do this change was:

    import re
    import glob

    def normalize_pkg(name):
        """Normalize a package name for comparisons.

        From https://packaging.python.org/en/latest/specifications/name-normalization/#name-normalization

        `pip freeze` passes file names through in whatever case they are in
        in the package, pip-compile will normalize them.
        """
        if "/" in name:
            # don't change file paths
            return name
        return re.sub(r"[-_.]+", "-", name).lower()

    def normalize_line(line):
        if not line or not line.strip():
            return line
        if "==" not in line.split()[0]:
            return line
        pkg, version = line.split("==", maxsplit=1)
        return "==".join([normalize_pkg(pkg), version])

    for name in ["requirements.txt"] + glob.glob("misc/requirements/requirements*.txt"):
        with open(name) as f:
            before_lines = f.readlines()
        after_lines = [normalize_line(line) for line in before_lines]
        with open(name, mode="w") as f:
            f.writelines(after_lines)
2024-09-18  Add tests for comments supported by recompile_requirements  [toofar]
Since I was looking at how hard it would be to support using pip-compile to recompute requirements, I was worried that I would break the markers we support in the raw requirements files. This adds two tests:

* disabled_test_markers_real_pip_and_venv

  A test that sets up a local python index and runs the real pip/uv binaries. It works (and it was fun to set up a package index factory) but has a couple of downsides:

  1. it hits the real pypi index, which is not great in a test. This can be prevented by removing the EXTRA bit from the INDEX_URL env vars and pre-downloading pip, wheel, setuptools and uv to the test repo (and adding index.htmls for them). But because of the next item I'm not sure it's worth the effort of keeping this test around
  2. it's slow, because of having to download packages from the internet (even if we pre-cached them it would still have to download them; I guess we could include a zip of fixed/vendored versions, but that will probably require maintenance over time) and because it calls venv to make new virtual environments, which isn't the quickest operation (maybe uv venv is quicker?)

* test_markers_in_comments

  Tests just the comment reading and line manipulation logic. Could be made a bit more pure by just calling read_comments() and convert_line(), but that wouldn't test the "add" marker.
2024-09-18  Combine the two "build_requirements" methods  [toofar]
There was a fair bit of duplicate code, so I've pulled the "take a list of requirements, give me a new one" part out into separate methods. The stuff around parsing the output can stay common, thankfully! Even if we drop one of the methods, this level of abstraction is probably fine to keep.
2024-09-18  Normalize module level CHANGELOG_URLS for misc check  [toofar]
The CHANGELOG_URLS variable is imported by the `./scripts/dev/misc_checks.py changelog-urls` check. Since there is a bit of churn in package name case in this change (and a while back, when a bunch of packages switched between underscores and hyphens), update this variable at import time so that the checker will be looking at normalized names too.
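The import-time normalization amounts to rewriting the dict's keys once. A minimal sketch (the entries below are invented, not the real CHANGELOG_URLS contents):

```python
import re


def normalize(name):
    # PEP 503 style name normalization, same rule the lock files use
    return re.sub(r"[-_.]+", "-", name).lower()


# Hypothetical shape of the module-level mapping
CHANGELOG_URLS = {
    "PyYAML": "https://github.com/yaml/pyyaml/blob/main/CHANGES",
    "typing_extensions": "https://github.com/python/typing_extensions/blob/main/CHANGELOG.md",
}

# Done once at import time, so the checker only ever sees normalized names
CHANGELOG_URLS = {normalize(pkg): url for pkg, url in CHANGELOG_URLS.items()}
```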
2024-09-18  Use pip compile to find new deps instead of install and freeze  [toofar]
In #8269 we saw some packages leaking into the pip freeze output that we don't want in the requirements files (if setuptools isn't supposed to be in there, why should its dependencies be?). I've also greatly missed the comments that pip-compile puts in requirements.txt files explaining where indirect dependencies come from. So I took the opportunity to switch our tooling for updating and parsing new dependencies and their versions to use pip-compile instead of `pip install -U && pip freeze`.

It turned out to not be a big change because the pip freeze output is largely compatible with requirements files (we are writing directly to one after all). So we just need to switch what commands we are running and triage any compatibility issues.

I chose `uv pip compile` instead of `pip-compile` because I like what uv is doing (fast, aiming for compatibility, consolidating a confusing ecosystem into a single tool). But pip-compile/pip-tools should do the same job if we want to go that route.

The biggest differences are:

* outputs normalized names: this generally results in a larger diff than otherwise (I propose we go through and regenerate all the requirements files in one go, and maybe add that commit to a blame ignore file) and requires our comparison logic to deal with normalized package names everywhere
* setuptools and pip not included in tox requirements file - not sure what to do about that yet, should they be in the .txt-raw file?

TODO:

* remove support for pip_args?
* change markers in raw files to lower case? Ideally don't require it: if a human can write them in any case and a robot can normalize them, we should do that. And if there are patterns with `._` in them as part of names, how do we handle that?
* pull out similar bits of the `build_requirements*` methods
* maybe make it so you can pass `requirements=None` to `init_venv` to make it not install stuff, install uv, do the uv invocation, gate all that behind a `--method="freeze|compile"` arg?
* add pip and setuptools to tox requirements file?
* basename requirements file names so they don't have `script_path/../../` in them in the annotated version
* add tests for the markers (with inputs of differing cases) to make sure they all still work
* update changelog check used in CI to normalize names too
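The comparison logic mentioned above needs to treat names from both backends as equal. A minimal sketch of normalizing and comparing the two outputs (the sample pins are invented):

```python
import re


def normalize(name):
    # PEP 503 style normalization: fold -, _ and . runs, lowercase
    return re.sub(r"[-_.]+", "-", name).lower()


def parse_pins(text):
    """Extract {normalized name: version} from requirements-style output,
    skipping blank lines and "# via ..." annotation comments."""
    pins = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, _, version = line.partition("==")
        if version:
            pins[normalize(name)] = version.split()[0]
    return pins


# Invented sample outputs from the two backends
freeze_output = "PyYAML==6.0.1\ntyping_extensions==4.12.2\n"
compile_output = (
    "pyyaml==6.0.1\n"
    "    # via -r requirements.txt-raw\n"
    "typing-extensions==4.12.2\n"
)

# After normalization the two methods agree despite differing spellings
assert parse_pins(freeze_output) == parse_pins(compile_output)
```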
2024-09-18  add changelog for jaraco.collections  [toofar]
2024-09-17  Update dependencies  [qutebrowser bot]
2024-09-17  Include platformdirs in test requirements as a workaround too  [toofar]
See the previous commit db83a82fe118c
2024-09-17  Include platformdirs in dev requirements as a workaround  [toofar]
See 433074c6817daa2, this is the same cause: an older version of a package being included in requirements files because setuptools injects its vendored packages into sys.path and we use pip freeze to build lock files. Then when you install two requirements files at the same time, they end up having conflicting versions.

This at least means we include the latest version, which will do until we move to a method of generating lock files that works purely off of the raw requirements file.
2024-09-06  Merge pull request #8293 from qutebrowser/update-dependencies  (fix/always_update_requirements_when_recompiling)  [toofar]
Update dependencies
2024-09-06  Adjust permission tests for changes to 6.8 permission storage feature  [toofar]
Qt have updated their permission storage feature so it respects the setting our basedir feature uses, so now all the tests that use "Given I have a fresh instance" are passing.

The remaining failing ones do pass if I make them run in a fresh instance, but I am leaving them as xfail because a) opening a new instance is slow, and b) the new upstream behaviour causes a regression in the qutebrowser behaviour (you don't get re-prompted where you would have been previously), so I feel like it is correct for some tests to be failing! We have to set AskEveryTime at some point and we can address them then.
2024-09-06  Bump 6.8 beta4 security patch version  [toofar]
2024-09-06  Ignore no dictionary errors on CI  [toofar]
The message is:

    The following paths were searched for Qt WebEngine dictionaries:
        /tmp/qutebrowser-basedir-qrhbqblr/data/qtwebengine_dictionaries
    but could not find it. Spellchecking can not be enabled.

Tests are failing on "Logged unexpected errors".
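Since the basedir path in that message is randomized per test run, any ignore rule has to match it loosely. A sketch of such a pattern (the regex is invented; the real test harness's ignore mechanism may differ):

```python
import re

# Match the dictionary warning regardless of the randomized basedir path
DICT_WARNING = re.compile(
    r"The following paths were searched for Qt WebEngine dictionaries:\n"
    r".*qtwebengine_dictionaries\n"
    r"but could not find it\. Spellchecking can not be enabled\."
)

msg = (
    "The following paths were searched for Qt WebEngine dictionaries:\n"
    "/tmp/qutebrowser-basedir-qrhbqblr/data/qtwebengine_dictionaries\n"
    "but could not find it. Spellchecking can not be enabled."
)
assert DICT_WARNING.search(msg) is not None
```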
2024-09-06  fix changelog urls  [toofar]
2024-09-03  Update dependencies  [qutebrowser bot]
2024-09-03  also update jaraco-context as a workaround  [toofar]
see previous commit
2024-09-03  Add importlib_resources to tests requirements file as workaround  [toofar]
Currently the dependency update job is failing[1] because one of the tests installs multiple requirements files before running the tests, and it claims they have conflicting versions of `importlib_resources` (6.4.0 vs 6.4.4). 6.4.0 is in the pinned files and there is a 6.4.4 available.

Looking through the logs, the first time I see importlib_resources==6.4.0 is when printing the requirements for the `test` requirements file. But it's not mentioned at all when installing that file, which makes me think it found its way into the virtualenv by some other means.

Looking at git blame for the test requirements lock file, it looks like importlib_resources was introduced in https://github.com/qutebrowser/qutebrowser/pull/8269 and indeed I can see version 6.4.0 in setuptools' vendored folder[2]. So it looks like this is another issue caused by setuptools adding their vendored packages into sys.path.

Options I can see for resolving this:

a. add importlib_resources as a dependency in requirements.txt-raw so that we always pull down the newest one, even though we don't need it
b. add an @ignore line for importlib_resources
   * I think in the unlikely event we end up needing it then it being ignored might be hard to spot
c. drop python 3.8 support
d. switch to a requirements compilation method that doesn't use `pip freeze`

I've chosen (a) here because I think it's less surprising than (b), less work than (c), and I already have a PR up for (d). And it's only pulled down for 3.8 anyhow, so we'll drop this workaround when we drop that.

[1]: https://github.com/qutebrowser/qutebrowser/actions/runs/10660624684/job/29544897516
[2]: https://github.com/pypa/setuptools/tree/main/setuptools/_vendor
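Option (b) would rely on the ignore markers the raw files support. A toy parser for such a marker comment (the `#@ ignore:` syntax here is purely illustrative, not the script's exact format):

```python
def read_ignores(lines):
    """Collect names from hypothetical '#@ ignore: a, b' marker comments."""
    ignored = set()
    for line in lines:
        line = line.strip()
        if line.startswith("#@ ignore:"):
            rest = line[len("#@ ignore:"):]
            ignored.update(part.strip() for part in rest.split(","))
    return ignored


raw = [
    "pytest",
    "#@ ignore: importlib_resources, importlib-metadata",
]
print(read_ignores(raw))
```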
2024-08-23  test: Ignore new libEGL warnings  [Florian Bruhin]
These seem to fail all tests on Archlinux-unstable.
2024-08-22  Add missing copyright / license headers to qutebrowser.qt  [Florian Bruhin]
Done via:

    reuse annotate \
        --exclude-year \
        -c 'Florian Bruhin (The Compiler) <mail@qutebrowser.org>' \
        --license="GPL-3.0-or-later" \
        qutebrowser/qt/*.py
2024-08-18  Refer to mkvenv script by full path in install docs  [toofar]
Might help with people copying and pasting commands. I don't think the script installs itself in bin/ in the virtualenv it creates? Closes: #8263
2024-08-13  Reset PyInstaller environment on :restart  [Florian Bruhin]
Starting with PyInstaller 6.10 (6.9?), we're supposed to tell PyInstaller when we restart our application (and a subprocess should outlive this process). In their words:

    The above requirement was introduced in PyInstaller 6.9, which changed
    the way the bootloader treats a process spawned via the same executable
    as its parent process. Whereas previously the default assumption was
    that it is running a new instance of (the same) program, the new
    assumption is that the spawned process is some sort of a worker
    subprocess that can reuse the already-unpacked resources. This change
    was done because the worker-process scenarios are more common, and more
    difficult to explicitly accommodate across various multiprocessing
    frameworks and other code that spawns worker processes via
    sys.executable.

https://pyinstaller.org/en/stable/common-issues-and-pitfalls.html#independent-subprocess
https://pyinstaller.org/en/stable/CHANGES.html (6.10)

While, from a quick test on Windows, things still worked without setting the variable (possibly because we don't use a onefile build), it still seems reasonable to do what PyInstaller recommends doing.

Follow-up to #8269.
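A sketch of acting on that recommendation, based on the environment variable named in the linked PyInstaller docs (this is not qutebrowser's exact code):

```python
import os


def restart_env():
    """Build the environment for the new instance spawned on :restart."""
    env = os.environ.copy()
    # PyInstaller >= 6.10: mark the spawned process as a fresh program run,
    # rather than a worker subprocess reusing already-unpacked resources.
    env["PYINSTALLER_RESET_ENVIRONMENT"] = "1"
    return env
```

The returned dict would then be passed as `env=` to whatever spawns the new process (e.g. `subprocess.Popen`).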
2024-08-13  Simplify type annotation  [Florian Bruhin]
See #8269
2024-08-13  Merge pull request #8269 from qutebrowser/update-dependencies  [toofar]
Update dependencies
2024-08-12  adjust babel changelog (case change)  [toofar]
2024-08-12  Add changelog URLs  [toofar]
A few new vendored packages are showing up from setuptools for environments where pkg_resources is being imported for some reason. I don't think these requirements should be in our requirements files: they aren't direct dependencies and they aren't declared as dependencies of setuptools (and we are currently excluding setuptools from our requirements files anyway, although apparently that is not the right thing to do these days). These are actually not installed as normal packages but are vendored packages shipped with setuptools.

Options I see to deal with them:

1. suck it up and add them to the compiled requirements files
   * not ideal, but should be harmless. They are real packages that the setuptools authors have chosen to use
2. exclude these new packages using the markers in comments
   * maybe, but seems like it could lead to issues in the future if any of these packages start getting declared as proper dependencies
3. find out where pkg_resources is being imported and stop it
   * I don't seem to be able to reproduce this behaviour locally, even when using a py3.8 docker container. And we are literally only running `pip freeze` via subprocess, what could the difference be?
   * I don't particularly want to delve into the arcane python packaging stuff, it seems to be layers and layers of very specific issues and old vendored packages
4. stop using pip freeze to compile requirements files and just compute them based off of the raw files themselves
   * Doesn't give us the chance to use stuff that we don't depend on but happens to be installed. We get other nice things with this too

This commit does (1). I'll open a follow up PR to do (4).
2024-08-12  mypy: adapt new type hints to pyqt5  [toofar]
Ah! I'm having flashbacks to last year.

1. pyqt5 has plural enum names = define a conditional type variable
2. pyqt5 doesn't wrap all the nullable things in Optional = sneakily make the existing overload function signature conditional

There might be some other way to solve this, not sure. I know we have qtutils.add_optional(), but in this case it's complaining that the signature doesn't match the parent. Narrowing or widening the type of the returned object doesn't affect the function signature. Possibly we could define our own type variable MaybeOptional...
2024-08-12  mypy: Attempt to extract base class from completion categories  [toofar]
The methods in `completionmodel.py` dealing with completion categories were annotated with `QAbstractItemModel`. In mypy's latest 1.11 update it correctly pointed out that there is code relying on some attributes, like `name`, being on the categories, but `QAbstractItemModel` didn't implement those attributes.

This commit adds a new base class for completion categories which defines the extra attributes we expect. It also changes the type hints to ensure all list categories inherit from it.

There are a couple of downsides to the current implementation:

* It's using multiple inheritance
  * the completionmodel code currently expects categories to have all the methods of `QAbstractItemModel` plus a few other attributes. Each of the categories inherits from a different Qt model, so we can't just remove the Qt model from their class definition.
  * trying to extract the Qt models to a `widget` class is way too much work to fit in a dependency update, and I'm not sure it'll be the right thing to do because the categories are primarily Qt models, so we would have to proxy most methods. Perhaps if they added their extra metadata to a central registry or something
  * I tried using a typing.Protocol for BaseCategory, but when trying to make it also inherit from QAbstractItemModel it got grumpy at me
* It doesn't enforce that the attributes are actually set
  * it makes mypy happy that they are there, but there is nothing warning child classes that they have forgotten to set them. Mypy does at least warn about categories that don't inherit from `BaseCategory`, so implementors will hopefully go there and look at it.
  * Apparently you can do some stuff with abstract properties, that might even have type hinting support. But that's a bit much for me to want to pile in there tonight

At least the type hints in `completionmodel.py` are more correct now!
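The shape of the multiple-inheritance pattern described above, with a stand-in for the Qt model class so it runs without Qt (class and attribute names are illustrative, not the exact qutebrowser API):

```python
class QtItemModelStandIn:
    """Stand-in for QAbstractItemModel in this sketch."""

    def rowCount(self):
        return 0


class BaseCategory:
    """Declares the extra attributes completionmodel expects on categories."""

    name: str  # annotation only: nothing forces subclasses to set it


class ListCategory(QtItemModelStandIn, BaseCategory):
    """A concrete category: a Qt model plus the extra metadata."""

    def __init__(self, name):
        self.name = name


cat = ListCategory("quickmarks")
# Type checkers can now rely on both the model interface and .name
assert isinstance(cat, BaseCategory)
assert cat.rowCount() == 0 and cat.name == "quickmarks"
```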
2024-08-12  Fix more type hints  [toofar]
The new mypy 1.11 update got smarter and raised some issues; they appear to be correct in all cases.

There are several `type: ignore[unreachable]` comments in conditionals on `sys.stderr` being None, which were introduced in a commit specifically to handle a case where `sys.stderr` could be None. So presumably the ignore comments were just to shut mypy up when it was making mistakes.

In `debug.py` it was complaining that the class handling branch was unreachable, because the type hint was overly restrictive. We do indeed handle both classes and objects.

`log.py` got some extra Optional annotations around a variable that isn't set if `sys.stderr` is None.
2024-08-12  Remove `callback` arg to webkit print preview  [toofar]
mypy 1.11 has new and improved support for checking partial functions, and it works great! It says:

    qutebrowser/components/misccommands.py: note: In function "_print_preview":
    qutebrowser/components/misccommands.py:74: error: Unexpected keyword argument "callback" for "to_printer" of "AbstractPrinting"  [call-arg]
            diag.paintRequested.connect(functools.partial(
            ^
    qutebrowser/browser/browsertab.py:269: note: "to_printer" of "AbstractPrinting" defined here

We indeed removed the callback arg in 377749c76f7080507dc64. And running `:print --preview` on webkit crashes with:

    TypeError: WebKitPrinting.to_printer() got an unexpected keyword argument 'callback'

With this change print preview works again (on webkit), which I'm a little surprised by!
2024-08-12  Adjust some type hints to better match parent classes  [toofar]
mypy 1.11 has stricter checking of the type signatures of overridden methods: https://github.com/python/mypy/blob/master/CHANGELOG.md#stricter-checks-for-untyped-overrides

There's a couple of places where I added type hints and had to duplicate the default kwarg value from the parent.

In `completionmodel.py` it was complaining that the type signature of `parent()` didn't match that of `QAbstractItemModel` and `QObject`. I've changed it to make mypy happy, and incidentally made it so the positional arg is optional, otherwise it's impossible to call `QObject.parent()`. Options that I see:

1. support both variants of parent() - what I've done, the technically correct solution
2. have the two overload definitions but in the actual implementation make the positional argument required - would mean one overload signature was a lie, but would make it more clear how to call `CompletionModel.parent()`
3. do type: ignore[override] and leave it as it was

In the end I don't expect there to be many callers of `CompletionModel.parent(child)`. I also added a few more function type hints to `completionmodel.py` while I was there. Not all of them though!

In `objreg.py` I expanded the use of `_IndexType` because, as 7b9d70203fa says, the window register uses int as the key.
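Option 1 can be sketched with Qt types swapped for stand-ins (a sketch of the pattern, not the real qutebrowser signatures):

```python
from typing import Optional, Union


class FakeIndex:
    """Stand-in for QModelIndex."""


class FakeObject:
    """Stand-in for QObject."""


class CompletionModelSketch:
    def parent(self, child: Optional[FakeIndex] = None) -> Union[FakeObject, FakeIndex]:
        if child is None:
            # QObject.parent() style: no argument, returns the owning object
            return FakeObject()
        # QAbstractItemModel.parent(child) style: returns the parent index
        return FakeIndex()


model = CompletionModelSketch()
assert isinstance(model.parent(), FakeObject)
assert isinstance(model.parent(FakeIndex()), FakeIndex)
```

Making the positional argument optional is what lets one method satisfy both parent signatures at once.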
2024-08-12  Update dependencies  [qutebrowser bot]
2024-08-04  Fix crash when the renderer process terminates for an unknown reason  [Florian Bruhin]
With PyQt 6, this gets represented as QWebEnginePage.RenderProcessTerminationStatus(-1), which is != -1, thus leading to a KeyError. Updating to a RenderProcessTerminationStatus enum value works fine on both PyQt5 and PyQt6.
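For illustration, one defensive way to avoid a KeyError on out-of-range values like that -1 is a lookup with a fallback (names and mapping invented; this is not necessarily the fix this commit used):

```python
# Hypothetical status-to-description mapping; Qt can hand back values
# outside it, e.g. RenderProcessTerminationStatus(-1) for "unknown".
STATUS_MAP = {
    0: "normal exit",
    1: "abnormal exit",
    2: "crashed",
    3: "killed",
}


def termination_status_str(status):
    # .get() with a default never raises, unlike STATUS_MAP[status]
    return STATUS_MAP.get(int(status), "unknown")


print(termination_status_str(2))   # crashed
print(termination_status_str(-1))  # unknown
```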
2024-07-28  Make failed subframe styling error matching more flexible  [toofar]
In the Qt6.8.0-beta2 release, for some reason, the error message now looks like:

    Failed to style frame: Failed to read a named property '_qutebrowser' from 'Window': Blocked a frame with origin "http://localhost:35549" from accessing a cross-origin frame.

It seems to have an extra "Failed to read a named property '_qutebrowser' from 'Window'" before the "Blocked a frame ..." bit. Seems like maybe a nested exception situation? Not sure what's going on there, but the exception is still being caught, which is the point of the test.

Hopefully we don't have more issues with subframes cropping up...
2024-07-28  Update pakjoy and chromium versions for Qt6.8  [toofar]
Looks like the kde-unstable arch repo has updated again. It says 6.8.0beta2-1. I guess the number might change again in the future, still a couple of months to go before release.
2024-07-27  update docs and changelog for URL match patterns link  [toofar]
2024-07-27  Merge pull request #8268 from greenfoo/update_match_patterns_link  [toofar]
Update link to chrome match patterns documentation
2024-07-21  Update link to match patterns documentation  [Fernando Ramos]
2024-07-15  pakjoy: Fix Qt 5 tests  [Florian Bruhin]
2024-07-15  scripts: Adjust PyQt[56]-sip package names  [Florian Bruhin]
2024-07-15  Update dependencies  [qutebrowser bot]
2024-07-14  pakjoy: Test behavior when explicitly enabled  [Florian Bruhin]