abravalheri on v63.0.0b1
packages = find_packages("src/engine"),
package_dir={"": "src/engine"},
should do that, but I'm still seeing `src/engine` in the paths when I `tar tvfz` my sdist package. What am I doing wrong?
Removing `src/engine` from the sdist doesn't really make sense. An sdist is essentially a copy of the source, prepared for distribution. It's still expected that the setup.py will be run against the expanded source distribution, meaning that if your sdist were to remove `src/engine`, the setup.py that uses `package_dir={"": "src/engine"}` would no longer be valid. Why not remove `src/engine`, `find_packages`, and `package_dir`, and just have `./MyPackage` in your source?
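A quick way to see the suggested flat layout in action (a sketch; the `MyPackage` name comes from the message above, everything else is illustrative):

```python
# Sketch: build a flat layout in a temp dir and show that find_packages()
# discovers ./MyPackage with no package_dir remapping needed.
import os
import tempfile

from setuptools import find_packages

with tempfile.TemporaryDirectory() as root:
    pkg = os.path.join(root, "MyPackage")
    os.makedirs(pkg)
    open(os.path.join(pkg, "__init__.py"), "w").close()
    # With the package at the project root, setup(packages=find_packages())
    # works without any package_dir entry, and the sdist paths stay flat.
    print(find_packages(where=root))  # → ['MyPackage']
```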
Where are the docs for `setuptools`? I've got an existing project that uses `setup.py` that I currently don't want to upgrade to whatever the latest flavour-of-the-month config format is. But I'd like to understand what options I can pass to `setup(...)`, yet I can't find the docs anywhere.
More DDG-ing found it: https://packaging.python.org/guides/distributing-packages-using-setuptools/#setup-args
Why isn't this documented on the setuptools project anywhere?
If you declare both `req==3` and `req==4` (two different, conflicting versions), they're both considered requirements of the package, and it's probably up to the installer (`pip` or other) to determine the meaning of that declaration. So the way it's written there is perhaps the best thing you have right now: a list of requirements, all of which are mutually exclusive (only one active in a given environment).
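The point about the metadata model can be illustrated with the `packaging` library (an illustrative sketch; `req` is just the placeholder name from above):

```python
# Both declarations parse as perfectly valid requirements; nothing in the
# metadata itself marks them as mutually exclusive -- resolving the
# conflict is left entirely to the installer.
from packaging.requirements import Requirement

r3 = Requirement("req==3")
r4 = Requirement("req==4")

print(r3.name == r4.name)          # True: same distribution name
print(r3.specifier.contains("3"))  # True
print(r4.specifier.contains("3"))  # False: no single version satisfies both
```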
Is there a way to express this in `setup.cfg`? I know about `extras_require`, but ideally I want a way to tell pip that one and only one of the frameworks needs to be installed. (Sorry if it's the wrong place to ask; I posted on r/learnpython but no one has responded yet.)
You could have users install `yourlib[charm]` or `yourlib[ray]`. If you'll only ever have the two dependencies, you could nominate one as primary and declare it as a regular dependency, declare the other as the extra, and, if the extra is present, defer to it.
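In `setup.cfg` terms, that "primary + extra" pattern might look something like this (purely a sketch; the `yourlib`, `charm`, and `ray` names come from the discussion above):

```ini
[metadata]
name = yourlib

[options]
# charm is the nominated primary framework, always installed
install_requires =
    charm

[options.extras_require]
# opt in with: pip install "yourlib[ray]"
# the library then detects ray at runtime and defers to it
ray =
    ray
```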
`provides` in its `PKGBUILD`s... well, it's not critical, and I'm glad I wasn't missing something obvious!
What about supporting `pyproject.toml` with a `[tool.setuptools.setup_cfg_data]` map and/or a multi-line string value of it (with the promise that the multi-line string will certainly be compatible, and the map/nested map will approach compatibility)?
Howdy! I'm far from an expert on packaging and ELF binaries, but I wanted to get people's thoughts on issues facing libraries using protobuf - e.g. grpc/grpc#24897, protocolbuffers/protobuf#8291. The core of the issue appears to be that Protobuf's Python library statically links in libprotobuf (https://github.com/protocolbuffers/protobuf/blob/master/python/setup.py#L188-L190). In turn, when another library like tensorflow or gRPC also links in libprotobuf, we get symbol conflicts and pain. To my knowledge this is pretty similar to the numeric stack's situation (e.g. numpy & scipy both use and ship libopenblas), which doesn't have these issues - but importantly, instead of relying on static linking, they ship `.dylib` or `.so` files patched up with `auditwheel` or `delocate`.
What would your suggestions be here as folks that are more expert in packaging? Is the numpy way the blessed way?
Thanks so much!
Regarding `setuptools.setup_cfg_data`: I'm not sure of the context, but I think you're asking about supporting `pyproject.toml` as another format to solicit config for setuptools. I'm not opposed to supporting `pyproject.toml`, but I'd really like not to have too many ways to do a thing... and `setup.cfg` is just now reaching maturity. I'd definitely like to avoid `setup_cfg` in the name. It should be meaningful keys like `tools.setuptools.metadata` or `tools.setuptools.options` (mapping to `metadata` and `options` in setup.cfg).
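For concreteness, a purely hypothetical `pyproject.toml` along those lines might look like this (none of these tables were supported setuptools configuration at the time; the key names mirror the message above, placed under the standard `[tool]` namespace, and the values are made up):

```toml
# Hypothetical only -- not real setuptools configuration.
[tool.setuptools.metadata]
name = "mypkg"
version = "0.1.0"

[tool.setuptools.options]
packages = ["mypkg"]
python_requires = ">=3.7"
```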
Hiya, I have a setup.py that is currently using setuptools to build a C++ Python extension (which in turn is part of a conda recipe). I need to ship some C++ header files with this package, but I can't seem to get it to work. I have added the headers into my setup like this:
from pathlib import Path
from setuptools import setup
setup(..., headers=[str(fn) for fn in Path("include").glob("**/*") if fn.is_file()])
However, when I install the package using `python setup.py install`, it creates an egg file without installing these headers.
I've read on various forums how to install a package that includes headers, but nothing that I have tried works. For example, I've tried adding `recursive-include include *.hpp` to MANIFEST.in, but this makes no difference to what gets installed. I've also tried using `distutils.core.setup` instead of `setuptools.setup`, but then my pybind11 extension does not build. I've tried using `pip install .`, but then my data_files no longer get packaged and my unit tests fail.
The setuptools documentation is a bit sparse when it comes to packaging header files, and a lot of the information in online forums is very old. I can see that setuptools does have an `install_headers` command; can someone point me to an example of how this works?
Thanks!
I'm not sure what the various installers do with the `headers` directive. I'd suggest looking into the distutils/setuptools source to see what it uses that value for.
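As a small aside, the header-collection part of the setup.py above can be factored into a helper that is easy to test on its own (a sketch building on the snippet in the question; the helper name and the `.hpp` filter are illustrative):

```python
# Collect header files for the distutils/setuptools `headers` keyword.
# Sorting makes the resulting list deterministic across filesystems.
from pathlib import Path

def collect_headers(root="include", pattern="**/*.hpp"):
    """Return header paths as strings, suitable for setup(headers=...)."""
    return sorted(str(p) for p in Path(root).glob(pattern) if p.is_file())
```

With this, `setup(..., headers=collect_headers())` picks up only actual `.hpp` files instead of every file under `include/`.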
`--global-option` disables the use of wheels:

$ pip install . --global-option='-j4'
/home/tom/python/dev/lib/python3.9/site-packages/pip/_internal/commands/install.py:245: UserWarning: Disabling all use of wheels due to the use of --build-option / --global-option / --install-option.
I believe the mechanism is `--global-option`, but I'm not confident in the implementation.