cfbolz changed the topic of #pypy to: PyPy, the flexible snake (IRC logs: https://botbot.me/freenode/pypy/ ) | use cffi for calling C | "the modern world where network packets and compiler optimizations are effectively hostile"
<njs>
fijal: the idea is instead of rewriting stuff from python to C, rewrite/recompile it using cython, for faster startup etc.
<njs>
fijal: so I don't think cffi is really the alternative
<njs>
fijal: it's somewhat interesting in the context of potentially trying to migrate numpy etc. to cython
<fijal>
oh
<fijal>
so he does not like python, sorry
<njs>
well, cpython rewrites stuff in C all the time
<fijal>
I suppose that's nothing new
<dash>
nobody loves python more than pypy devs
<fijal>
Yeah maybe making it easier for them is a bad idea ;)
<fijal>
njs: note that it's not like we have a good cython story yet, but I see your point
<fijal>
and devil is in details
<njs>
fijal: oh sure
<fijal>
but this idea has been floating for a while no?
<njs>
not sure; I haven't seen it go by on python-* before, but I don't claim encyclopedic knowledge of python core development discussions
<njs>
thank gods
realitix has joined #pypy
inad922 has joined #pypy
rokujyouhitoma has joined #pypy
TheAdversary has joined #pypy
rokujyouhitoma has quit [Ping timeout: 258 seconds]
realitix has quit [Ping timeout: 268 seconds]
<fijal>
njs: I've seen this sort of stuff popping up every now and again and then being bogged down in the trenches
<kenaan>
arigo cffi/cffi f928bdbf5e1f /: Issue #300 Hopefully fix the remaining cases where a _Bool return value was not correctly converted to a Python ...
arigato has joined #pypy
asmeurer_ has quit [Quit: asmeurer_]
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 240 seconds]
realitix has joined #pypy
marky1991 has quit [Ping timeout: 276 seconds]
forgottenone has quit [Quit: Konversation terminated!]
ronan has quit [Read error: Connection reset by peer]
ronan__ is now known as ronan
ronan has quit [Ping timeout: 260 seconds]
rokujyouhitoma has joined #pypy
ronan has joined #pypy
rokujyouhitoma has quit [Ping timeout: 268 seconds]
Osca has joined #pypy
ronan has quit [Ping timeout: 260 seconds]
ronan has joined #pypy
kipras`away is now known as kipras
Taggnostr has quit [Ping timeout: 260 seconds]
Taggnostr has joined #pypy
kipras is now known as kipras`away
<Osca>
Hello. I'm currently playing with embedding pypy into a C++ application. I was going to ask if there are more examples on using "CFFI's native embedding support", but then I found the cppyy project, which looks like an even better option.
<Osca>
I have, however, some problems understanding how cppyy can be used in my case; all the examples seem to address extending python with C++ code, not communicating between the "host" C++ and python scripts. Are there some projects / examples that I could use as a guideline?
<arigato>
not sure, but I think cppyy doesn't support embedding at all
<fijal>
but you can embed using cffi and use cppyy to do actual calls right?
<arigato>
probably, yes
<arigato>
unsure if the gradual move of some parts of cppyy towards cffi makes that easier or more confusing
<arigato>
also, maybe at some point we could use genreflex on some large C headers (not actually C++ but doesn't matter) and emit a cffi build script
<Osca>
Ok, I think I understand it now. I need to check whether using cppyy + cffi will be a better option than using cffi alone. But this brings back my original question.
<Osca>
Are there any examples on using "CFFI's native embedding support" with pypy?
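For reference, CFFI's native embedding support works the same way on PyPy as on CPython: you write a build script that declares the functions exported to the host application. A minimal sketch follows; the module name `my_plugin` and the function `do_stuff` are hypothetical, not taken from the discussion.

```python
# build_plugin.py -- minimal sketch of CFFI's native embedding support.
# Running it with --build produces a shared library (my_plugin.so /
# my_plugin.dll) that a C or C++ host can link against and call directly.
import sys

import cffi

ffibuilder = cffi.FFI()

# C-level API exported to the host application (hypothetical signature).
ffibuilder.embedding_api("""
    int do_stuff(int x);
""")

# No extra C source is needed for this trivial example.
ffibuilder.set_source("my_plugin", "")

# Python code executed when the embedded interpreter initializes;
# it provides the implementation of the exported function.
ffibuilder.embedding_init_code("""
    from my_plugin import ffi

    @ffi.def_extern()
    def do_stuff(x):
        return x * 2
""")

if __name__ == "__main__" and "--build" in sys.argv:
    # Invoked as: python build_plugin.py --build
    ffibuilder.compile(target="my_plugin.*", verbose=True)
```

The host side then declares `int do_stuff(int);`, links against the produced library, and calls it as a plain C function; the embedded interpreter is started lazily on the first call.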
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 255 seconds]
<squeaky_pl>
antocuni, if you have any suggestions or questions about manylinux1 besides what I've responded in the issue feel free to ask me
<antocuni>
squeaky_pl: thanks, I was not aware of these problems for centos 5
<squeaky_pl>
antocuni, you are the 3rd person asking about manylinux1; with enough sweat it can be done
<squeaky_pl>
It seems like nobody has time to act on manylinux2, maybe I should do that instead
<antocuni>
I don't really know how manylinux* is managed; who decides when/how manylinuxX is defined?
<squeaky_pl>
So it certainly takes writing a new PEP, moving the Docker images along the way, then waiting for some acceptance and finally for pip to start accepting manylinux2 wheels
<squeaky_pl>
njs can tell you everything about it
<antocuni>
so certainly not something which can happen quickly :(
<squeaky_pl>
njs, what is better, push manylinux2 or sweat a little more and build PyPy on Centos5
<antocuni>
I suppose that for the time being, I can just build linux wheels that are not manylinux1 and hope they still work "most of the time"
<antocuni>
btw, the fact that centos 5 has reached EOL, isn't it a problem for manylinux1 as well?
<squeaky_pl>
antocuni, yeah, you can hardcode repos
<squeaky_pl>
But it will be kind of sad that you can build PyPy2 wheels and not PyPy3?
<antocuni>
yes :(
<squeaky_pl>
Is there a way we can make PyPy3 translation work on Centos5?
<antocuni>
no idea. What was the original problem?
marky1991 has joined #pypy
<squeaky_pl>
FD_CLOEXEC
<squeaky_pl>
It's this change that Python 3 made: the files you open have this set by default
<antocuni>
uh, but then how do they manage to build cpython 3 on centos 5?
<squeaky_pl>
So when you fork you don't inherit file descriptors beside stdout and stderr
<squeaky_pl>
I think there is something in the build of CPython 3 that PyPy doesn't do
<antocuni>
so in theory we could do the same as cpython 3, cross fingers and hope that it's enough?
<squeaky_pl>
It fails early because it searches the include files for FD_CLOEXEC. Maybe I could just update the kernel somehow there...
<squeaky_pl>
I mean the kernel is not used by Docker anyway, it's just the constant which is missing
<arigato>
I guess it's a matter of copying more of the #ifdefs from CPython
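For reference, the Python 3 change squeaky_pl describes is PEP 446 (file descriptors are created non-inheritable by default, which is what FD_CLOEXEC/O_CLOEXEC implement at the C level). It can be observed from Python directly:

```python
import os

# PEP 446: descriptors opened by Python 3 are non-inheritable by default,
# i.e. they are closed in child processes across exec().
fd = os.open(os.devnull, os.O_RDONLY)
print(os.get_inheritable(fd))   # False

# Inheritance must be requested explicitly.
os.set_inheritable(fd, True)
print(os.get_inheritable(fd))   # True

os.close(fd)
```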
<squeaky_pl>
If we can make PyPy3 work on Centos 5 I will be more than glad to move back to Centos 5
<squeaky_pl>
If it enables manylinux1 PyPy wheels
rmariano has joined #pypy
<arigato>
CPython definitely has a forest of #ifdefs that I did not copy in all details
<antocuni>
note that I don't know if this is *enough* to get manylinux1 wheels on pypy, but it is certainly required
<antocuni>
so it's worth trying, IMHO
<squeaky_pl>
antocuni, BTW if you grab PyPy2 5.7.1 it works in manylinux1 docker
<squeaky_pl>
it's the last version I built against Centos 5
<kenaan>
rlamy default a8c055058298 /pypy/module/cpyext/test/test_cpyext.py: kill dead code
<antocuni>
yes, but then I'm not sure that e.g. numpy and scipy compile well
mattip has joined #pypy
<squeaky_pl>
well it does on CPython...
<antocuni>
plus, I wonder what happens if we compile numpy/scipy with pypy 5.7.1 and import them on pypy 5.8
<antocuni>
our binary ABI tag is always pypy_41, but I fear it will just break :(
<squeaky_pl>
Yeah, well if PyPy wants to support manylinux there should be a policy about not breaking ABI
<antocuni>
yes, I know, we should enforce it better from now on
<antocuni>
what I mean is that it's likely that we did NOT enforce it in the past
<squeaky_pl>
So maybe a first step would be to look into PyPy3 on Centos 5
<squeaky_pl>
And then when this passes think about ABI
<squeaky_pl>
And then saying we support manylinux from the next version of PyPy?
<squeaky_pl>
Is that plan feasible?
<antocuni>
maybe
realitix has quit [Ping timeout: 255 seconds]
<antocuni>
arigato, mattip: do you have opinions about this?
<arigato>
no, sorry
* mattip
reading logs
<squeaky_pl>
I am certain manylinux is attractive and that it would help PyPy acceptance
<antocuni>
yes, me too
<mattip>
pypy cpyext IMO is not stable enough to support manylinux
<mattip>
both from an API standpoint (headers and macros vs. functions) and an ABI standpoint (broken function calls)
<squeaky_pl>
Ah ok, so I think that sinks the whole idea
<mattip>
but maybe from PyPy 5.9 things will be better?
<mattip>
certainly NumPy HEAD will not build against anything older than 5.8
<antocuni>
mattip: I'm not sure I understand what you mean; the point of manylinux1 is basically to ensure that a binary can be used under many different Linux distros
<antocuni>
in particular, it is NOT that the same wheel works against different pypy versions
<squeaky_pl>
manylinux cares about ABI, so that would be about not changing the signatures of existing exposed cpyext things and not jumping between functions and macros? Or my thinking is too simplistic
<antocuni>
in other words, you would need to build a different wheel for each different pypy ABI tag
<squeaky_pl>
I mean we can break the ABI, but we should bump the ABI number?
<antocuni>
now, currently we have the problem that we are not always bumping the ABI tag accordingly
<squeaky_pl>
Maybe a test can be written that detects ABI breakage?
<antocuni>
but this is easily solvable by uncommenting pypy/module/imp/importing.py:40
<mattip>
would we then split cffi/cpyext ABI numbering? That seems like a loss
<squeaky_pl>
well definitely you don't want to break the ABI too much, certainly not in a micro version
rokujyouhitoma has joined #pypy
<squeaky_pl>
I would say breaking the ABI and bumping it every 6 months is not a big deal
<squeaky_pl>
CPython does that every 18 months
<antocuni>
mattip: I think the current situation is worse; e.g., if you compile a package on pypy 5.7.1 you get a .so; if you try to import this .so on pypy 5.8, it doesn't work
<antocuni>
although the ABI tag is the same so it SHOULD work
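For reference, the ABI tag under discussion is the SOABI string baked into extension-module filenames, which is why a .so built for one tag should refuse to import under another. It can be inspected like this (the values in the comments are illustrative; at the time of this log PyPy reported something like pypy-41):

```python
import sysconfig

# The ABI tag itself, e.g. 'cpython-36m-x86_64-linux-gnu' on CPython
# or something like 'pypy-41' on the PyPy versions discussed here.
print(sysconfig.get_config_var("SOABI"))

# The full filename suffix extension modules are built with,
# e.g. '.cpython-36m-x86_64-linux-gnu.so' or '.pypy-41.so'.
print(sysconfig.get_config_var("EXT_SUFFIX"))
```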
jcea has joined #pypy
rokujyouhitoma has quit [Ping timeout: 240 seconds]
<arigato>
I agree with the general idea, which probably means splitting the extension for cpyext and the extension for cffi (which should *not* change as often)
<mattip>
antocuni: true
<antocuni>
arigato: this will mean that cffi modules will have a different ABI tag than cpyext modules?
<arigato>
yes
<antocuni>
I wonder if "same interpreter with two different ABIs" is supported by the existing tools
<arigato>
I'm ok if the next pypy release breaks the compatibility with cffi modules, because it kind of should: there's a corner case that changed
raynold has quit [Quit: Connection closed for inactivity]
<arigato>
ah, meh
<antocuni>
what?
<arigato>
what you said
<antocuni>
maybe dstufft or njs know better?
<arigato>
maybe we can use the "stable ABI" for cffi modules on pypy too?
<arigato>
no clue how that would work with pypy2
<antocuni>
what is the "stable ABI"?
* antocuni
googles
<mattip>
squeaky_pl: where is the original "issue" that you were responding to when you joined today?
<arigato>
note that extension modules for pypy for cffi should *almost* work without recompiling across pypy2 and pypy3
<squeaky_pl>
mattip, the follow-up on this is that I am willing to go back to Centos 5, given things can be fixed on the PyPy side, with the ultimate goal of having manylinux work on PyPy
<arigato>
almost but not quite, because of the compatibility with cpyext extension modules...
<arigato>
could easily be fixed, if that sounds like a good idea
<arigato>
I'm just unsure that existing tools will agree to consider a "stable ABI" package for pypy2
<antocuni>
arigato: it's probably a good idea, if the tools can or will be able to handle it
<squeaky_pl>
You can always fix them "directly in PyPy"
<squeaky_pl>
You already do that
ronan has quit [Quit: Ex-Chat]
<squeaky_pl>
(like the virtualenv .so fix)
<squeaky_pl>
But surely it would be better if it was working outside.
<antocuni>
yes, I'd keep "fix it in pypy" as a last resort strategy
marky1991 has quit [Ping timeout: 240 seconds]
<squeaky_pl>
OK so the missing piece of puzzle is an input from those tool authors
<mattip>
sorry if I am off on a tangent, tell me to go away, but when I look at https://pypi.python.org/pypi/numpy I see ABI cp27, cp34, cp35 ...
<mattip>
if we start changing our ABI every two months that will quickly grow to an unmanageable number of release packages, no?
<mattip>
IMO, better to declare "we are not stable yet, please do not upload cpyext packages to PyPI"
<antocuni>
mattip: this is not only about uploading things to pypi
<squeaky_pl>
mattip, does it really happen that often that you change existing function signatures in cpyext? or do you just add new ones and fix the old ones without breaking the ABI
<antocuni>
e.g., people could want to build their own wheel to avoid recompiling everything every time they create a virtualenv
<squeaky_pl>
or maybe you juggle the structure layouts a lot, I don't know for certain
<nanonyme>
mattip, not just that, the author might also refuse to build PyPy packages if it doubles the build matrix
<mattip>
antocuni: and they would want to cache those wheels while supporting different pypy versions in the same dev environment?
<mattip>
antocuni: since AFAICT the wheels are cached when you do pip install ..., no?
<antocuni>
mattip: that's true. However I saw real world use cases in which this is not enough; e.g. because you are doing pip install from within docker, which means that the cache is erased every time you rebuild the image
<antocuni>
for such use cases, an external pypi index works well
<squeaky_pl>
antocuni, mount the pip cache outside Docker
<antocuni>
yes, I know, but that's not the point imho
<antocuni>
there ARE legitimate use cases of using a private pypi index with binary wheels
<antocuni>
it would be nice if pypy can support that
<mattip>
antocuni: agreed. But that should be a local index managed by a group who all use the same PyPy version
<squeaky_pl>
Yes, I did use a private package index with binary wheels before manylinux was a thing
<mattip>
antocuni: or at least can manage the versioning issues without needing a new multilinux API name
<squeaky_pl>
I am with mattip on that, if we cannot ship to PyPI it's not worth it
<squeaky_pl>
Setting up a private index is already enough pain for those companies to figure out things themselves
aboudreault_ has quit [Quit: Ex-Chat]
realitix has joined #pypy
<squeaky_pl>
But definitely the ABI tag should be fixed
<mattip>
squeaky_pl: maybe we don't break ABI only add, but I don't want to commit to that just yet
<squeaky_pl>
Since you could get wheels from that private package index that don't work in the newer PyPy
<squeaky_pl>
mattip, it's totally understandable since the new incarnation of cpyext is pretty young
<antocuni>
mattip: sorry, I think we are talking about two different things at a time here. The ABI tag needs to be fixed anyway: currently, we build extensions with the pypy_41 tag but certainly they are not compatible with any pypy from 4.1 onwards
<antocuni>
then we can also talk about whether to do manylinux1 or not, but it's a different thing
<squeaky_pl>
antocuni, I agree about that part with you
<antocuni>
mattip: the easiest thing to do is simply to use the pypy version number for the tag, as we used to do. Then, when we are stable enough, we can think about fixing it again
<antocuni>
I'm not sure why we fixed it to 41, honestly
<arigato>
I think because previously it was "26" and at the time of "41" we did a major obvious update that required recompiling
<arigato>
nanonyme: does "two on cpython and zero on pypy" count as a bazillion
<squeaky_pl>
it could be rewritten to fit PyPy's goals
<squeaky_pl>
And you would use the tool itself to visualize it
<mattip>
hmm, we could add it to the buildbot steps
<nanonyme>
arigato, according to my count 7 per Python2 release for cpython, 5 per Python3 release for cpython
<nanonyme>
So, what, 14+20=34 builds?
<arigato>
ah sorry, I thought you meant dependencies
<nanonyme>
No, I meant the build matrix for CFFI itself
<nanonyme>
Granted, not my package to worry about but the count still sounds reasonably high to me and getting bigger
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 248 seconds]
squeaky_pl has quit [Quit: Leaving]
squeaky_pl has joined #pypy
Rhy0lite has joined #pypy
<mattip>
in any case, I think it makes sense to bump SOABI to 59 for the next release
<antocuni>
+1
<kenaan>
mattip default fcd9df9494da /pypy/doc/how-to-release.rst: document need to consider changing SOABI when releasing
<arigato>
mattip: -1, unless it comes with a story for the cffi release
<arigato>
sorry, I mean of course the cffi-based extension modules
fryguybob has quit [Quit: leaving]
<arigato>
but note that I'm fine if my "-1" is overridden by others
<arigato>
so really, -0.5
<antocuni>
arigato: from my POV, keeping the ABI tag the same between versions is just an optimization (because it saves you from recompilation); declaring it's the same while it's not is a wrong behavior
forgottenone has quit [Ping timeout: 255 seconds]
<antocuni>
so, I prefer to disable the optimization than to keep it and be wrong
<arigato>
yes, likely
<antocuni>
that said, having an updated ABI tag + a better cffi story is the best of both worlds
<arigato>
it creates additional burden for binary pypi cffi extensions for pypy, but indeed I'm not sure that exists so far
<antocuni>
I think they cannot even exist
<antocuni>
because the only binary linux wheels you can upload to pypi are manylinux1
<mattip>
arigato: even if this is a one-time change? You said something about "there's a corner case that changed" above
<arigato>
right, -0.4 then
yuyichao_ has quit [Ping timeout: 260 seconds]
fryguybob has joined #pypy
Remi_M has quit [Quit: See you!]
<mattip>
searching for a pypi package released with the pypy41 tag comes up empty for me, "site:pypi.org pypy41"
<mattip>
but it probably is my google-fu
<LarstiQ>
potentially there might be private users
<antocuni>
mattip: AFAIK, pypi forbids binary linux wheels, unless they are manylinux1
<antocuni>
and since it is apparently impossible to build a manylinux1 pypy wheel, I guess it's not a surprise that there are no wheels
marr has quit [Ping timeout: 240 seconds]
<mattip>
ahh, the canonical pypy name is pp, so that should be pp41 anyway
<LarstiQ>
hmm, wonder what the difference between `a = np.array(range(5000))` and `a=np.arange(5000)` is that it would behave so differently
<LarstiQ>
np.array assumption on range? Pypy bug?
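For reference, a hypothetical micro-benchmark of LarstiQ's two constructions: `np.array(range(n))` has to iterate over Python int objects one by one, while `np.arange(n)` fills the buffer in C, so a large gap is expected on any interpreter even though the results are identical.

```python
import timeit

import numpy as np

n = 5000

# Same result either way, very different construction cost.
assert np.array_equal(np.array(range(n)), np.arange(n))

t_array = timeit.timeit(lambda: np.array(range(n)), number=200)
t_arange = timeit.timeit(lambda: np.arange(n), number=200)
print("np.array(range(n)):", t_array)
print("np.arange(n):      ", t_arange)
```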
inad922 has quit [Ping timeout: 240 seconds]
kolko has quit [Read error: Connection reset by peer]
inad922 has joined #pypy
kolko has joined #pypy
vkirilichev has quit [Remote host closed the connection]
inad922 has quit [Ping timeout: 240 seconds]
<kenaan>
arigo cffi/cffi 2f3c1c595e96 /doc/source/embedding.rst: Mention the embedding problem on Debian that was discussed in issue #264.
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 268 seconds]
mvantellingen has quit [Ping timeout: 240 seconds]
mvantellingen has joined #pypy
blachance has quit [Remote host closed the connection]
vkirilichev has joined #pypy
<mjacob>
rmariano: i already implemented aclose()/close() (but one case is missing - marked as a TODO - where i didn't come up with a test case right in that moment)
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 276 seconds]
<rmariano>
mjacob: I just noticed, I'll take a look at it now :)
<kenaan>
rlamy cpyext-leakchecking e5f5f5b6191c /pypy/module/cpyext/test/test_cpyext.py: Try to enable leakchecker in cpyext tests; pre-create some PyObjects for long-lived objects
<pjenvey>
uhm was the point of this old_style_finalizer to be temporary while we migrated to finalizerqueues?
* pjenvey
not sure why it's still around
ronan has quit [Quit: Ex-Chat]
<nimaje>
I currently don't remember the quote right, but something like "stuff that is supposed to be temporary will stay forever"
<dash>
nimaje: "nothing so permanent as a temporary fix"
ronan has joined #pypy
<ronan>
pjenvey: what do you mean by old_style_finalizer?
rokujyouhitoma has joined #pypy
marky1991 has quit [Read error: Connection reset by peer]
rokujyouhitoma has quit [Ping timeout: 260 seconds]
<nimaje>
dash: yes, thanks. thats what I meant
rmariano has joined #pypy
<pjenvey>
ronan: sorry, left that as an obscure comment mostly for arigo, looking at issue #2590
<mjacob>
ronan: the longer you work on multiphase initialization, the more i realize that claiming in one commit message to "Implement main part of PEP 489" was an overstatement
<mjacob>
ronan: i still hope that my work works well as a basis
<ronan>
mjacob: I haven't really been doing anything related to PEP 489
<mjacob>
ronan: ah, i only saw that you occasionally commit to the multiphase branch
<mjacob>
BTW, what's the plan with py3.5? how much of the milestone 3 (this is the last, right) is done?
<ronan>
I'm mostly dealing with random issues that come from trying to get _testmultiphase to work
<ronan>
mjacob: there are 4 milestones, for milestone 3 we've decided to just try to fix all the things
asmeurer__ has quit [Quit: asmeurer__]
<mjacob>
ah, milestone 4 is performance
<ronan>
yes
<mjacob>
fixing all the tests and corner cases is a pain and i can understand that the progress is a bit slower at the moment
asmeurer__ has joined #pypy
<mjacob>
from now on, i'll have a bit more time for contributing and i didn't really decide yet whether i'll focus on py3.5 or py3.6 (the latter is more motivating of course...)
<pjenvey>
mjacob: hurray!
<mjacob>
pjenvey: if you also intensify your efforts again, we will have a Python 3.7-compatible release before CPython ;)