cfbolz changed the topic of #pypy to: PyPy, the flexible snake (IRC logs: https://botbot.me/freenode/pypy/ ) | use cffi for calling C | "the modern world where network packets and compiler optimizations are effectively hostile"
TheAdversary has quit [Ping timeout: 240 seconds]
marr has quit [Ping timeout: 260 seconds]
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 255 seconds]
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 248 seconds]
jwhisnant has quit [Ping timeout: 248 seconds]
asmeurer_ has joined #pypy
wleslie has quit [Quit: ~~~ Crash in JIT!]
jwhisnant has joined #pypy
tbodt has joined #pypy
_whitelogger has joined #pypy
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 240 seconds]
ArneBab has joined #pypy
ArneBab_ has quit [Ping timeout: 255 seconds]
tbodt has quit [Read error: Connection reset by peer]
tbodt has joined #pypy
jcea has quit [Remote host closed the connection]
tbodt has quit [Read error: Connection reset by peer]
tbodt has joined #pypy
rokujyouhitoma has joined #pypy
pilne has quit [Quit: Quitting!]
yuyichao_ has joined #pypy
rokujyouhitoma has quit [Ping timeout: 248 seconds]
rokujyouhitoma has joined #pypy
Taggnostr has joined #pypy
Taggnostr2 has quit [Ping timeout: 240 seconds]
rokujyouhitoma has quit [Ping timeout: 248 seconds]
tbodt has quit [Quit: My Mac has gone to sleep. ZZZzzz…]
<njs> fijal: heh, so nick coghlan just suggested that cpython start using cython for parts of the stdlib
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 255 seconds]
oberstet has joined #pypy
<fijal> njs: he does not like cffi?
<njs> fijal: the idea is instead of rewriting stuff from python to C, rewrite/recompile it using cython, for faster startup etc.
<njs> fijal: so I don't think cffi is really the alternative
<njs> fijal: it's somewhat interesting in the context of potentially trying to migrate numpy etc. to cython
<fijal> oh
<fijal> so he does not like python, sorry
<njs> well, cpython rewrites stuff in C all the time
<fijal> I suppose that's nothing new
<dash> nobody loves python more than pypy devs
<fijal> Yeah maybe making it easier for them is a bad idea ;)
<fijal> njs: note that it's not like we have a good cython story yet, but I see your point
<fijal> and devil is in details
<njs> fijal: oh sure
<fijal> but this idea has been floating for a while no?
<njs> not sure; I haven't seen it go by on python-* before, but I don't claim encyclopedic knowledge of python core development discussions
<njs> thank gods
realitix has joined #pypy
inad922 has joined #pypy
rokujyouhitoma has joined #pypy
TheAdversary has joined #pypy
rokujyouhitoma has quit [Ping timeout: 258 seconds]
realitix has quit [Ping timeout: 268 seconds]
<fijal> njs: I've seen this sort of stuff popping up every now and again and then being bogged down in the trenches
<kenaan> arigo cffi/cffi f928bdbf5e1f /: Issue #300 Hopefully fix the remaining cases where a _Bool return value was not correctly converted to a Python ...
arigato has joined #pypy
asmeurer_ has quit [Quit: asmeurer_]
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 240 seconds]
realitix has joined #pypy
marky1991 has quit [Ping timeout: 276 seconds]
forgottenone has quit [Quit: Konversation terminated!]
forgottenone has joined #pypy
cstratak has joined #pypy
vkirilichev has joined #pypy
<kenaan> arigo default 2d76a6b9d1c2 /: import cffi/f928bdbf5e1f
antocuni has joined #pypy
marr has joined #pypy
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 255 seconds]
Taggnostr2 has joined #pypy
Taggnostr has quit [Ping timeout: 240 seconds]
inad922 has quit [Ping timeout: 260 seconds]
cstratak has quit [Ping timeout: 260 seconds]
cstratak has joined #pypy
Taggnostr has joined #pypy
Taggnostr2 has quit [Ping timeout: 260 seconds]
magniff has joined #pypy
<magniff> good day, guys
<arigato> hi
ronan has joined #pypy
ronan has quit [Client Quit]
ronan has joined #pypy
inad922 has joined #pypy
ronan__ has joined #pypy
ronan has quit [Read error: Connection reset by peer]
ronan__ is now known as ronan
ronan has quit [Ping timeout: 260 seconds]
rokujyouhitoma has joined #pypy
ronan has joined #pypy
rokujyouhitoma has quit [Ping timeout: 268 seconds]
Osca has joined #pypy
ronan has quit [Ping timeout: 260 seconds]
ronan has joined #pypy
kipras`away is now known as kipras
Taggnostr has quit [Ping timeout: 260 seconds]
Taggnostr has joined #pypy
kipras is now known as kipras`away
<Osca> Hello. I'm currently playing with embedding pypy into a c++ application. I was going to ask if there are more examples on using "CFFI's native embedding support", but then I found the cppyy project, which looks like an even better option.
<Osca> I have however some problems understanding how cppyy can be used in my case; all examples seem to address extending python with c++ code, not communicating between the "host" c++ and python scripts. Are there some projects / examples that I could use as a guideline?
<arigato> not sure, but I think cppyy doesn't support embedding at all
<fijal> but you can embed using cffi and use cppyy to do actual calls right?
<arigato> probably, yes
<arigato> unsure if the gradual move of some parts of cppyy towards cffi makes that easier or more confusing
<arigato> also, maybe at some point we could use genreflex on some large C headers (not actually C++ but doesn't matter) and emit a cffi build script
<Osca> Ok, I think I understand it now. I need to check whether using cppyy + cffi will be a better option than using cffi alone. But this brings back my original question.
<Osca> Are there any examples on using "CFFI's native embedding support" with pypy?
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 255 seconds]
catalinif has quit [Quit: Page closed]
rmariano has quit [Ping timeout: 260 seconds]
Taggnostr2 has joined #pypy
Taggnostr has quit [Ping timeout: 240 seconds]
<arigato> it works the same way with pypy or cpython
<arigato> there might be some path issues, but basically it picks up whichever python interpreter you ran the _build.py file with
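For context, a build script for CFFI's native embedding mode looks roughly like the example in the CFFI embedding documentation; the module name `my_plugin` and the function `do_stuff()` below are illustrative, not from this discussion. As arigato says, the interpreter that runs the script is the one that gets embedded in the resulting shared library.

```python
# Sketch of a CFFI embedding build script, closely following the shape of
# the example in the CFFI embedding docs. "my_plugin" and do_stuff() are
# hypothetical names. Run it with pypy to embed pypy, or with cpython to
# embed cpython; compile() then emits libmy_plugin.so (or a .dll).
import importlib.util

# C-level API exported by the embedded library:
EMBEDDING_API = """
    int do_stuff(int, int);
"""

# Python code executed inside the embedded interpreter at startup;
# it attaches Python implementations to the exported C functions:
INIT_CODE = """
from my_plugin import ffi

@ffi.def_extern()
def do_stuff(x, y):
    return x + y
"""

def build():
    # Only call this where cffi and a C compiler are available.
    from cffi import FFI
    ffibuilder = FFI()
    ffibuilder.embedding_api(EMBEDDING_API)
    ffibuilder.set_source("my_plugin", "")
    ffibuilder.embedding_init_code(INIT_CODE)
    ffibuilder.compile(verbose=True)  # produces the embeddable library

# cffi availability check, so merely importing this sketch is harmless:
HAVE_CFFI = importlib.util.find_spec("cffi") is not None
```

The C++ "host" application would then `dlopen()` (or link against) `libmy_plugin.so` and call `do_stuff()` like any other C function.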
antocuni has quit [Ping timeout: 248 seconds]
<Osca> Ok, I think I will be able to make something out of it? Thank you a lot! :)
<Osca> s/?/. ;p
<arigato> :-)
nimaje is now known as Guest32687
nimaje1 has joined #pypy
nimaje1 is now known as nimaje
Guest32687 has quit [Killed (weber.freenode.net (Nickname regained by services))]
<kenaan> rlamy multiphase bbea9a4a0c49 /pypy/module/cpyext/test/test_module.py: fix test name (was shadowing the existing 'test_basic')
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 246 seconds]
antocuni has joined #pypy
squeaky has joined #pypy
squeaky is now known as Guest22287
Guest22287 has quit [Client Quit]
squeaky_pl has joined #pypy
<squeaky_pl> antocuni, if you have any suggestions or questions about manylinux1 besides what I've responded in the issue feel free to ask me
<antocuni> squeaky_pl: thanks, I was not aware of these problems for centos 5
<squeaky_pl> antocuni, You are the 3rd person asking about manylinux1, with enough sweat it can be done
<squeaky_pl> It seems like nobody has time to act on manylinux2, maybe I should do that instead
<antocuni> I don't really know how manylinux* is managed; who decides when/how manylinuxX is defined?
<squeaky_pl> So it certainly takes writing a new PEP, moving the Docker images along the way, then waiting for some acceptance and finally for pip to start accepting manylinux2 wheels
<squeaky_pl> njs can tell you everything about it
<antocuni> so certainly not something which can happen quickly :(
<squeaky_pl> njs, what is better: push manylinux2, or sweat a little more and build PyPy on Centos 5?
<antocuni> I suppose that for the time being, I can just build linux wheels that are not manylinux1 and hope they still work "most of the time"
<antocuni> btw, the fact that centos 5 has reached EOL, isn't it a problem for manylinux1 as well?
<squeaky_pl> antocuni, yeah, you can hardcode repos
<squeaky_pl> But it would be kind of sad if you could build PyPy2 wheels and not PyPy3
<antocuni> yes :(
<squeaky_pl> Is there a way we can make PyPy3 translation work on Centos5?
<antocuni> no idea. What was the original problem?
marky1991 has joined #pypy
<squeaky_pl> FD_CLOEXEC
<squeaky_pl> It's this change that Python 3 made: the files you open have this flag set by default
<antocuni> uh, but then how do they manage to build cpython 3 on centos 5?
<squeaky_pl> So when you fork you don't inherit file descriptors beside stdout and stderr
<squeaky_pl> I think there is something in the build of CPython 3 that PyPy doesn't do
<antocuni> so in theory we could do the same as cpython 3, cross fingers and hope that it's enough?
<squeaky_pl> It fails early because it searches for the include file with FD_CLOEXEC. Maybe I could just update the kernel headers somehow there...
<squeaky_pl> I mean the kernel is not used by Docker anyway, it's just the constant which is missing
<arigato> I guess it's a matter of copying more of the #ifdefs from CPython
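The Python 3 behavior squeaky_pl is describing is PEP 446: since Python 3.4, file descriptors created by Python are non-inheritable by default, which on POSIX is implemented with the FD_CLOEXEC flag (stdin/stdout/stderr stay inheritable). A minimal demonstration using only the stdlib:

```python
# PEP 446 in action: a freshly created fd has FD_CLOEXEC set and is
# reported as non-inheritable, so it is not passed to exec'd children.
import fcntl
import os
import tempfile

fd, path = tempfile.mkstemp()
try:
    flags = fcntl.fcntl(fd, fcntl.F_GETFD)
    cloexec = bool(flags & fcntl.FD_CLOEXEC)
    inheritable = os.get_inheritable(fd)
    print("FD_CLOEXEC set:", cloexec)      # True on Python 3.4+
    print("inheritable:", inheritable)     # False on Python 3.4+
finally:
    os.close(fd)
    os.remove(path)
```

The translation failure on CentOS 5 comes from its ancient headers predating some of the constants this machinery probes for, which is what the "forest of #ifdefs" in CPython papers over.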
<squeaky_pl> If we can make PyPy3 work on Centos 5 I will be more than glad to move back to Centos 5
<squeaky_pl> If it enables manylinux1 PyPy wheels
rmariano has joined #pypy
<arigato> CPython definitely has a forest of #ifdefs that I did not copy in all details
<antocuni> note that I don't know if this is *enough* to get manylinux1 wheels on pypy, but it is certainly required
<antocuni> so it's worth trying, IMHO
<squeaky_pl> antocuni, BTW if you grab PyPy2 5.7.1 it works in manylinux1 docker
<squeaky_pl> it's the last version I built against Centos 5
<kenaan> rlamy default a8c055058298 /pypy/module/cpyext/test/test_cpyext.py: kill dead code
<antocuni> yes, but then I'm not sure that e.g. numpy and scipy compile well
mattip has joined #pypy
<squeaky_pl> well it does on CPython...
<antocuni> plus, I wonder what happens if we compile numpy/scipy with pypy 5.7.1 and import them on pypy 5.8
<antocuni> our binary ABI tag is always pypy_41, but I fear it will just break :(
<squeaky_pl> Yeah, well if PyPy wants to support manylinux there should be a policy about not breaking ABI
<antocuni> yes, I know, we should enforce it better from now on
<antocuni> what I mean is that it's likely that we did NOT enforce it in the past
<squeaky_pl> So maybe a first step would be to look into PyPy3 on Centos 5
<squeaky_pl> And then when this passes think about ABI
<squeaky_pl> And then saying we support manylinux from the next verson of PyPy?
<squeaky_pl> Is that plan feasible?
<antocuni> maybe
realitix has quit [Ping timeout: 255 seconds]
<antocuni> arigato, mattip: do you have opinions about this?
<arigato> no, sorry
* mattip reading logs
<squeaky_pl> I am certainly sure manylinux is attractive and it would help PyPy acceptance
<antocuni> yes, me too
<mattip> pypy cpyext IMO is not stable enough to support manylinux
<mattip> both from an API standpoint (headers and macros vs. functions) and an ABI one (broken function calls)
<squeaky_pl> Ah ok, so I think that sinks the whole idea
<mattip> but maybe from PyPy 5.9 things will be better?
<mattip> certainly NumPy HEAD will not build against anything older than 5.8
<antocuni> mattip: I'm not sure I understand what you mean; the point of manylinux1 is basically to ensure that a binary can be used under many different linux distros
<antocuni> in particular, it is NOT that the same wheel works against different pypy versions
<squeaky_pl> manylinux cares about ABI, so that would be about not changing the signatures of existing exposed cpyext things and not jumping between functions and macros? Or my thinking is too simplistic
<antocuni> in other words, you would need to build a different wheel for each different pypy ABI tag
<squeaky_pl> I mean we can break the ABI but we should bump the ABI number?
<antocuni> now, currently we have the problem that we are not always bumping the ABI tag accordingly
<squeaky_pl> Maybe a test can be written that detects ABI breakage?
<antocuni> but this is easily solvable by uncommenting pypy/module/imp/importing.py:40
<mattip> would we then split cffi/cpyext ABI numbering? That seems like a loss
<squeaky_pl> well definitely you don't want to break ABI too much, certainly not in minor version
<squeaky_pl> I mean micro *
rokujyouhitoma has joined #pypy
<squeaky_pl> I would say breaking ABI, and increasing it each 6 months is not a big deal
<squeaky_pl> CPython does that every 18 months
<antocuni> mattip: I think the current situation is worse; e.g., if you compile a package on pypy-5.7.1 you get a .so; if you try to import this .so on pypy 5.8, it doesn't work
<antocuni> although the ABI tag is the same so it SHOULD work
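The tags being argued about here are the PEP 425 compatibility tags baked into wheel filenames: `{name}-{version}-{python tag}-{abi tag}-{platform tag}.whl`. A small sketch with hypothetical filenames (the `pp258`/`pypy_41` values below are illustrative, mirroring the frozen "pypy_41" ABI tag under discussion):

```python
# Parse PEP 425 wheel filenames into their tag components.
# The optional build tag is ignored for simplicity.
def parse_wheel_filename(filename):
    stem = filename[: -len(".whl")]
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {"name": name, "version": version, "python": python_tag,
            "abi": abi_tag, "platform": platform_tag}

# Hypothetical wheel names for illustration:
w1 = parse_wheel_filename("numpy-1.13.1-pp258-pypy_41-linux_x86_64.whl")
w2 = parse_wheel_filename("numpy-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl")
print(w1["abi"])       # pypy_41 -- frozen, even across incompatible releases
print(w2["platform"])  # manylinux1_x86_64 -- what PyPI accepts for linux
```

antocuni's complaint is exactly that `w1`-style wheels keep saying `pypy_41` while the actual binary interface changes between releases, so pip happily installs an incompatible wheel.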
jcea has joined #pypy
rokujyouhitoma has quit [Ping timeout: 240 seconds]
<arigato> I agree with the general idea, which probably means splitting the extension for cpyext and the extension for cffi (which should *not* change as often)
<mattip> antocuni: true
<antocuni> arigato: this will mean that cffi modules will have a different ABI tag than cpyext modules?
<arigato> yes
<antocuni> I wonder if "same interpreter with two different ABIs" is supported by the existing tools
<arigato> I'm ok if the next pypy release breaks the compatibility with cffi modules, because it kind of should: there's a corner case that changed
raynold has quit [Quit: Connection closed for inactivity]
<arigato> ah, meh
<antocuni> what?
<arigato> what you said
<antocuni> maybe dstufft or njs know better?
<arigato> maybe we can use the "stable ABI" for cffi modules on pypy too?
<arigato> no clue how that would work with pypy2
<antocuni> what is the "stable ABI"?
* antocuni googles
<mattip> squeaky_pl: where is the original "issue" that you were responding to when you joined today?
<arigato> note that extension modules for pypy for cffi should *almost* work without recompiling across pypy2 and pypy3
<squeaky_pl> mattip, the follow up on this is that I am willing to go back on Centos 5 given things can be fixed on PyPy side with the ultimate goal of having manylinux work on PyPy
<arigato> almost but not quite, because of the compatibility with cpyext extension modules...
<arigato> could easily be fixed, if that sounds like a good idea
<arigato> I'm just unsure that existing tools will agree to consider a "stable ABI" package for pypy2
<antocuni> arigato: it's probably a good idea, if the tools can or will be able to handle it
<squeaky_pl> So that existing tools would be pip and setuptools?
<antocuni> yes, and the wheel package I suppose
<kenaan> rlamy default a22659423f20 /pypy/module/cpyext/test/: cleanup cleanup code
<squeaky_pl> You can always fix them "directly in PyPy"
<squeaky_pl> You already do that
ronan has quit [Quit: Ex-Chat]
<squeaky_pl> (like the virtualenv .so fix)
<squeaky_pl> But surely it would be better if it was working outside.
<antocuni> yes, I'd keep "fix it in pypy" as a last resort strategy
marky1991 has quit [Ping timeout: 240 seconds]
<squeaky_pl> OK so the missing piece of puzzle is an input from those tool authors
<mattip> sorry if I am off on a tangent, tell me to go away, but when I look at https://pypi.python.org/pypi/numpy I see ABI cp27, cp34, cp35 ...
<mattip> if we start changing our ABI every two months that will quickly grow to an unmanageable number of release packages, no?
<mattip> IMO, better to declare "we are not stable yet, please do not upload cpyext packages to PyPI"
<antocuni> mattip: this is not only about uploading things to pypi
<squeaky_pl> mattip, does it really happen that often that you change existing function signatures in cpyext? or do you just add new ones and fix the old ones without breaking ABI
<antocuni> e.g., people could want to build their own wheel to avoid recompiling everything every time they create a virtualenv
<squeaky_pl> or maybe you juggle a lot with the structure layout, I don't know for certain
<nanonyme> mattip, not just that, the author might also refuse to build PyPy packages if it doubles the build matrix
<mattip> antocuni: and they would want to cache those wheels while supporting different pypy versions in the same dev environment?
<mattip> antocuni: since AFAICT the wheels are cached when you do pip install ..., no?
<antocuni> mattip: that's true. However I saw real world use cases in which this is not enough; e.g. because you are doing pip install from within docker, which means that the cache is erased every time you rebuild the image
<antocuni> for such use cases, an external pypi index works well
<squeaky_pl> antocuni, mount the pip cache outside Docker
<antocuni> yes, I know, but that's not the point imho
<antocuni> there ARE legitimate use cases of using a private pypi index with binary wheels
<antocuni> it would be nice if pypy can support that
<mattip> antocuni: agreed. But that should be a local index managed by a group who all use the same PyPy version
<squeaky_pl> Yes, I did use a private package index with binary wheels before manylinux was a thing
<mattip> antocuni: or at least can manage the versioning issues without needing a new manylinux API name
<squeaky_pl> I am with mattip on that, if we cannot ship to PyPI it's not worth it
<squeaky_pl> Setting up a private index is already enough pain for those companies to figure out things themselves
aboudreault_ has quit [Quit: Ex-Chat]
realitix has joined #pypy
<squeaky_pl> But definitely the ABI tag should be fixed
<mattip> squeaky_pl: maybe we don't break ABI only add, but I don't want to commit to that just yet
<squeaky_pl> Since you could get wheels from that private package index that don't work in the newer PyPy
<squeaky_pl> mattip, it's totally understandable since the new incarnation of cpyext is pretty young
<antocuni> mattip: sorry, I think we are talking about two different things at a time here. The ABI tag needs to be fixed anyway: currently, we build extensions with the pypy_41 tag but certainly they are not compatible with any pypy from 4.1 onwards
<antocuni> then we can also talk about whether to do manylinux1 or not, but it's a different thing
<squeaky_pl> antocuni, I agree about that part with you
<antocuni> mattip: the easiest thing to do is simply to use the pypy version number for the tag, as we used to do. Then, when we are stable enough, we can think about pinning it again
<antocuni> I'm not sure why we fixed it to 41, honestly
<arigato> I think because previously it was "26", and at the time of "41" we did a major obvious update that required recompiling
<antocuni> ok
<mattip> ... and it is not part of the 'howto release page' http://doc.pypy.org/en/latest/how-to-release.html
<mattip> so I never did it :(
<squeaky_pl> Instead of guessing if the ABI was broken can this be automated?
<mattip> nanonyme: unfortunately doubling the build matrix for projects that use cython/CAPI would seem inevitable
<antocuni> I'll create an issue about the ABI tag, so at least we don't forget
<nanonyme> mattip, I'd say that's a very strong argument for not using cython/CAPI in any projects that aren't part of CPython
<nanonyme> But bleh
<mattip> nanonyme: I'm convinced :)
<nanonyme> mattip, though it's too bad CFFI itself needs bazillion packages
<mattip> squeaky_pl: api-laboratory-pro looks cool, I wonder how they would deal with generated headers?
<squeaky_pl> this is the generator of JSON
<arigato> nanonyme: does "two on cpython and zero on pypy" count as a bazillion
<squeaky_pl> it could be rewritten to fit PyPys goals
<squeaky_pl> And you would use the tool itself to visualize it
<mattip> hmm, we could add it to the buildbot steps
<nanonyme> arigato, according to my count 7 per Python2 release for cpython, 5 per Python3 release for cpython
<nanonyme> So, what, 14+20=34 builds?
<arigato> ah sorry, I thought you meant dependencies
<nanonyme> No, I meant the build matrix for CFFI itself
<nanonyme> Granted, not my package to worry about but the count still sounds reasonably high to me and getting bigger
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 248 seconds]
squeaky_pl has quit [Quit: Leaving]
squeaky_pl has joined #pypy
Rhy0lite has joined #pypy
<mattip> in any case, I think it makes sense to bump SOABI to 59 for the next release
<antocuni> +1
<kenaan> mattip default fcd9df9494da /pypy/doc/how-to-release.rst: document need to consider changing SOABI when releasing
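The SOABI value mattip proposes bumping is a standard interpreter config variable: it is baked into the filename of every compiled extension module, which is exactly why changing it invalidates (and thereby protects against) stale `.so` files. It can be inspected on any interpreter:

```python
# Inspect the SOABI tag and the extension-module filename suffixes it
# produces. On the PyPy discussed here the tag contains "41"; on a
# CPython it looks like "cpython-36m-x86_64-linux-gnu". Bumping SOABI
# means old compiled extensions simply stop being found by the importer.
import sysconfig
from importlib.machinery import EXTENSION_SUFFIXES

soabi = sysconfig.get_config_var("SOABI")
print("SOABI:", soabi)
print("extension suffixes:", EXTENSION_SUFFIXES)
```

So "bump SOABI to 59" is a one-line change with immediate effect: every cpyext extension built against an older release must be recompiled rather than silently imported.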
<arigato> mattip: -1, unless it comes with a story for the cffi release
<arigato> sorry, I mean of course the cffi-based extension modules
fryguybob has quit [Quit: leaving]
<arigato> but note that I'm fine if my "-1" is overridden by others
<arigato> so really, -0.5
<antocuni> arigato: from my POV, keeping the ABI tag the same between versions is just an optimization (because it saves you from recompilation); declaring it's the same while it's not is a wrong behavior
forgottenone has quit [Ping timeout: 255 seconds]
<antocuni> so, I prefer to disable the optimization than to keep it and be wrong
<arigato> yes, likely
<antocuni> that said, having an updated ABI tag + a better cffi story is the best of both worlds
<arigato> it creates additional burden for binary pypi cffi extensions for pypy, but indeed I'm not sure that exists so far
<antocuni> I think they cannot even exist
<antocuni> because the only binary linux wheels you can upload to pypi are manylinux1
<mattip> arigato: even if this is a one-time change? You said something about "there's a corner case that changed" above
<arigato> right, -0.4 then
yuyichao_ has quit [Ping timeout: 260 seconds]
fryguybob has joined #pypy
Remi_M has quit [Quit: See you!]
<mattip> searching for a pypi package released with the pypy41 tag comes up empty for me, "site:pypi.org pypy41"
<mattip> but it probably is my google-fu
<LarstiQ> potentially there might be private users
<antocuni> mattip: AFAIK, pypi forbids binary linux wheels, unless they are manylinux1
<antocuni> and since it is apparently impossible to build a manylinux1 pypy wheel, I guess it's not a surprise that there are no wheels
marr has quit [Ping timeout: 240 seconds]
<mattip> ahh, the canonical pypy name is pp, so that should be pp41 anyway
<mattip> ... which still comes up empty, as antocuni said, for good reason
forgottenone has joined #pypy
marky1991 has joined #pypy
<mattip> so I guess I am +1, maybe we should ask pypy-dev
<mattip> 5.8 was released Jun 9, so hopefully this will all be settled by end of Aug for 5.9
rokujyouhitoma has joined #pypy
<antocuni> of course they are not manylinux1 wheels, so they probably only work on recent versions of ubuntu or similar enough distros
<antocuni> but it's still nice to be able to install numpy in a few seconds
<antocuni> (scipy is building right now)
<tos9> I haven't followed this discussion but I've got quite a few pypy wheels in our devpi
<tos9> Should I be reading the discussion :P?
<LarstiQ> tos9: evidence of my private user claim \o/
<antocuni> tos9: how do you manage multiple versions of pypy? Do you build a wheel for each of those?
<tos9> antocuni: Yes
rokujyouhitoma has quit [Ping timeout: 276 seconds]
<tos9> (Though I try to minimize how many versions we use as well.)
<antocuni> and I suppose that each time we release a new version of pypy, you rebuild the wheels?
<tos9> antocuni: Yep.
<tos9> I wish that was automated, but it's not unfortunately, I just upload them by hand.
<antocuni> so I think that nothing will change for you
<tos9> Cool.
yuyichao_ has joined #pypy
<antocuni> one of the things we discussed is the possibility of building manylinux1 wheels, but it seems hard for now
<tos9> That would certainly be nice.
magniff has quit [Ping timeout: 260 seconds]
oberstet has quit [Ping timeout: 268 seconds]
<mattip> antocuni: have you tried using other packages on top of that numpy?
<antocuni> not yet
<antocuni> do you foresee a problem?
<mattip> no, just fishing for feedback
<mattip> pandas still has some test failures, just wondering if anyone is using it yet
<mattip> and there is a weird performance issue where arange(100000) is good but array(range(100000)) eats memory
<antocuni> mattip: I tried to use pandas but got bitten by the "refcheck=True" issue
<mattip> ahh, should be fixed on a nightly, no?
<mattip> pandas HEAD, numpy HEAD
<antocuni> ah, didn't know. I tried "pypy -m pip install pandas", so probably I didn't get the fix
Ryanar has joined #pypy
<mattip> ok, you probably know this but just in case, here is my progress https://bitbucket.org/pypy/pypy/wiki/cpyext_2_-_cython%20and%20pandas
* mattip off, later
mattip has left #pypy ["bye"]
<LarstiQ> hmm, wonder what the difference between `a = np.array(range(5000))` and `a=np.arange(5000)` is that it would behave so differently
<LarstiQ> np.array assumption on range? Pypy bug?
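The difference LarstiQ is wondering about: `np.arange(n)` fills a C buffer of machine integers directly, while `np.array(range(n))` has to iterate over boxed Python ints one by one before packing them, which is far more allocation-heavy (and on PyPy also stresses cpyext and the GC). A small check, guarded since numpy may not be installed here:

```python
# np.arange writes machine ints straight into a buffer; np.array(range(n))
# walks a Python iterator of boxed ints. Same result, very different cost.
import importlib.util

if importlib.util.find_spec("numpy") is not None:
    import numpy as np
    a = np.arange(5000)          # direct buffer fill
    b = np.array(range(5000))    # iterates 5000 boxed Python ints
    print(np.array_equal(a, b))  # identical values either way
```

Whether the memory blow-up mattip saw is purely that boxing cost or an actual PyPy/cpyext bug is exactly the open question in the log.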
inad922 has quit [Ping timeout: 240 seconds]
kolko has quit [Read error: Connection reset by peer]
inad922 has joined #pypy
kolko has joined #pypy
vkirilichev has quit [Remote host closed the connection]
inad922 has quit [Ping timeout: 240 seconds]
<kenaan> arigo cffi/cffi 2f3c1c595e96 /doc/source/embedding.rst: Mention the embedding problem on Debian that was discussed in issue #264.
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 268 seconds]
mvantellingen has quit [Ping timeout: 240 seconds]
mvantellingen has joined #pypy
blachance has quit [Remote host closed the connection]
vkirilichev has joined #pypy
<mjacob> rmariano: i already implemented aclose()/close() (but one case is missing - marked as a TODO - where i didn't come up with a test case right in that moment)
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 276 seconds]
<rmariano> mjacob: I just noticed, I'll take a look at it now :)
<rmariano> Maybe I can add that test case
asmeurer_ has joined #pypy
vkirilichev has quit [Remote host closed the connection]
jcea has quit [Quit: jcea]
antocuni has quit [Ping timeout: 248 seconds]
arigato has quit [Quit: Leaving]
asmeurer_ has quit [Quit: asmeurer_]
squeaky_pl has quit [Remote host closed the connection]
rokujyouhitoma has joined #pypy
jcea has joined #pypy
ronan has joined #pypy
rokujyouhitoma has quit [Ping timeout: 248 seconds]
raynold has joined #pypy
<bbot2> Started: http://buildbot.pypy.org/builders/own-linux-x86-64/builds/6085 [mjacob: force build, py3.6]
marr has joined #pypy
lritter has joined #pypy
rmariano has quit [Ping timeout: 260 seconds]
Ryanar has quit [Quit: Ryanar]
vkirilichev has joined #pypy
Ryanar has joined #pypy
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 268 seconds]
mvantellingen has quit [Ping timeout: 240 seconds]
rmariano has joined #pypy
mvantellingen has joined #pypy
Ryanar has quit [Quit: Ryanar]
dmalcolm has quit [Remote host closed the connection]
ronan has quit [Ping timeout: 248 seconds]
vkirilichev has quit [Remote host closed the connection]
cstratak has quit [Quit: Leaving]
vkirilichev has joined #pypy
<bbot2> Failure: http://buildbot.pypy.org/builders/own-linux-x86-64/builds/6085 [mjacob: force build, py3.6]
dmalcolm has joined #pypy
vkirilichev has quit [Remote host closed the connection]
Ryanar has joined #pypy
rokujyouhitoma has joined #pypy
rokujyouhitoma has quit [Ping timeout: 240 seconds]
rmariano has quit [Ping timeout: 240 seconds]
rmariano has joined #pypy
rokujyouhitoma has joined #pypy
ronan has joined #pypy
rokujyouhitoma has quit [Ping timeout: 258 seconds]
ronan_ has joined #pypy
ronan has quit [Ping timeout: 240 seconds]
ronan_ is now known as ronan
vkirilichev has joined #pypy
blachance has joined #pypy
rmariano has quit [Ping timeout: 276 seconds]
vkirilichev has quit [Remote host closed the connection]
<dash> hmm. any idea how much bitrot the emscripten backend has? tempted to try building monte for emscripten
<fijal> someone is trying to make a business out of a profiler
<nanonyme> Sure, why not
<dash> other people have, for javascript
<dash> not an unreasonable idea
<dash> good luck i guess
yuyichao_ has quit [Ping timeout: 240 seconds]
yuyichao_ has joined #pypy
<fijal> dash: I think we decided it's not worth it
<fijal> maybe prematurely
kipras`away is now known as kipras
<dash> fijal: I still regard it as a bit of a party trick, but who knows
<fijal> profiling?
<fijal> dash:
<dash> oh, you meant vmprof-as-a-business
<fijal> so what do you regard as a party trick?
<dash> emscripten backend for rpython ;)
<fijal> ah, yes :)
realitix has quit [Ping timeout: 240 seconds]
oberstet has joined #pypy
rmariano has joined #pypy
Rhy0lite has quit [Quit: Leaving]
jcea has quit [Quit: jcea]
rmariano has quit [Ping timeout: 240 seconds]
asmeurer__ has joined #pypy
<kenaan> rlamy cpyext-leakchecking e5f5f5b6191c /pypy/module/cpyext/test/test_cpyext.py: Try to enable leakchecker in cpyext tests; pre-create some PyObjects for long-lived objects
<pjenvey> uhm was the point of this old_style_finalizer to be temporary while we migrated to finalizerqueues?
* pjenvey not sure why it's still around
ronan has quit [Quit: Ex-Chat]
<nimaje> I currently don't remember the quote right, but something like "stuff that is supposed to be temporary will stay forever"
<dash> nimaje: "nothing so permanent as a temporary fix"
ronan has joined #pypy
<ronan> pjenvey: what do you mean by old_style_finalizer?
rokujyouhitoma has joined #pypy
marky1991 has quit [Read error: Connection reset by peer]
rokujyouhitoma has quit [Ping timeout: 260 seconds]
<nimaje> dash: yes, thanks. that's what I meant
rmariano has joined #pypy
<pjenvey> ronan: sorry, left that as an obscure comment mostly for arigo, looking at issue #2590
<mjacob> ronan: the longer you work on multiphase initialization, the more i realize that claiming in one commit message to "Implement main part of PEP 489" was an overstatement
<pjenvey> https://bitbucket.org/pypy/pypy/commits/6b9a7ecbd6ad (the old_style_finalizer stuff in here)
<mjacob> ronan: i still hope that my work works well as a basis
<ronan> mjacob: I haven't really been doing anything related to PEP 489
<mjacob> ronan: ah, i only saw that you occasionally commit to the multiphase branch
<mjacob> BTW, what's the plan with py3.5? how much of milestone 3 (this is the last one, right?) is done?
<ronan> I'm mostly dealing with random issues that come from trying to get _testmultiphase to work
<ronan> mjacob: there are 4 milestones, for milestone 3 we've decided to just try to fix all the things
asmeurer__ has quit [Quit: asmeurer__]
<mjacob> ah, milestone 4 is performance
<ronan> yes
<mjacob> fixing all the tests and corner cases is a pain and i can understand that the progress is a bit slower at the moment
asmeurer__ has joined #pypy
<mjacob> from now on, i'll have a bit more time for contributing and i didn't really decide yet whether i'll focus on py3.5 or py3.6 (the latter is more motivating of course...)
<pjenvey> mjacob: hurray!
<mjacob> pjenvey: if you also intensify your efforts again, we will have a Python 3.7-compatible release before CPython ;)
asmeurer__ has quit [Quit: asmeurer__]
<kenaan> mjacob py3.6 75d795b23931 /lib-python/3/sysconfig.py: Add temporary "solution" for failing 'import site'.
<pjenvey> or amaury =]
<kenaan> mjacob py3.6 c42f02b742b9 /lib-python/3/ctypes/__init__.py: Re-apply part of 46bb03e8.
asmeurer_ has joined #pypy
kolko has quit [Ping timeout: 240 seconds]
rokujyouhitoma has joined #pypy
rmariano has quit [Ping timeout: 268 seconds]
rokujyouhitoma has quit [Ping timeout: 260 seconds]
ronan has quit [Ping timeout: 260 seconds]