adamholmberg has quit [Remote host closed the connection]
adamholmberg has joined #pypy
adamholmberg has quit [Ping timeout: 272 seconds]
Kipras_ has joined #pypy
<mjacob>
xorAxAx: ah, for the duesseldorf sprint. i have an exam on the 8th, so i’m not sure whether i’ll come. but in case i go to fosdem it’ll be convenient.
<mjacob>
(as duesseldorf is on the way back)
Kipras_ has quit [Read error: Connection reset by peer]
<Curi0>
Hi, I'm having some problems with PyPy. I'm writing a crypto trading bot that requires fast calculations, so I'm using PyPy, which cuts the calculation time to roughly a quarter. But I also have to use websockets, and the websocket library I'm using (https://github.com/lfern/ccxt/tree/feature/websockets-multiple) seems to be incompatible with PyPy. The example websocket-poloniex-orderbook.py gives the error "AttributeError: 'poloniex' object has no attribute 'on'" with pypy3, while it works with regular python3.
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
dddddd has quit [Remote host closed the connection]
<vstinner>
mattip: (i added your links to my page, will be online shortly)
xcm has joined #pypy
rubdos has quit [Ping timeout: 252 seconds]
antocuni has joined #pypy
<xorAxAx>
mjacob, indeed
<energizer>
where's the implementation of functools.reduce?
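[Editor's note: no answer appears in the log. For context, `functools.reduce` lives in the C module `_functools` on CPython, while PyPy ships a pure-Python version. The sketch below is an illustrative pure-Python `reduce`, not PyPy's actual source:]

```python
# Minimal pure-Python sketch of functools.reduce (illustrative only).
_SENTINEL = object()

def reduce(function, iterable, initial=_SENTINEL):
    it = iter(iterable)
    if initial is _SENTINEL:
        try:
            value = next(it)  # first element seeds the accumulator
        except StopIteration:
            raise TypeError("reduce() of empty iterable with no initial value")
    else:
        value = initial
    for element in it:
        value = function(value, element)
    return value
```

A version like this is exactly the kind of stdlib code PyPy's JIT can trace into, unlike CPython's `_functools` C implementation.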
nimaje has quit [Quit: WeeChat 2.3]
nimaje has joined #pypy
<njs>
vstinner: the history is that I organized the meeting (the idea should really be credited to the group; everyone was enthusiastic), but it was the week before I got really sick, so no-one has taken the initiative to follow up
<vstinner>
njs: "no-one has taken the initiative to follow up" that's ok
<njs>
vstinner: just saying that's the status -- it's a significant project, someone would need to spearhead it, no-one has yet
<vstinner>
njs: would it be doable in Cython?
<njs>
I use the word "cython-like" a lot... there are some details of cython that don't work (in particular, you can do things like put C code into your .pyx file that gets passed directly to the C compiler... obviously that doesn't work if you want to compile the same code to a non-C target)
<njs>
but yeah, a slightly tweaked subset of cython or something
<fijal>
the point is it really can be a very cython-like thing
<fijal>
where you can compile nearly all cython or something (with some restrictions)
<fijal>
one of the key restrictions is that cython cannot call C API directly
<mattip>
we could already modify cython to have a python backend and call C via cffi
<fijal>
except it does not work
<fijal>
and does not produce what we want either
<fijal>
ronan thought about it for a bit
<mattip>
do you know if he looked at memoryview for interaction with cffi?
<fijal>
I don't know
<fijal>
but it's a) hard to replicate C things with just cffi (how do you pass them as arguments?) and b) sort of defeats the point
<fijal>
it's not clear to me it would be faster that way than just calling cpyext these days
<fijal>
for anything that's worth writing in cython
<fijal>
and also only ever beneficial to pypy, which is not a great start
<mattip>
mm, not necessarily only pypy. There is a lot of use of cython to interop with numpy
<mattip>
to quickly loop over numpy values
<mattip>
so hand off a buffer to a c function
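[Editor's note: fijal's "how do you pass them as arguments?" and mattip's "hand off a buffer to a c function" are both about zero-copy buffer handoff. With cffi that is `ffi.from_buffer(...)`; since cffi may not be installed everywhere, this sketch uses stdlib `ctypes` to show the same idea. The names are illustrative, not PyPy or cython internals:]

```python
import array
import ctypes

# A Python-level buffer of doubles.
buf = array.array("d", [1.0, 2.0, 3.5])

# View the same memory as a C double[3] -- no copy is made, so a C
# function receiving this array would operate on buf's storage directly.
c_arr = (ctypes.c_double * len(buf)).from_buffer(buf)

c_arr[0] = 9.0          # write through the C-level view...
assert buf[0] == 9.0    # ...and the Python side sees the change
```

The cffi equivalent is `ffi.from_buffer("double[]", buf)`, and a memoryview-based interop layer would build on the same buffer-protocol mechanism.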
<antocuni>
fijal: don't underestimate how slow cpyext is :)
<mattip>
but I should try and find out why ronan viewed it as a no go
<arigato>
...Cython to cffi transpiler: that's probably not a great idea, and it defeats the point, which is of some form of custom IR that *can* be compiled to real C
<arigato>
not a great idea because there are details that don't fit, like the GIL
<arigato>
and everybody knows that the GIL is a "detail" that is not a detail
<antocuni>
arigato: I think a better approach would be to have a "cython-like" language which emits a sufficiently high-level IR
<arigato>
yes, agreed
<antocuni>
and then you can compile this IR to C, or interpret/JIT it directly
<mattip>
+1
<mattip>
but it needs an ffi story
<arigato>
right, but that's not the whole thing, e.g. for numpy you need to make subclasses of basic python types, all the mess
<mattip>
I am already rewriting dtypes (after participating in the rewrite for numpypy)
<mattip>
and contemplating if I can do some of it in pure python
<mattip>
but numpy is so much more than just the ufuncs/memory model/dtypes
<antocuni>
we should also consider whether we want an INTERPRETER for this language, considering how awful it is to debug cython compared to python/rpython
<antocuni>
I mean, an interpreter in addition to a compiler
<arigato>
probably a good idea too, if it can be made to behave as much like the compiler as possible
<cfbolz>
Don't we kind of need an interpreter, to jit it? Or would the goal be to create jit codes?
<arigato>
I can imagine two things, a C compiler for CPython, and an interpreter-ish for everything else (debugging, and pypy execution)
<arigato>
but indeed I don't know, maybe creating jit codes would be possible
<cfbolz>
Right
<antocuni>
I suppose it depends on how much you want to be fast when non-jitted; you can do rpython-like and create C + jitcodes, or python-like and write an interpreter
<cfbolz>
Only downside could be that the warmup would potentially suck for PyPy
Zaab1t has quit [Quit: bye bye friends]
<cfbolz>
But given that right now we just stay slow with cpyext, maybe that's ok
<antocuni>
well, I still think that it's possible to make cpyext fastish; but of course nothing comparable to being able to JIT inside it
<cfbolz>
Right
speeder39_ has joined #pypy
jcea has quit [Read error: Connection reset by peer]
<mattip>
strange that they just assume 64-bit would have sse2
<arigato>
no, that's expected
<arigato>
all x86-64 CPUs have at least sse2
<mattip>
I seem to recall some discussion about not supporting cpus without sse2
<arigato>
pypy? yes, pypy doesn't support them
<mattip>
so should we add CFLAGS for -msse2 for _blake2 building, unconditionally?
<arigato>
yes, sounds safe
jcea has quit [Ping timeout: 250 seconds]
<arigato>
I'm not sure why we don't *always* provide -msse2 on x86-32 builds from distutils, but I guess the answer is "because we didn't think about that"
jacob22__ has quit [Ping timeout: 245 seconds]
jcea has joined #pypy
<mattip>
ok, added an extra_compile_args to the set_source in _blake2_build.py, but what to do for windows?
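[Editor's note: one way the platform split could look. The helper name is hypothetical; the facts it encodes are that gcc/clang spell the flag `-msse2`, 32-bit MSVC spells it `/arch:SSE2`, and 64-bit MSVC already assumes SSE2 and needs no flag:]

```python
import platform
import sys

def sse2_compile_args():
    """Extra compiler flags to guarantee SSE2 on x86 (name is hypothetical).

    gcc/clang: -msse2.  MSVC on 32-bit x86: /arch:SSE2.  64-bit MSVC
    targets already have SSE2 as their baseline, so no flag is needed.
    """
    if sys.platform == "win32":
        if platform.machine().lower().endswith("64"):
            return []
        return ["/arch:SSE2"]
    return ["-msse2"]
```

Such a list could then be handed to cffi as `ffi.set_source(..., extra_compile_args=sse2_compile_args())`.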
marky1991 has quit [Remote host closed the connection]
marky1991 has joined #pypy
marky1991 has quit [Read error: Connection reset by peer]
marky1991 has joined #pypy
themsay has joined #pypy
hugo_ has joined #pypy
hugo_ has quit [Client Quit]
themsay has quit [Ping timeout: 246 seconds]
hugo_ has joined #pypy
hugo_ has quit [Remote host closed the connection]
hugomg has joined #pypy
hugomg has quit [Client Quit]
hugomg has joined #pypy
jcea1 has joined #pypy
<hugomg>
Hello. I keep hearing that PyPy reimplements some standard library modules in pure Python instead of C, for increased performance. But I was wondering: are any of these implemented in RPython instead of Python, by any chance?
jcea has quit [Ping timeout: 250 seconds]
jcea1 is now known as jcea
<simpson>
hugomg: There are two ways to take this idea. One way, the way you've seen it, you're correct; small bits of the stdlib are written in RPython, and that probably does have an effect on speed.
adamholmberg has quit [Read error: Connection reset by peer]
<simpson>
The other way, though, is worth pointing out: PyPy can't see into C code, but it can see into Python code. So PyPy has no way to make C calls faster, but it does have that opportunity when the stdlib is in Python.
adamholmberg has joined #pypy
<hugomg>
Does RPython code count as "C code" in this explanation?
<antocuni>
hugomg: yes, all modules in pypy/module are implemented in RPython
<simpson>
No. RPython code is usually visible to the JIT. When PyPy makes syscalls, the RPython parts of the syscall are visible to the JIT and the libc parts are not.
<antocuni>
the modules in lib_pypy are at applevel (i.e., "pure python")
<hugomg>
I understand that PyPy can't optimize C stuff, but I was wondering why I haven't heard more about optimizing stuff by writing it in RPython. Could be just me being ignorant.
Rhy0lite has quit [Quit: Leaving]
<mattip>
we did rewrite much of numpy in RPython
<hugomg>
(btw, thanks for the pointers, Antonio. Looking at those as we speak)
<mattip>
but the ecosystem is just too big to keep up with
<hugomg>
ah, I've now managed to find a bit in the documentation saying that the pypy extension modules are more suited for internal stuff as opposed to third-party modules. Is this because exposing this API to 3rd-party modules would be problematic (due to fiddling with JIT internals), or just because there wasn't a big enough need for it to justify the work?
<antocuni>
hugomg: the biggest issue is that pypy doesn't support separate compilation
<antocuni>
so rpython modules need to be compiled together with the interpreter
<mattip>
an sse, a neon, and a fallback ref implementation, but we only took the sse one
<hugomg>
so, the reason I have been asking these questions is that for my PhD research (on Lua) I've been working in a similar space, but more rpython-y and less JIT-y. While I'm at it, could I also ask about your experience: what is the most important reason for rewriting stuff in Python/RPython? Is it more about reducing overhead while manipulating Python data structures, or more about providing optimization opportunities for the tracing JIT?
<antocuni>
instead of C? Definitely the fact that the JIT can trace into it
<antocuni>
see in the logs the recent discussions about having a cython-like language which produces an IR which is traceable by the JIT
* antocuni
off, bye
<hugomg>
Thanks, I'll certainly have a look there.
antocuni has quit [Ping timeout: 246 seconds]
themsay has joined #pypy
themsay has quit [Read error: Connection reset by peer]