<felix34>
Every once in a while my program needs to shut down and call `exotic.py` which has some exotic dependencies that do not work with PyPy. Is there any way to have `exotic.py` run in CPython (and a different virtualenv) while the main program runs in PyPy?
<antocuni>
felix34: generally speaking, no. You might try execnet, which helps two different processes communicate
<felix34>
antocuni: Thanks, I'll take a look. To be clear, the scenario is: 1) `main.py` finishes computation, saves to disk 2) `exotic.py` reads file from disk 3) `exotic.py` processes data thru `exotic-quantum-wormhole-package` 4) `exotic.py` saves result to disk 5) `main.py` reads result, starts new computation
<LarstiQ>
felix34: that could also be solved by two running processes that notice when files get written (say inotify) and reload
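A portable sketch of LarstiQ's idea. inotify needs a third-party binding (e.g. `pyinotify` or `watchdog`), so this falls back to plain mtime polling; the function name and arguments are hypothetical:

```python
import os
import time

def wait_for_change(path, poll_interval=0.5):
    """Block until `path` appears or its mtime changes, then return.

    main.py would call this on the result file; exotic.py would call it
    on the input file. Polling is less elegant than inotify but works
    on any platform with only the standard library.
    """
    last = os.path.getmtime(path) if os.path.exists(path) else None
    while True:
        current = os.path.getmtime(path) if os.path.exists(path) else None
        if current != last:
            return
        time.sleep(poll_interval)
```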
<simpson>
This sounds like a straightforward `subprocess` call of some sort.
<LarstiQ>
or that
<felix34>
I'm not sure about subprocess calls (docs anywhere?) but LarstiQ's idea makes sense
<felix34>
python2 only for pypy? or are the py3 docs OK? (you linked py2)
<simpson>
I'm thinking about your question as if you wanted to call not CPython, but a JVM or Perl. It's easy to assume PyPy and CPython have more overlap than they actually do.
<simpson>
You can use Python 3. You didn't specify a version, so I assumed Python 2.
<felix34>
simpson: aha, looking at the subprocess docs that makes sense. Is there any way to have the subprocess run outside the current virtualenv? (since `exotic-quantum-wormhole-package` won't install in the pypy3 virtualenv)
<simpson>
felix34: Run the `python` executable installed inside the virtualenv, so that you don't have to mess around with any environment variables.
<antocuni>
activating a virtualenv is just a convenience when using the bash prompt; to execute a script inside a virtualenv, it's enough to launch its bin/python, no activation needed
<felix34>
simpson: oh, there is a non-pypy python in there? and if I run it, then it will use the systemwide packages as opposed to the pypy ones?
<simpson>
In the other virtualenv.
<felix34>
simpson: so from PyPy I call (via subprocess) `~/cpython-virtenv/bin/python ~/exotic.py`?
<simpson>
felix34: Something like that, yeah.
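The call being discussed might look like the sketch below. The CPython virtualenv path and the script are placeholders from the conversation; to keep the sketch self-contained, `sys.executable` and a `-c` one-liner stand in for them here:

```python
import subprocess
import sys

# In the real setup this would be the CPython virtualenv's interpreter,
# e.g. os.path.expanduser("~/cpython-virtenv/bin/python"). Running that
# binary directly uses that virtualenv's packages -- no activation needed.
interpreter = sys.executable  # stand-in so the sketch runs anywhere

# Stand-in for "~/exotic.py": just prints a marker and exits.
result = subprocess.run(
    [interpreter, "-c", "print('exotic step done')"],
    check=True, capture_output=True, text=True,
)
print(result.stdout.strip())
```

`check=True` raises if exotic.py exits non-zero, so main.py does not silently read a stale result file.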
<abrown>
simpson: are you on?
<felix34>
simpson: thanks!
<simpson>
abrown: Somewhat. What's up?
<abrown>
I've changed up my interpreter so that a lot of the gc_load_* in the JIT logs are using _immutable_fields_; they show up as pure in the logs but I just want them to go away entirely... i.e. I want to tell it that the locations in tracing don't change and that it can just use constant pointers for the trace; I feel like we talked at some point about how you did this in Monte but I can't recall--were you able to get rid of all or most of your gc_load_* there?
<simpson>
Are they green?
<abrown>
the first AST node in the chain of nodes is green; the other nodes in the chain are accessed through fields in _immutable_fields_
<abrown>
but the chain could be long
<simpson>
The number of indirections shouldn't matter, I think? The trace should see that all of them are constant.
<abrown>
yeah... that's what I thought; I need a way to ask the JIT, "why don't you think these field accesses are constant?"
<simpson>
rpython.rlib.jit.assert_green()
<abrown>
hm, let me look at that
<abrown>
yeah, that looks pretty helpful, thanks! however, if the assertions fail at runtime I will still be left wondering why: maybe I'm using _immutable_fields_ wrong?
<simpson>
IIUC these are compile-time assertions.
<abrown>
I thought _immutable_fields_ could be changed up until meta-interpretation starts so I have a pass after parsing that changes some indexes on the nodes--does this somehow defeat the immutability declaration and the system just continues quietly?
<simpson>
Oh, huh, assert_green can happen at runtime. TIL.
<simpson>
Hm. Normally you can only assign those once, period. Are you using quasi-immuts?
<abrown>
no, but I guess I could change the _immutable_fields_ for those fields to be quasi-immutable--that seems more correct, I guess
<abrown>
i.e. after I set them in that second pass they never change again
<abrown>
simpson: thanks, I'll try the assert_green and quasi-immutable annotation out and see what happens; ttyl
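A sketch of the quasi-immutable declaration being discussed (class and field names are hypothetical): in RPython, a trailing `?` in `_immutable_fields_` marks a field as quasi-immutable, i.e. one the JIT may still constant-fold in traces even though it is reassigned a bounded number of times, such as once in a post-parsing pass. In plain Python the attribute is inert, so the sketch runs anywhere:

```python
class Node(object):
    # "child" may never be reassigned after __init__; "index?" is
    # quasi-immutable: it may be rewritten occasionally (here, once in
    # the post-parse pass), and each rewrite invalidates compiled traces.
    _immutable_fields_ = ["child", "index?"]

    def __init__(self, child, index):
        self.child = child
        self.index = index

root = Node(None, 0)
root.index = 7  # the one post-parse rewrite; never changes again
print(root.index)
```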
<cfbolz>
abrown: assert_green is a pretty rare API
<cfbolz>
I would look at the trace to check where the variables that you read from come from
<simpson>
Yeah, the trace should show the ultimate cause of the impurity.
<rradjabi>
Hello, I'm looking for help using cffi for a very beginner example. I have a trivial library in demo.h and demo.c that defines trivial_add(int a, int b), and I have compiled this into libdemo.so. How do I use cffi to generate a python wrapper and call this from python?
<rradjabi>
@ronan, i'm trying that now. I'm missing pyconfig.h.
<rradjabi>
Do I need a devel package of python perhaps?
<ronan>
yes, something like python-dev
<rradjabi>
Okay, i'm struggling to get those installed on my CentOS system. Let me try to get that working.
<rradjabi>
@ronan, i have python27-python-devel.x86_64 installed, but i'm still not able to find the headers
<rradjabi>
I managed to find /usr/include/python2.7/pyconfig-64.h
<rradjabi>
Okay I need to figure this out, then I suppose it'll fall into place. Thank you @ronan.
<ronan>
rradjabi: no worries, I can't really help with CentOS, but as long as you can compile any C extensions from source, you should be able to use cffi
<rradjabi>
@ronan, I have a working Python environment now.
<rradjabi>
So I have compiled my shared object with 'gcc -shared -fPIC -Wall demo.c -o libdemo.so'
<arigato>
how well do you know JavaScript? {5} evaluates to 5; {5}+1 evaluates to 1; 1+{5} is a syntax error
<rradjabi>
my _build.py file declares libraries=['demo'] and includes "demo.h", but gcc fails to find the library:
gcc -pthread -shared -L/opt/rh/rh-python36/root/usr/lib64 -Wl,-z,relro -Wl,-rpath,/opt/rh/rh-python36/root/usr/lib64 -Wl,--enable-new-dtags ./_demo_cffi.o -L/opt/rh/rh-python36/root/usr/lib64 -ldemo -lpython3.6m -o ./_demo_cffi.cpython-36m-x86_64-linux-gnu.so
/usr/bin/ld: cannot find -ldemo
collect2: error: ld returned 1 exit status
<cfbolz>
arigato: argh, what?
<cfbolz>
Why?
<ronan>
rradjabi: you need to set library_dirs as well
<rradjabi>
Ultimately, i'll need to use my compiled library instead of cffi building.
<rradjabi>
@ronan, I see, I added library_dirs=['./'] and that compiled.
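The working setup described above can be sketched as a minimal out-of-line cffi `_build.py`. The declaration string and paths come from the conversation; the build step itself needs `demo.h`, `libdemo.so`, and a C compiler present, so it is kept under the usual `__main__` guard:

```python
# _build.py -- sketch of an out-of-line cffi build script for libdemo.so
CDEF = "int trivial_add(int a, int b);"

if __name__ == "__main__":
    from cffi import FFI

    ffibuilder = FFI()
    ffibuilder.cdef(CDEF)  # declarations the generated wrapper exposes
    ffibuilder.set_source(
        "_demo_cffi",            # name of the generated extension module
        '#include "demo.h"',
        libraries=["demo"],      # link against libdemo.so ...
        library_dirs=["./"],     # ... found in the current directory at link time
    )
    ffibuilder.compile(verbose=True)
```

After running it, `from _demo_cffi import ffi, lib` and `lib.trivial_add(2, 3)` should work, provided the dynamic loader can find libdemo.so at runtime.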
<rradjabi>
@ronan, I got my example working. Thank you!
<rradjabi>
@ronan, I'm sorry to bother you, but I'm trying to reproduce my steps and I'm running into a new problem. When I try importing the library in python with "from _demo_cffi import ffi, lib" I get the error that libdemo.so cannot be found/opened
<rradjabi>
Previously I was able to run this command and call into the library with 'print(lib.trivial_add(2,3))'
<rradjabi>
<arigato>
rradjabi: when the .so is not installed on the system but is instead (e.g.) in the current directory, then it isn't found by default
<arigato>
there are several workarounds, the simplest is to run with LD_LIBRARY_PATH=.
<rradjabi>
@arigato, It was found because I added library_dirs=['./'] and it succeeded.
<arigato>
that's two different things
<rradjabi>
Really?
<arigato>
library_dirs turns into "gcc -L dir", which means it can be found during compilation
<rradjabi>
right
<arigato>
but then you need another trick for finding it at runtime
<rradjabi>
then what does python need to find the library?
<arigato>
it's not a python issue, it's standard Linux (or your platform)
<rradjabi>
I got it to work earlier, but I don't think I did any trick.
<rradjabi>
okay i need to add my working dir to LD_LIBRARY_PATH?
<arigato>
yes
<arigato>
or else play around with the gcc option '-Wl,-rpath=$ORIGIN/' if you're on Linux (it's completely platform-specific)
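The `LD_LIBRARY_PATH` workaround can also be applied programmatically when the script is launched via `subprocess`, rather than exported in the shell. Everything below is a self-contained stand-in (the `-c` one-liner replaces the real script that imports `_demo_cffi`):

```python
import os
import subprocess
import sys

# Copy the parent environment and point the dynamic loader at the
# directory holding libdemo.so ("." as in the discussion above).
env = dict(os.environ, LD_LIBRARY_PATH=".")

result = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['LD_LIBRARY_PATH'])"],
    env=env, check=True, capture_output=True, text=True,
)
print(result.stdout.strip())  # prints "."
```

The rpath alternative would instead be baked in at build time, e.g. by passing something like `extra_link_args=["-Wl,-rpath,$ORIGIN"]` to cffi's `set_source`, so no environment variable is needed; that, as noted above, is entirely platform-specific.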