ChanServ changed the topic of #picolisp to: PicoLisp language | Channel Log: https://irclog.whitequark.org/picolisp/ | Picolisp latest found at http://www.software-lab.de/down.html | check also http://www.picolisp.com for more information
cilz has quit [Ping timeout: 246 seconds]
jibanes has quit [Ping timeout: 264 seconds]
jibanes has joined #picolisp
shpx has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
shpx has joined #picolisp
libertas has quit [Ping timeout: 260 seconds]
libertas has joined #picolisp
orivej has joined #picolisp
dtornabene has joined #picolisp
aw- has joined #picolisp
rob_w has joined #picolisp
orivej has quit [Ping timeout: 240 seconds]
cilz has joined #picolisp
<cilz> good morning guys
<Regenaxer> Hi cilz!
<cilz> hello Regenaxer
<cilz> do you have some time for a morning question?
<Regenaxer> Sure
<cilz> ok
<cilz> I was trying this last night
<cilz> (apply + (uniq (append (range 3 999999 3) (range 5 999999 5))))
<cilz> and PIL crashed hard. any reason?
<Regenaxer> The list is too long for 'apply'
<Regenaxer> You must set the stack limit to "unlimited"
<cilz> how can I do that?
<Regenaxer> With ulimit
<cilz> ok, will do that now and let you know, thanks
<Regenaxer> In general it is not recommended to do such big applys
<Regenaxer> better use 'sum' or other mappings
<Regenaxer> (sum prog (range ...
<Regenaxer> or even better a 'for' loop in this case
<Regenaxer> Because building a huge list just to sum it and then throw it away is a waste
<cilz> I'll take that as an exercise
<Regenaxer> good :)
<cilz> thanks
<Regenaxer> But right, here 'uniq' and 'range' are the simplest (shortest)
<Regenaxer> So it is OK
<Regenaxer> Just 'apply' should be used with care perhaps
<Regenaxer> It uses the stack to build structures
<cilz> it was for the first task of the Euler project here https://projecteuler.net/archives
<Regenaxer> cool
<Regenaxer> *if* you go with the above, then at least take 'conc' instead of 'append', to avoid a needless copy of the first list
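Putting the advice above together, a minimal sketch of what the list-free variants could look like for this task (summing the multiples of 3 or 5 up to 999999; untested here, names illustrative):

   # Variant 1: keep the short uniq/range form, but with 'conc' and 'sum'
   (sum prog (uniq (conc (range 3 999999 3) (range 5 999999 5))))

   # Variant 2: a plain 'for' loop, no intermediate list at all
   (let Sum 0
      (for N 999999
         (when (or (=0 (% N 3)) (=0 (% N 5)))
            (inc 'Sum N) ) )
      Sum )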
<cilz> ok
<Regenaxer> BTW, the call is '$ ulimit -s unlimited', best put in .bashrc
<cilz> thanks, for the time being I did that in the terminal
<Regenaxer> :)
<cilz> I don't plan to use the stack like this every time!
<Regenaxer> yeah
<Regenaxer> A big stack is needed sometimes though
<cilz> yep
<Regenaxer> eg. for lots of coroutines or heavily recursive programs
<cilz> I just saw that in the docs
<Regenaxer> ok
<cilz> what about this (eval (cons '+ (uniq (conc (range 3 999999 3) (range 5 999999 5)))))
<cilz> a bit faster than with apply
<Regenaxer> yes, this is also ok, but only in this case. Because of the numeric arguments
<Regenaxer> Normally the results are not equivalent
<cilz> ah?
<Regenaxer> (eval (cons 'println '(car cdr))) <-> (apply println '(car cdr))
dtornabene has quit [Remote host closed the connection]
<Regenaxer> The purpose of 'apply' is to *not* evaluate the elements in the list
<cilz> I'm not sure I fully understand your last point
<Regenaxer> (eval (cons 'fun '(a b c))) evaluates the expression (fun a b c)
<Regenaxer> so 'a' etc. is evaluated before 'fun' is called
<Regenaxer> This is not wanted usually
<cilz> that's ok
<Regenaxer> The results are different:
<Regenaxer> : (eval (cons 'println '(car cdr)))
<Regenaxer> 268657 268660
<Regenaxer> : (apply println '(car cdr))
<Regenaxer> car cdr
<cilz> ok, ok
<beneroth> hi Regenaxer, cilz :)
<cess11_> Good morning.
<beneroth> Good morning cess11_
<cilz> hello
<Regenaxer> Hi beneroth!
<cess11_> I think Forth and pil will be among the best tools for this coming world.
<Regenaxer> yep
<cess11_> Because these morons will actually pump out these little stickers with PIII-like performance and all sorts of weird hardware interfaces, so scripting pil style and driver hacking Forth style will be quite useful.
<aw-> cess11_: thanks for the link
<beneroth> "The computer will cost less than ten cents to manufacture" wow. impressive
<beneroth> thx cess11_
<aw-> yeah
<aw-> awesome for pil
<Regenaxer> indeed
<aw-> totally
<beneroth> Regenaxer, " It's intended to help track the shipment of goods" sounds like our friends
<Regenaxer> exactly
<Regenaxer> supply chain
<cilz> with regard to privacy, I'm not sure I'm happy with that though: "Within the next five years, cryptographic anchors [...] will be embedded in everyday objects and devices,"
<Regenaxer> yes, evil
<cilz> ...
<cilz> yes it's a big concern
<aw-> isn't a SIM card as powerful as an x86 from the 90s?
<cess11_> Yeah, it'll be quite horrific but also something we'll need to cope with because such a large being as IBM can't stop itself.
<aw-> faraday cages for all
<cilz> hélas!
<cilz> aw-: right
<cess11_> SIMs are tricky and a bit clunky, and not as general purpose as these coming products.
<cess11_> But yes, there are some interesting aspects to putting some NFC in all of them.
<aw-> hmmm there's no details on this IBM thing
<aw-> does it actually have an x86 instruction set?
<Regenaxer> Let's hope it is arm64 ;)
<aw-> or RISC-V
<aw-> this article is so bad, it lacks any details
<beneroth> cess11_, privacy will become more and more a luxury for the rich only (so no benefits from transparency; first some age of darkness and surveillance, I would say).
<cess11_> I think it is something not x86 but comparable. They haven't told much yet and all they say is that they have prototypes.
<cess11_> beneroth: Yeah, and hackers that mess up their own identities in weird ways to avoid profiling and whatnot.
<beneroth> just wait until everyone and their dog analyses every DNA trace someone leaves behind.
<cess11_> Yep.
<aw-> beneroth: agree
<aw-> cess11_: like in Minority Report
<beneroth> aye
<cess11_> But these things will come. It will be seen as necessary by logistics companies that manage the big ports.
<beneroth> and who doesn't want personalized ad posters in the street instead of boring ads? eh?
<cess11_> Those who understand them.
<cess11_> It will be interesting to see how the big companies will react to ransomware type attacks that undermine or destroy their blockchain thingies further down the line.
<aw-> beneroth: then I can finally have a valid reason to start marketing my adblocker eyewear invention
<beneroth> cess11_, well already happened, see the NotPetya attack.
<beneroth> aw-, blinkers?
<cess11_> Yeah, but we aren't reliant on such techniques being wielded by companies yet, which we most likely will be unless the next financial crash halts this.
<beneroth> T
<aw-> beneroth: haha
<cess11_> And I'm not so sure we'll see such crashes like the Depression, Recession, Flash Crash and so on again, because it might be possible to channel those disturbances through military spending and destruction.
<beneroth> I also think that the widespread destructive attacks by hackers so far were largely just test runs.
<cess11_> Yep.
<beneroth> the IBM thingy: power: photo-voltaic cell. wow.
<cess11_> Yeah. And then they will make them water resistant and able to take power from liquids as well, so we'll fill the oceans with networking little computers attached to garbage.
<beneroth> someone has to build the Matrix.
<beneroth> btw. the "let's gamify society to achieve voluntary obedience" is going to the next step: https://www.theverge.com/2018/3/16/17130366/china-social-credit-travel-plane-train-tickets
<beneroth> let's see where this project is in 10 years, and if we start to copy it.
<cess11_> We already have. Just not in the police but in the secret services and banks and insurance companies.
<cess11_> They try it out on its muslim and black minority communities, insurance companies and firms like PwC pick out the raisins and apply them where the law allows.
<cess11_> *their
<beneroth> I would say the error is that the law allows it. the law is the angle to improve this. companies are like AIs, working towards their central optimization function (usually: making money) within the possibilities the system/game allows.
<beneroth> the introduction of computer AI will only accelerate this, not change it somehow.
<beneroth> related xkcd: https://xkcd.com/1968/
<aw-> question for Lispers
<aw-> I have this: (mapcar pack (split (chop "dest=loc") "="))
<aw-> returns ("dest" "loc")
<aw-> how do I get ("dest" . "loc") instead?
mtsd has joined #picolisp
<beneroth> is it always key=value, so always only 2 elements?
<aw-> yes
<beneroth> it's surprisingly hard to turn a list into a cons :D
<aw-> sorry, no
<beneroth> this would be a solution for your case: (let Lst (split (chop "dest=loc") "=") (cons (pack (car Lst)) (pack (cadr Lst))))
<aw-> not always key=value
<aw-> sometimes key=value=othervalue...
<beneroth> ok, then the solution doesn't work
<beneroth> separator is always "=" ?
<aw-> yes
<aw-> well.. it's split with "="
<aw-> haha
<beneroth> T
<aw-> ok never mind, i will make changes to avoid this
<aw-> thanks!
<beneroth> ?
<beneroth> why must the output be a pair and not a list?
<beneroth> your question is essentially to turn '(a b c d e) into '(a b c d . e)
<beneroth> (setq L '(a b c d e))
<beneroth> : (conc (head -2 L) (cons (car (tail 2 L)) (last L)))
<beneroth> -> (a b c d . e)
<beneroth> (let L (mapcar pack (split (chop "dest=loc") "=")) (conc (head -2 L) (cons (car (tail 2 L)) (last L))))
<beneroth> -> ("dest" . "loc")
<beneroth> (let L (mapcar pack (split (chop "dest=loc=north") "=")) (conc (head -2 L) (cons (car (tail 2 L)) (last L))))
<beneroth> -> ("dest" "loc" . "north")
<beneroth> but if "=" is missing in the string, you end up with '("dest" . "dest")
<beneroth> or '(NIL)
<cess11_> If I understand the problem, something like 'car' to grab the key from the (mapcar pack ...) generated list, and then either 'for' or more mapping on the 'cdr', would work?
<aw-> too complicated
<aw-> i simplified
<aw-> no need for all this
alexshendi has quit [Read error: Connection reset by peer]
<beneroth> good :)
<beneroth> simpler is better
libertas has quit [Ping timeout: 240 seconds]
libertas has joined #picolisp
libertas has quit [Ping timeout: 256 seconds]
<Regenaxer> ret
orivej has joined #picolisp
<m_mans> Hi all!
<Regenaxer> How about (and (mapcar pack (split (chop "dest=loc") "=")) (cons (car @) (last @] ?
<Regenaxer> Hi m_mans!
libertas has joined #picolisp
<aw-> Regenaxer: is there a way to prevent pil from converting \n to ^J ?
<Regenaxer> in the reader?
<aw-> yes?
<Regenaxer> or (use (@K @V) (and (match '(@K "=" @V) (chop "dest=loc")) (cons (pack @K) (pack @V]
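For reference, both of Regenaxer's one-liners give the dotted pair (expected REPL output, not from the log):

   : (and (mapcar pack (split (chop "dest=loc") "=")) (cons (car @) (last @)))
   -> ("dest" . "loc")
   : (use (@K @V) (and (match '(@K "=" @V) (chop "dest=loc")) (cons (pack @K) (pack @V))))
   -> ("dest" . "loc")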
<Regenaxer> The behavior of the reader is hard coded
<Regenaxer> I thought you parse it yourself?
<Regenaxer> character-wise?
<aw-> what
<aw-> wait..
<Regenaxer> Only 'read' (and 'str') handle such escapes
<aw-> damnit
<aw-> i knew something was off
<Regenaxer> :)
<Regenaxer> $ cat a
<Regenaxer> a\nb
<Regenaxer> $ pil +
<Regenaxer> : (in "a" (line))
<Regenaxer> -> ("a" "\\" "n" "b")
<aw-> why only 'read' and 'str' ?
<Regenaxer> Lisp s-expression syntax parsing
<Regenaxer> string syntax here
<Regenaxer> (char) (line) etc. don't
<aw-> why does 'str' do it?
<Regenaxer> it can "read" from a string
<Regenaxer> same machinery as 'read' internally
<Regenaxer> comments etc
<Regenaxer> 'str' always reads a list though
<aw-> :(
<Regenaxer> It is used by eg. $ pil -'foo bar'
<aw-> where can I find the list of other characters such as ^J ?
<aw-> and ^T etc
<Regenaxer> : (str "a b # c")
<Regenaxer> -> (a b)
<Regenaxer> There is no such list
<Regenaxer> ^ denotes a CTRL character
<Regenaxer> in (read)
<Regenaxer> in strings only
<Regenaxer> for internal symbols ^ is not special
<aw-> the list of ctrl characters which start with ^
<aw-> there's a list
<aw-> there must be
<Regenaxer> no
<Regenaxer> it is simply 64 subtracted
<aw-> so how does it know that \n is ^J ?
<Regenaxer> ASCII 1 ... 31
<Regenaxer> ah backslash
<Regenaxer> there are only a few
<Regenaxer> \n \r and \t I think
<aw-> and \0 ?
<Regenaxer> See testEscA_AF in @src64/io.l
<Regenaxer> It is newline, return, tab and decimal
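The 64 offset is easy to check at the REPL (illustrative session, consistent with the above):

   : (char "J")
   -> 74
   : (- (char "J") 64)
   -> 10
   : (char 10)
   -> "^J"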
<aw-> ah ok
<aw-> ok thanks! i think another bug in my JSON lib :( .. it shouldn't convert \n to ^J .. it should be \\n
<Regenaxer> \n to ^J seems good to me (?)
<aw-> it works fine for converting JSON -> Pil.. but it can't go back, Pil -> JSON
<aw-> Pil -> JSON ^J should become \n, not a literal newline haha
<Regenaxer> I see, JSon does not take ^J
<aw-> it does but when you (prinl "^J") it makes a newline
<Regenaxer> but it is a newline, you could just keep it
<Regenaxer> yes
<aw-> no, nobody knows what ^J is
<aw-> it should go back to \n
<Regenaxer> gives a line break
<Regenaxer> so you convert a 3-liner to "abc\ndef\nghi\n" ?
<Regenaxer> I would keep the lines as they are
<aw-> i'm not sure yet..
<aw-> i need to run some tests
<Regenaxer> Depends on the use case perhaps
<aw-> you can't keep the lines.. JSON doesn't handle newlines
<Regenaxer> BTW, the \ syntax is in doc/ref.html#transient-io
<Regenaxer> The combination of a backslash followed by 'n', 'r' or 't' ...
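So the backslash escapes exist only in the reader's string syntax; once read, only the plain character (byte 10) remains (illustrative REPL session):

   : (chop "a\nb")
   -> ("a" "^J" "b")
   : (pack "a" (char 10) "b")
   -> "a^Jb"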
<Regenaxer> There cannot be a newline in a JSon string?
<aw-> it's invalid according to the spec
<aw-> only space is allowed
<Regenaxer> I see
<aw-> ok thanks, i'll look at the doc and adjust my code
<Regenaxer> For what purpose do you use JSon?
<aw-> everything
<Regenaxer> Have you looked at 'jq'? I used it for some JSON operations
<aw-> i have a bunch of stuff written in various programming languages, it's the only stable/universal format that I can trust to be interpreted correctly
<aw-> yes i've used jq on some systems.. it's just a command-line program for JSON
<Regenaxer> right
<aw-> i don't want to shell out to that each time i process JSON in picolisp
<Regenaxer> It makes life easier by filtering JSon
<Regenaxer> ok, right
<aw-> ideally everything everywhere would be in PicoLisp, then we wouldn't need JSON ;)
<aw-> also I stay away from YAML, it's an abomination
<Regenaxer> You don't have control over the other systems? Cause then I would use PLIOT on all instead
<Regenaxer> PLIO
<aw-> no i don't
<Regenaxer> ok
<aw-> some windows, some not even interpreted by me
<aw-> yes PLIO is nice
<Regenaxer> PLIO is the easiest too
<Regenaxer> There is Java, C and JS
<Regenaxer> (JS only reading iirc)
<cess11_> Sometimes what one wants in the JSON is easier to pick out with expressions like '(match '(@Key ":" @Value) ... or something similar rather than generic JSON parsing.
<cess11_> Not sure if this could apply but I commonly do this and then build objects from the interesting parts.
<cess11_> The other way is fairly straightforward, though I haven't had any newline troubles yet.
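A rough illustration of that 'match' style of picking, on a made-up JSON fragment (the key and value here are invented, just to show the shape):

   : (and (match '(@Key ":" @Value) (chop "\"last\":7250.12"))
      (cons (pack @Key) (pack @Value)) )
   -> ("\"last\"" . "7250.12")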
<Regenaxer> yes, I also prefer just to pick out things instead of full parsing
<Regenaxer> but then with 'from' and 'till'
<Regenaxer> is also the fastest by far
<cess11_> Yeah, on occasion I want to keep a subroutine and rewrite it to 'from, 'till structures instead of 'match, which I use for flexibility at the REPL.
<cess11_> s/structures/expressions
<Regenaxer> eg pick the bitcoin rate from a JSON site:
<Regenaxer> (in '("@bin/ssl" "blockchain.info" 443 "de/ticker")
<Regenaxer> (from "\"EUR\"")
<Regenaxer> (from "\"last\" : ")
<Regenaxer> (let
<Regenaxer> (Rate
<Regenaxer> (format
<Regenaxer> (pack (till ".,\n") (or (till ",\n") "."))
<Regenaxer> *Scl ) ]
<Regenaxer> No need do do the full parsing
<Regenaxer> The above fragment is out of context
<Regenaxer> better:
<Regenaxer> (in '("@bin/ssl" "blockchain.info" 443 "de/ticker")
<Regenaxer> (from "\"EUR\"")
<Regenaxer> (from "\"last\" : ")
<Regenaxer> (format
<Regenaxer> (pack (till ".,\n") (or (till ",\n") "."))
<Regenaxer> *Scl ) ]
<Regenaxer> (just for the records ;)
<cess11_> It's a good snippet.
<Regenaxer> He meant the possible escapes with '\' in pil reader
<beneroth> ah, ok. I understood him looking for the definitions of ^J ^T ^etc
<beneroth> which is basically just a hax-like mapping of the binary ascii value, as you said :)
<Regenaxer> yep
shpx has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
shpx has joined #picolisp
mtsd has quit [Quit: Leaving]
<aw-> thanks beneroth, as Regenaxer said i was looking for the ones in the pil reader
<aw-> it's fixed now, I'll push a new version of json lib shortly
shpx has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
alexshendi has joined #picolisp
<beneroth> aw-, cool. I'm using your lib, so we will upgrade :)
<aw-> Regenaxer: i decided to keep ^J ^M ^I once converted to pil
<aw-> and instead, i just convert them back to \n \r \t when generating JSON
<aw-> this way it works as expected in both places
<Regenaxer> great
<aw-> so ^J -> \n, and \n -> ^J
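A minimal sketch of that escaping step, assuming a helper roughly like this (the name is made up, it is not the actual function from the json library, and real JSON output also needs to escape backslash, double quote and the remaining control characters, e.g. as \uNNNN):

   (de json-esc (Str)
      # Replace the single characters ^J ^M ^I (newline, CR, tab)
      # by the two-character JSON escapes \n, \r and \t
      (pack (replace (chop Str) "^J" "\\n" "^M" "\\r" "^I" "\\t")) )

   : (json-esc (pack "line1" (char 10) "line2"))
   -> "line1\\nline2"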
<Regenaxer> With ^J you mean the character "^J" in pil?
<Regenaxer> ie (char 10) ?
<aw-> yes
<aw-> yes exactly
<Regenaxer> ok
<Regenaxer> For pil it is the natural (or only) representation
<Regenaxer> \n is only a read-macro kind of
<Regenaxer> What really exists is only the unicode char 10
<Regenaxer> And in pil it is always in the name of a symbol
<aw-> yes i use the unicode chars in other places
<Regenaxer> It is *all* unicode
<Regenaxer> (char 65) = "A" is also unicode
<Regenaxer> In this range (till 127) unicode and ascii are the same though
<beneroth> it's about '("^J") vs (char 10) vs '("\" "n")
<beneroth> so the JSON standard is basically saying that (char 10) should be encoded as the two characters "\" "n" within JSON (because (char 10) in JSON is reserved and/or illegal), aw- ?
<Regenaxer> What I want to say is that \n or ^J or (char 10) are always a byte with the value decimal 10 in a symbol's name. The rest is only representation for I/O
<Regenaxer> "\10\" is also legal in pil
<Regenaxer> 'print' always outputs "^J" in pil
<Regenaxer> only 'read' is so variable
<Regenaxer> (all for the records, you all probably know ;)
<aw-> ok
<aw-> beneroth: no it's not
<aw-> it's about \n vs ^J
<Regenaxer> in JSON only \n is legal as far as I understood
<Regenaxer> In pil itself there is no such thing, only a byte 10
<aw-> only \n?
<aw-> no
<aw-> i dont know what you guys are talking about
<aw-> i'm very confused
<Regenaxer> You said ^J is not understood by JSon
<aw-> i never said that
<Regenaxer> only \n
<Regenaxer> oh
<aw-> JSON is not a language
<Regenaxer> So it accepts ^J too?
<aw-> i don't know how that statement makes sense
<Regenaxer> It is a syntax
<aw-> of course it does
<aw-> it doesn't accept literal new lines within a string
<aw-> that's all
<beneroth> why not?
<beneroth> so it doesn't accept (char 10) within a string
<beneroth> right?
<aw-> ex: {"test":
<aw-> "res"} == invalid JSON
<aw-> a new line here
<beneroth> so in JSON, "^J" and "\n" is not exactly the same thing as (char 10), while in pil it (usually) is the exact same thing
<Regenaxer> I don't believe Json accepts two characters ^ and J to mean "newline character"
<aw-> err sorry
<aw-> that's allowed
<Regenaxer> ^J is only pil
<aw-> ex: {"test": "str
<aw-> str"} == invalid JSON
<aw-> a new line here
<aw-> JSON is NOT a language
<aw-> i don't know why you keep saying that
<Regenaxer> right, but it is a syntax
<Regenaxer> " delimits strings
<Regenaxer> etc
<Regenaxer> " is not part of the string data, it is syntax
<aw-> you can very well have a JSON string like: {"test":"^J"}
<aw-> that's fine
<aw-> nowhere does it say you can't do it
<Regenaxer> I think you confuse it
<aw-> and you can even have {"test":"\"^J\""}
<aw-> but you CAN NOT have: {"test":"st
<aw-> rrr"}
<Regenaxer> I think you mean "^J" when sent from pil
<Regenaxer> which is *not* two characters ^ and J
<aw-> i mean the actual ^J, not a newline
<aw-> please stop
<aw-> we're arguing about nothing
<aw-> it's fine
<aw-> it works
<aw-> this is a pointless discussion
alexshendi has quit [Read error: Connection reset by peer]
<Regenaxer> hmm, not very convincing
<Regenaxer> Where did you find that ^J is newline in JSon?
<aw-> just read the SPEC
<aw-> don't ask me
<aw-> IT'S NOT
<aw-> for the 10th time i keep telling you it's not
<Regenaxer> 15:14 <aw-> you can very well have a JSON string like: {"test":"^J"}
<aw-> yes, the characters ^ and J placed next to each other
<aw-> is 100% allowed
<aw-> i keep telling you, a NEWLINE is not allowed
<aw-> newline
<aw-> in the string
<Regenaxer> But this is a string with 2 chars, not a newline in JSon
<aw-> not the characters ^J, but an *actual* newline
<aw-> yes, right
<Regenaxer> So {"test":"\n"} is not allowed?
<aw-> yes it is
<aw-> it's in the spec
<aw-> the string can have \b \f \n
<Regenaxer> and {"test":"^J"} too? Meaning newline?
<aw-> and \r \t and \uNNNN
<Regenaxer> right
<aw-> NO, you can NOT put a control character in a JSON string
<Regenaxer> but "^J" does not mean newline in JSon
<Regenaxer> But \n *is* a control character
<Regenaxer> I think \n is like in C
<Regenaxer> but ^J not
<Regenaxer> ^J is only in pil
<Regenaxer> IIRC I invented it
<Regenaxer> well, not invented
<Regenaxer> CP/M did the same
<Regenaxer> or bash
<Regenaxer> $ stty -a
<Regenaxer> speed 38400 baud; rows 48; columns 80; line = 0;
<Regenaxer> intr = ^C; quit = ^\;
<Regenaxer> but *not* in JSon I would expect
<Regenaxer> Would be very uncommon for those curly-braces languages
<Regenaxer> C, JS, Java etc
<Regenaxer> They all don't use '^' to denote control
<aw-> we need to ensure we don't confuse control characters with two characters which look like a control character
<Regenaxer> So I would be very surprised if JSon would do
<aw-> in all my discussions, I've never once said that JSON accepts control characters
<aw-> because it doesn't
<Regenaxer> ok
<aw-> if you want to encode a newline
<aw-> as you know, in pil it's ^J
<Regenaxer> only in the reader
<aw-> then you should use \ and n
<aw-> next to each other
<Regenaxer> it is 10 in pil
<aw-> yes
<Regenaxer> So what confuses me is your expression {"test":"^J"}
<aw-> this explains it quite clearly
<aw-> sorry, when I wrote that, i really meant ^ and J next to each other
<Regenaxer> oops
<aw-> not ^J the control char that's used by picolisp for a newline
<aw-> or (char 10)
<aw-> can you see the image?
<Regenaxer> yes, but ^ and J next to each other have no meaning in JSon format
<Regenaxer> other programs
<aw-> right
<Regenaxer> Thats what confused me
<Regenaxer> why should ever appear {"test":"^J"} in JSon?
<aw-> well why would i say that ^J (the control char) is allowed after saying 20 times that newlines are not allowed?
<aw-> doesn't make sense
<Regenaxer> It is not about allowed
<Regenaxer> ^ and J are of course allowed
<Regenaxer> It is about meaning
<aw-> yes
<Regenaxer> Ah, so you want to encode Pil expressions in JSon?
<aw-> :D yes, done!
<aw-> my last hurdle was ^J ^M and ^I
<Regenaxer> I see
<Regenaxer> Then theoretically other control chars may also appear
<Regenaxer> best to encode them as \uNNNN
alexshendi has joined #picolisp
<Regenaxer> I have eg in this irc client strings like "^[[0;31m"
<aw-> that should be fine
<aw-> it'll output ^[ and "\^["
<beneroth> <Regenaxer> Ah, so you want to encode Pil expressions in JSon?
<beneroth> ah
<beneroth> Regenaxer, how to encode the literal 2 char string "^J" in pil btw? "\^J" ?
<Regenaxer> yep
<aw-> maybe
<beneroth> tested. right.
<Regenaxer> ^ must be escaped
<Regenaxer> \ also
<Regenaxer> so \\ like in C etc
<Regenaxer> or " as \"
<Regenaxer> I think these are all
<Regenaxer> " \ and ^
<aw-> seems fine
<aw-> relics from the early computing days
<Regenaxer> The ^ for control?
<cess11_> sc
cilz has quit [Ping timeout: 256 seconds]
<cess11_> That was for a MUD.
aw- has quit [Quit: Leaving.]
orivej has quit [Ping timeout: 256 seconds]
alexshendi has quit [Read error: Connection reset by peer]
alexshendi has joined #picolisp
shpx has joined #picolisp
stultulo has joined #picolisp
f8l has quit [Ping timeout: 252 seconds]
stultulo is now known as f8l
libertas has quit [Ping timeout: 276 seconds]
rob_w has quit [Read error: Connection reset by peer]
nonlinear has joined #picolisp
libertas has joined #picolisp
dtornabene has joined #picolisp
<beneroth> LOL
<beneroth> Bitcoin is legally DEAD
orivej has joined #picolisp
<beneroth> so everyone who stores a bitcoin wallet on its computer is probably a child abuser (and probably even distributor of child porn?) in most jurisdictions
<beneroth> s/its/their
<beneroth> correction: I'm not sure if a wallet does necessarily contain the chunks of the blockchain which could be interpreted as child porn.
dtornabene has quit [Remote host closed the connection]
dtornabene has joined #picolisp
dtornabene has quit [Quit: Leaving]
dtornabene has joined #picolisp
dtornabene has quit [Quit: Leaving]
dtornabene has joined #picolisp
orivej has quit [Ping timeout: 265 seconds]