<alexgordon>
when you write code, you're writing for two separate events: compile time and runtime. A dynamic language like, say, Python doesn't execute any code at "compile" time. With a statically typed language you can catch errors at compile time.
<alexgordon>
question is: how far can you push that?
<alexgordon>
jai can run code at compile-time, that's a step up from just checking correctness
<alexgordon>
but if you take it to an extreme and run your whole program at compile-time then you're back to square one :)
<brixen>
or square zero, depending on indexing
<brixen>
alexgordon: I've been thinking about the difference between "authoring" code and "running" code for a while
<brixen>
I've got some ideas to experiment with in Rubinius
<whitequark>
alexgordon: I've tried writing a statically typed dialect of Ruby where the "compile-time" phase was done using abstract interpretation of an IR
<whitequark>
the bottom line is that once you start inlining functions, it becomes essentially impossible to provide error messages that are helpful at all
<whitequark>
like, something like attr_accessor would generate a closure that does a send to self, but before runtime, the IR transformations would notice that in send("#{attr}=", ...), attr always has the same value, so they can constant-fold it, and then the send can go away, etc
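A minimal Ruby sketch of the pattern being described; `delegate_writer` and `store` are made-up names, not anything attr_accessor actually emits:

    class Example
      def self.delegate_writer(attr)
        attr_accessor attr
        # The generated closure does a dynamic send to self. Since `attr`
        # is fixed once define_method runs, "#{attr}=" is the same string
        # on every call; an IR optimizer can constant-fold it to :x= and
        # then replace the send with a direct call, erasing the dispatch.
        define_method(:store) do |value|
          send("#{attr}=", value)
        end
      end

      delegate_writer :x
    end

    e = Example.new
    e.store(42)
    e.x  # => 42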
<whitequark>
it is bad enough (but at least theoretically possible) to report errors deep inside a virtual (really, inlined) call stack, which just requires tracking a massive number of source spans
<whitequark>
but the really worst part is trying to explain why something *didn't* fold, which is I'm pretty sure just impossible
<alexgordon>
whitequark: I guess compilers (or rather, code run at compile time) exist for a variety of purposes: verifying the code is correct [type checking], optimising it, generating new code [preprocessors & templates], and build automation.
<alexgordon>
so it depends what your goal is
<whitequark>
alexgordon: none of those are really separate
<whitequark>
languages are hard because almost anything you can do has global effects and they interact with each other
<alexgordon>
they can be separate insofar as some languages don't have any of them
* Hrorek
doesn't understand a word of what is being talked about here
<alexgordon>
people are quite happy to sacrifice verification in order to use javascript
<alexgordon>
Hrorek: mainly 5am nonsense
<alexgordon>
whitequark: I've mostly come to terms with the fact that if I want _fast_ code, then C++ has every feature I could need
<whitequark>
alexgordon: this is why no one uses typescript. oh wait
<alexgordon>
what I'm more interested in these days is how to add stuff to dynamic languages
<alexgordon>
rather than make a static language that does everything
<whitequark>
you can't add stuff to dynamic languages. the stuff that's already /in/ dynamic languages prevents anything useful from being added
<alexgordon>
ok, not add stuff _to_ dynamic languages, but make a dynamic language that works in a different way
<brixen>
alexgordon: different how?
<whitequark>
some parts of language design *are* a zero-sum game. become able to express computation in more ways, become less able to express anything about that computation
<alexgordon>
if you want interoperability you do limit your options substantially
<alexgordon>
brixen: well e.g. lots of people use compilers with js (like babel or whatever), but I want more of a conversation with the compiler, rather than it just being a stage that you send code through
<brixen>
conversation about what?
<alexgordon>
building the code
<alexgordon>
like in jai, where you can dedicate parts of the file to be executed at compile-time
<brixen>
I see
<alexgordon>
but not just running code, but changing how the compiler is compiling it
<brixen>
what's the purpose of changing how the compiler is compiling it?
<brixen>
aren't you at some point just programming the compiler?
<alexgordon>
is there a difference?
<brixen>
s/programming/writing/
<alexgordon>
so yes
<brixen>
that's what I'm asking
<brixen>
why is there a difference?
<alexgordon>
there shouldn't be!
<brixen>
why is there? :)
<alexgordon>
brixen: this goes back to 2013, when I was writing a compiler in C++, which was pretty tedious, so I started writing a python script to write the compiler for me
<alexgordon>
I made a bunch of text files, then parsed the text files and generated C++
<brixen>
alexgordon: have you seen vpri.org?
<whitequark>
mmm, OMeta
<alexgordon>
anyway it was cool because it was like a parametric compiler. you could describe the language in sort-of natural language, run the spaghetti-like python script, and get a lexer, parser, and parse tree as output
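A toy of the same idea, in Ruby rather than python emitting C++; the description format and every name here are invented for illustration:

    # A "language description" in plain text: one token rule per line.
    SPEC = <<~'DESC'
      number: \d+
      ident:  [a-zA-Z_]\w*
      op:     [-+*/=]
    DESC

    # Parse the description into ordered (name, regexp) rules.
    RULES = SPEC.lines.map do |line|
      name, pattern = line.split(":", 2).map(&:strip)
      [name.to_sym, /\A(?:#{pattern})/]
    end

    # The "generated" lexer: try each rule at the head of the input.
    def lex(src)
      tokens = []
      until (src = src.lstrip).empty?
        name, match = RULES.map { |n, re| [n, src.match(re)] }.find { |_, m| m }
        raise "no rule matches #{src[0, 10].inspect}" unless match
        tokens << [name, match[0]]
        src = match.post_match
      end
      tokens
    end

    p lex("x = y1 + 42")
    # => [[:ident, "x"], [:op, "="], [:ident, "y1"], [:op, "+"], [:number, "42"]]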
<whitequark>
alexgordon: oh that actually *is* cool
<whitequark>
I know several companies which do this for gateware
<brixen>
alexgordon: you may find Piumarta et al.'s idea of "mood-specific languages" interesting
<alexgordon>
brixen: I'll check it out
<whitequark>
you declaratively specify your processor's ISA and it generates an assembler, compiler, linker, verification suite, etc
<whitequark>
the gateware too, of course
<alexgordon>
cool
<brixen>
alexgordon: fundamentally, I think what you are seeing is that "general purpose language" is the biggest lie the devil ever told
<brixen>
what you have is a language problem and what you wish for are language tools
<brixen>
I'm trying to work on that in Rubinius :)
<brixen>
also, a "pass" in the compiler doesn't need to be constraining necessarily
<brixen>
you may want to read up on nano-pass architectures for compilers
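For flavor, a nano-pass pipeline in miniature: the IR is s-expressions as nested arrays, and each pass does exactly one rewrite. The passes are invented examples, nothing from Rubinius:

    # Pass 1: fold (+ int int) into a constant, bottom-up.
    def constant_fold(node)
      return node unless node.is_a?(Array)
      node = node.map { |n| constant_fold(n) }
      if node[0] == :+ && node[1].is_a?(Integer) && node[2].is_a?(Integer)
        node[1] + node[2]
      else
        node
      end
    end

    # Pass 2: rewrite (+ x 0) to x.
    def drop_add_zero(node)
      return node unless node.is_a?(Array)
      node = node.map { |n| drop_add_zero(n) }
      node[0] == :+ && node[2] == 0 ? node[1] : node
    end

    # The compiler is just the composition of many such tiny passes.
    passes = [method(:constant_fold), method(:drop_add_zero)]
    ir = [:print, [:+, [:+, 1, 2], [:+, :x, 0]]]
    passes.each { |pass| ir = pass.call(ir) }
    p ir  # => [:print, [:+, 3, :x]]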
<alexgordon>
brixen: ah that nano-pass stuff is interesting
<alexgordon>
which is mostly what I have converged to with my compiler writing
<brixen>
alexgordon: some days I get very excited, and some days it's very bleak
<brixen>
consider this: has anyone done an economic analysis of TC39?
<brixen>
like, how much in economic terms does every feature they debate cost (i.e. the opportunity cost for the people involved yakking vs. working)?
<brixen>
but then, next to Ruby's language design process, TC39 looks like Mars missions or something
<brixen>
anyway, the point is, people continue to think that general purpose languages are the correct way to program
<brixen>
but I think they are close to the most inefficient way possible
<brixen>
they drive all sorts of up-front cost and conflict that no one seems to question
<brixen>
and the results are abysmal
<brixen>
used any software lately? :p
<alexgordon>
haha well it depends on the use entirely. my requirements for a language for working by myself, and a language for working with other people are completely different
<alexgordon>
when I work with other people I want a clear, simple, mature language with no frills
<alexgordon>
but when I work by myself I want _macros_everywhere_
<glowcoil>
whitequark: hi!
<whitequark>
hi glowcoil
<alexgordon>
typescript is probably close to ideal for the former category: some language with a type system that isn't crazy like scala's or haskell's (haskell is probably the worst possible choice for working with other people)
<whitequark>
ime other people, without any qualifiers, will hate literally any language you might choose
<brixen>
alexgordon: indeed, you want a language depending on context, but I'm guessing you don't want a different package manager, debugging interface, tool chain, etc
<brixen>
but that's what you're pretty much stuck with
<alexgordon>
whitequark: oh I don't care what they think, just that we can read each other's code :P
<whitequark>
... overall, platinum- or tin-catalyzed castings result in near-net-shape part and reproduce the texture of the mold very precisely, so surface finish is of foremost importance
<alexgordon>
LOL
<purr>
LOL
<alexgordon>
I'll send it to elliott xD
<ELLIOTTCABLE>
welp. first day of classes complete.
<ELLIOTTCABLE>
katymoe: cs lol
<ELLIOTTCABLE>
katymoe: and like, eventually math maybe? idk?
<ELLIOTTCABLE>
“4:16 PM <+whitequark> don't forget to smoothen the print in acetone or the surface will come out feeling gross”
<brixen>
ELLIOTTCABLE: so, a very minor psychological digression: I'm always fascinated by people in Ruby who are like "static types, haskell, maths yay rocks" and then "I hate nil, make it die"
* ELLIOTTCABLE
eyebrow
<ELLIOTTCABLE>
whyzat
<ELLIOTTCABLE>
also omg lol
<brixen>
it's a wonder I haven't developed physical tics from some of this stuff
<ELLIOTTCABLE>
a CS-PhD-turned-mathematician-turned-professor had a whole discussion with me about haskell and frans
<ELLIOTTCABLE>
because I showed up to a class that was canceled, and he showed up to make sure everybody knew it was canceled, and I was the only one who didn't, so I had him to myself for like thirty minutes
<ELLIOTTCABLE>
and basically he was like “LOL I saw, and hated the same thing. those academic languages are CS graduates regretting not being math graduates. THOSE SILLY WANNABES.”
<purr>
LOL
<ELLIOTTCABLE>
paraphrased, but it was really validating that a Real Mathematician agrees with me that Math Programming isn't a real, general solution
<whitequark>
brixen: traceable nils are a very cool solution
<brixen>
whitequark: thanks, I think they have some potential
<whitequark>
I'm now thinking about where else embedding source locations into values would come in useful
<brixen>
almost anywhere :)
<brixen>
object allocation traces are another
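brixen is describing a VM-level feature; a rough userland approximation of a traceable nil, just to show the shape of the idea (all names hypothetical):

    # A nil-like value that remembers where it was created, so the eventual
    # NoMethodError points at the origin of the nil rather than at the line
    # that finally touched it.
    class TracedNil < BasicObject
      def initialize
        @origin = ::Kernel.caller.first  # approximate creation site
      end

      def method_missing(name, *args)
        ::Kernel.raise ::NoMethodError,
          "undefined method `#{name}' for nil (created at #{@origin})"
      end
    end

    def find_user(id)
      TracedNil.new  # instead of returning a bare nil on the failure path
    end

    begin
      find_user(42).name
    rescue NoMethodError => e
      puts e.message  # names the find_user line where the nil was born
    end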
<whitequark>
hmyeah
<brixen>
planning to use the inflated header mechanism for that
<brixen>
in fact, the inflated header mechanism is a general mechanism for this sort of thing that I want to implement
<brixen>
also to do "generation" analysis on the heap
<whitequark>
I wonder if you could use shadow memory (of the asan kind) to trace the values that were touched at some moment
<brixen>
basically, I want to stream an abstraction of the object graph from the process continually
<whitequark>
to answer "where did this field come from"
<brixen>
whitequark: that's kind of the idea, but I'm using the network as the sink
<whitequark>
I always thought it's a bit weird that we have so many tools to instrument code now, but almost nothing to instrument data
<brixen>
since the object graph represents the call graph in Rubinius, you actually have the intersection of those two fundamental graphs
<brixen>
whitequark: totally!
<brixen>
an idea of a "JIT" for data layout and contiguity in the heap is something I've been thinking about
<alexgordon>
nil is hardly ever an issue for me and I don't get why people are so damn petrified of the things
<brixen>
Rick Hudson, the guy who recently started working on the Go GC, talked to me about that a few years ago at JSConf
<brixen>
basically said that what we know of GC is actually very basic and there are lots of improvements to make
<whitequark>
brixen: are you familiar with v8 hidden classes?
<brixen>
whitequark: the basic idea, yes
<alexgordon>
if you don't write functions that return nil... you won't have to deal with nil, in your own code at least
<ELLIOTTCABLE>
brixen: I ...
<brixen>
I have not looked at their implementation
<whitequark>
brixen: yeah, so that would be the equivalent of basic inline caches
<ELLIOTTCABLE>
So far, the only thing I unreservedly agree with is this sentence:
<ELLIOTTCABLE>
“Why is it acceptable to ignore exceptions in one area of the code, but not in another?”
<whitequark>
brixen: I want... tracing JIT for data
<whitequark>
inlining for data!
<brixen>
whitequark: me, too
<ELLIOTTCABLE>
4:42 PM <brixen> basically, I want to stream an abstraction of the object graph from the process continually
<ELLIOTTCABLE>
what does this mean
<brixen>
[pointer, pointer, pointer] => [x1,y1,x2,y2,x3,y3] sort of automatic transforms
<ELLIOTTCABLE>
brixen is like a dirty cross between whitequark and I
<ELLIOTTCABLE>
it's a little scary
<ELLIOTTCABLE>
-clouds
<purr>
ELLIOTTCABLE: is stuck up in the clouds; hilight 'em if you want 'em.
<brixen>
ELLIOTTCABLE: I thought this would be your favorite line, "Sometimes developers from that land visit Ruby and laugh." :(
<ELLIOTTCABLE>
4:42 PM <brixen> basically, I want to stream an abstraction of the object graph from the process continually
<ELLIOTTCABLE>
4:43 PM <brixen> whitequark: that's kind of the idea, but I'm using the network as the sink
<ELLIOTTCABLE>
4:43 PM <brixen> since the object graph represents the call graph in Rubinius, you actually have the intersection of those two fundamental graphs
<ELLIOTTCABLE>
ahhaha
<brixen>
ELLIOTTCABLE: basically, it means streaming a graph representation of the heap showing a "skeleton" of sorts, the nodes and edges minus the other details
<ELLIOTTCABLE>
paws. :3
<ELLIOTTCABLE>
So, I can basically tell people that Paws is ‘Rust's shared-memory-management concerns crossed with Brian's plans for Rubinius.’
<ELLIOTTCABLE>
because between the two of them, so far, you get more than, like, 70% of my plans.
<brixen>
s/pointer, pointer, pointer/point, point, point/ where point => Point.new(x,y)
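Done by hand, the transform being described looks like this (illustrative only; the point is that a VM would do it behind your back):

    Point = Struct.new(:x, :y)

    # "Array of structs": three heap objects behind three pointers.
    points = [Point.new(1, 2), Point.new(3, 4), Point.new(5, 6)]

    # "Struct of arrays" / inlined data: one flat, contiguous buffer.
    flat = points.flat_map { |pt| [pt.x, pt.y] }
    p flat  # => [1, 2, 3, 4, 5, 6]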
<brixen>
ELLIOTTCABLE: heh, fun
<whitequark>
ELLIOTTCABLE: since when are you ok with an extremely restrictive typing discipline?
<ELLIOTTCABLE>
brixen: oh the visit-ruby-and-laugh is funny too :P
<ELLIOTTCABLE>
whitequark: I'm not interested in *contributing to* statically-typed languages. Any more than I am in low-level or speed-optimized languages.
<ELLIOTTCABLE>
That's not the same thing as me thinking they're not *super relevant*
<ELLIOTTCABLE>
or s/relevant/great/
<brixen>
I want extremely restrictive type disciplines but all in their free-range, organic, hand-crafted little boxes to limit their systemic damage :)
<alexgordon>
do you even know any statically typed languages elliott?
<ELLIOTTCABLE>
Like, one of the only things I look at and think is *terrible* and *silly*, is Haskell / purity.
<ELLIOTTCABLE>
don't conflate my dismissive views on those things, with my “I'm not interested in this” reaction to strongly-typed languages sshrug
<ELLIOTTCABLE>
or to rephrase that:
<brixen>
some of the most interesting type work is coming from Wadler's interest in linear types colliding with the Haskell type system
<ELLIOTTCABLE>
I am under no illusions that I can ever ‘beat’ (or more accurately, even begin to contribute to / improve upon) Rust. :P
<brixen>
that and Idris et al
<whitequark>
ELLIOTTCABLE: well, you'll have to get interested in them to design a usable language with regions
<whitequark>
is all i'm saying
<whitequark>
regions are like, the most cross-cutting concern rust has. it influences half of the language directly and the other half through library design
<brixen>
it's not hard to improve on Rust and people are doing that quite actively
<ELLIOTTCABLE>
whitequark: regions? I don't *know* Rust very well; but every time I see it referenced, it's by far the closest thing to my concurrency work I've ever seen.
<brixen>
which is not to discount Rust, but it's not revolutionary per se
<ELLIOTTCABLE>
brixen: Like, people with compiler-design chops. That I absolutely don't have. :P
<brixen>
ELLIOTTCABLE: heh, give it a little time, you'll get it
<brixen>
once you get bitten, you can't give it up
<ELLIOTTCABLE>
but the Node/Ruby/Python/Lisp scene, the higher-level / expressive / slower / dynamic languages, particularly imperative ones, are a *clusterfuck*
<whitequark>
ELLIOTTCABLE: Rust's 'shared-memory-management concerns' are regions
<brixen>
you'll be a vampire
<brixen>
basically
<ELLIOTTCABLE>
and it seems like so few people are trying to improve them *right now*, because in the early 'teens everybody interested in PLT got distracted by type-systems :P
<ELLIOTTCABLE>
so that's a place where I have real interest, because I've *real improvements* to make, real impact to share.
<ELLIOTTCABLE>
whitequark: sshrug you'll have to tell me more, or just wait until I've had time to sit down and read a book on rust.
<whitequark>
you have linear types
<whitequark>
which is... remember our toy language yesterday?
<alexgordon>
*affine trololol
<purr>
trololol
<ELLIOTTCABLE>
whitequark: can't, *way* too much school stuff to do
<whitequark>
so like, if you have linear types, you can destruct a thing *once*.
<whitequark>
not zero times, not twice.
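Rust checks this statically; a runtime-checked Ruby approximation of the same discipline, with made-up names:

    # A value that must be consumed exactly once: consuming twice raises,
    # and dropping it unconsumed gets reported when it is finalized.
    class LinearToken
      def initialize
        @state = { consumed: false }
        state = @state  # the finalizer must not capture self, or it leaks
        ObjectSpace.define_finalizer(self, proc { |_id|
          warn "linear value dropped without being consumed" unless state[:consumed]
        })
      end

      def consume
        raise "linear value consumed twice" if @state[:consumed]
        @state[:consumed] = true
        # ...release the underlying resource here...
      end
    end

    t = LinearToken.new
    t.consume    # fine: exactly once
    # t.consume  # would raise: not twice
    # never calling consume at all is flagged at GC time: not zero times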
<ELLIOTTCABLE>
like, if this is what my first week is like, it's obvious that I'm going to be completely swamped for the entire semester ;_
<whitequark>
naturally, this makes for an unusable language
<ELLIOTTCABLE>
CoC work, and Paws in general, is going to have to go completely on hold.
<ELLIOTTCABLE>
I might even part this room, because as much as I love it, it *is* a time-suck. ;_;
<whitequark>
and regions are a huge, nasty, complex thing required for you to be able to actually write useful code with linear types.
<whitequark>
they drag in half of the ML type system, too
<ELLIOTTCABLE>
class NilClass; def method_missing(*); self; end; end lol brixen
<purr>
lol
<ELLIOTTCABLE>
I remember doing Basically This years ago in my core overrides for a bunch of projects
<whitequark>
it's a blight in objective c code
<whitequark>
stuff just randomly stops working and you have no idea why
<alexgordon>
nil?
<alexgordon>
but that is because objc doesn't have exceptions
<whitequark>
(objc includes the definition above by default)
<whitequark>
hrm
<alexgordon>
or rather, it does but they leak memory
<alexgordon>
like in python when there's a key missing you get KeyError
<alexgordon>
but in objc, you get nil
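The same split exists inside Ruby itself, for comparison:

    h = { a: 1 }
    h[:b]        # => nil (objc-style: the failure becomes a nil)
    h.fetch(:b)  # raises KeyError (python-style: the failure is an exception)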
<whitequark>
uh, yeah, sure?
<whitequark>
just don't make sends to nil a no-op
<alexgordon>
which means nil is _everywhere_ in objc
<alexgordon>
moreso than other languages
<alexgordon>
any time anything can fail you get nil, so there's just more nils in general to cause problems
<alexgordon>
whitequark: also objc somewhat redeems itself because you can't add nil to a collection; that fails right away
<whitequark>
hrm
<alexgordon>
which means you don't so often end up with a nil and no idea where it came from; it's more likely to have come from somewhere nearby
<alexgordon>
most of the problems I have with nil are to do with interface builder, where you forget to set up an outlet, call [self.window makeKeyAndOrderFront:nil] and it doesn't show up
<purr>
<Nuck> I assume I'm not alone in envisioning elliottcable munching on the head of a parakeet?
* pikajude
beats node.js to death with a giant stick