xelxebar 2 days ago

Man, I feel like APL has unlocked some latent part of my brain.

I'm a few years into seriously using APL and now work in it professionally doing greenfield development work.

Starting out, solving puzzles and stuff was fun, but trying to write real programs, I hit a huge wall. It took concerted effort, but learning to think with data-first design patterns and laser focusing on human needs broke through that barrier for me.

Writing APL that feels good and is maintainable ends up violating all kinds of cached wisdom amongst developers, so it's really hard to communicate just how brutally simple things can be and how freeing that is.
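(Not the commenter's code, just a toy flavor of what that "brutally simple," data-first style tends to look like in Dyalog APL: whole-array operations instead of loops, and a tacit train for the mean.)

```apl
      v ← 3 1 4 1 5 9 2 6
      +/(~2|v)/v        ⍝ sum of the even elements: filter with a boolean mask, then reduce
12
      mean ← +/÷≢       ⍝ a "train": sum divided by tally
      mean v
3.875
```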

  • sitkack 6 hours ago

    Ok, you gotta follow through now with the wisdom. Please write it down, we will pay for it.

  • gtani a day ago

    Interesting, how did you choose APL?

I worked in APL2 full-time years ago, on big asset-backed bond models, big as in some of the largest workspaces the IBM support people had ever seen. Never occurred to me to pick it up again, but I have been looking for the Polivka/Pakin book I learned out of (the edition prior to their APL2 edition).

    • xelxebar a day ago

      I came to APL slowly, originally motivated by some combination of fascination with the syntax and desire to break into the financial sector.

However, what got me to invest in earnest study was hitting that beginner's wall and realizing that I had no idea what Iverson was on about with his design principles.

APL is really different these days, as far as I hear. Dyalog APL is the only vendor actively working on the language now, and the old hands tell me that things like dfns, trains, and various operators make modern APL quite different from APL even just 15 years ago.

  • ralegh 2 days ago

    Could you give some examples of where you're using it?

    • xelxebar a day ago

      My YAML loader[0] is where I first broke through the wall. It's still languishing in a relatively proof-of-concept state but does exhibit the basic design principles.

There's also a Metamath verifier that does parallel proof verification on the GPU. It's unpublished right now because the whole thing is just a handful of handwritten pages in my notebook at the moment. Hoping to get it out this month, actually.

A DOOM port is also bouncing around in my notes, as a way to explore asynchronous APL.

I'm also helping Aaron Hsu with his APL compiler[1] for stuff adjacent to my professional work, which I can't comment on much, unfortunately.

Et hoc genus omne (and everything of that sort).

[0]: https://github.com/xelxebar/dayaml

[1]: https://github.com/Co-dfns/Co-dfns

  • ogogmad a day ago

    I'm thinking I'd like to learn array languages (APL, J) and maybe use them professionally. Maybe their time has come.

noosphr 2 days ago

Missing the (1970) tag, and the paper text.

3836293648 2 days ago

It's one of those broken sites where you can't even access the text. And I am signed in; it just doesn't load the PDF.

boznz 2 days ago

Can't access the text, but it "sounds" very advanced for 1970. Gemini 2.5 did not give me much about it, so I'm a little perplexed about its relevance.

  • polytely 2 days ago

You can't imagine something being relevant because the AI doesn't know about it? Seems like more a fault of the AI if you ask me. There is a huge amount of information that hasn't been, or cannot be, captured in the data LLMs are trained on.

ogogmad a day ago

How does this compare to a modern GPU?

  • bear8642 a day ago

Reading the abstract, it seems like a precursor of some kind.