Getting started with iocaine is now online.

From nothing to running iocaine + Caddy with ai.robots.txt's robots.json and a few metrics as a starting point.

Contains #Roto, #Lua, and #Fennel examples - and a few tests too, for each.

I'm currently writing the #Fennel examples, and this is delightful. I even managed to make the decide function more beautiful than it was!

(local ruleset [is-in-ai-robots-txt?
                default])

(fn decide [request]
  (accumulate [outcome nil
               _ f (ipairs ruleset)
               &until (not= outcome nil)]
    (f request)))

This is perfection.

I'm on a bit of a roll lately, and have released #iocaine version 2.4.0 just a few moments ago.

This does not bring as many significant changes as 2.3.0 did, but it does introduce #Lua and #Fennel as languages you can script its decision making with, on top of #Roto, which was introduced in 2.2.0.

While these languages run slower than Roto, they're still very fast and will not be a bottleneck - and they provide a more familiar language to write the decision making in!

Oh, and metrics can now be persisted across restarts.

And because I give funny names to my structs, the one dealing with #Fennel is called ElegantWeapons.

In other #iocaine news, I'm doing some final polishing on #Lua scripting support, to make it as convenient as #Roto.

Right now, there's a difference between how Lua and Roto scripts are loaded: with Roto, one gives a path to a directory; pkg.roto is loaded from there, and any imports are relative to that directory.

With Lua, one gives iocaine a file path, and - currently - needs to set up the package.path search path manually.
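In practice that means doing something like this before the script requires anything of its own - a rough sketch only, with the directory being just a placeholder:

-- hypothetical: prepend the script's own directory to Lua's module search path
package.path = "/path/to/lua/scripts/?.lua;" .. package.path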

So here's what I'll do: I'll make iocaine require a directory for Lua too; it will add that directory to package.path, and then require("main"). The required module will also have to return a table with at least a decide key, and an optional run_tests key. This will simplify finding the functions to run, and will greatly reduce the number of special cases.
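A minimal sketch of what such a main.lua might look like - the table shape with decide and run_tests comes from the plan above, while the helper functions and the request argument are placeholders mirroring the Fennel example earlier:

-- main.lua (sketch)

-- hypothetical helpers: each returns an outcome, or nil to fall through
local function is_in_ai_robots_txt(request)
  return nil
end

local function default(request)
  return "default"
end

local ruleset = { is_in_ai_robots_txt, default }

return {
  -- required: the decision entry point
  decide = function(request)
    for _, f in ipairs(ruleset) do
      local outcome = f(request)
      if outcome ~= nil then
        return outcome
      end
    end
  end,
  -- optional: run_tests can go here too
}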

After that, I'll try to figure out how to support #Fennel better, natively. The current idea is to have it exposed as another scripting language, built as a wrapper around the Lua engine, with some additional setup at init time (sketched after the list):

  • Load the fennel compiler from somewhere
  • Update fennel.path
  • Run fennel.install()
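Roughly, that init-time setup could look like the following - a sketch only; fennel.path and fennel.install come from Fennel's Lua API, while the paths and the main module name are assumptions borrowed from the Lua plan above:

-- assumes the fennel compiler (fennel.lua) is reachable via package.path
local fennel = require("fennel")

-- let Fennel modules be found next to the request handler's code
fennel.path = "/path/to/some/fennel/code/?.fnl;" .. fennel.path

-- register Fennel's searcher, so require() can load .fnl files too
fennel.install()

-- from here on, require("main") would compile and load main.fnl
local handler = require("main")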

I could embed the fennel compiler; it is small enough, and releases aren't frequent. So that's an option. I don't want to pull it into the sources, though, so I'd need a way to grab it during build, or something along those lines. That's a bit problematic, however.

So the next best thing I could come up with is:

[server.request-handler]
path = "/path/to/some/fennel/code"
language = "fennel"
options = { compiler = "/path/to/fennel.lua" }