Advent of F#
For Advent of F# this year I submitted an essay about parsing and lexing with the language. It really cemented my love for Leaflet and the community of folks I've found on bluesky. I'm really grateful for all the kindness I've gotten and everything I've learned, and I'm excited to continue working on and sharing my projects with everyone.
KDL
I like KDL and was first exposed to it when using Zellij last year. I mean, look at its logo! It's like those UwU-style logos you find with the ?uwu= param.
I'd expected to leave my parser library as is for a few days, but I kept working on it because it felt...incomplete. It doesn't feel incomplete anymore, but now part of me wants to get it on NuGet. There's a great KDL library on NuGet for C# called KDLSharp. I'd like mine to be an F# counterpart built around functional patterns.
Anyways, today I implemented more CLI features and a way to serialize & deserialize records.
open KDLFSharp.Core
open KDLFSharp.Core.Serialize
open KDLFSharp.Core.Deserialize

type Person =
    { Name: string
      Age: int
      Height: float
      IsActive: bool }

let person =
    { Name = "Alice"
      Age = 30
      Height = 5.6
      IsActive = true }

let repr =
    person
    |> SerializeConfig.Default
    |> Serialize.toString

match repr with
| Ok kdl -> printfn "%s" kdl
| Error e -> printfn "Error: %O" e

You can also convert to and from KDL, XML, and JSON through the CLI. The repo has a couple of examples in the data directory, pulled from the main example (austin.kdl) of park & library data in Austin.
To round out the "release candidate" I think I'm going to implement a document object model, querying, and schema validation. I hadn't really read through API footprints in the F# & greater .NET ecosystem before this (though the breadth of some of them, like Avalonia.FuncUI, is impressive).
Baseball API
Today I'm continuing to pre-aggregate leaderboards and shore up deployment.
I added a routes command that walks the codebase's AST and parses calls to HandleFunc to build a list of routes, inspired by rake routes in Rails and some deployment utilities.
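To give a sense of the idea, here's a minimal sketch in Go (the API's language, going by HandleFunc) using the standard go/ast and go/parser packages. It assumes net/http-style registrations like mux.HandleFunc("/teams", handler); the directory it parses and the output format are illustrative, not the project's actual command.

package main

import (
	"fmt"
	"go/ast"
	"go/parser"
	"go/token"
	"log"
)

func main() {
	fset := token.NewFileSet()
	// Parse the Go files in the current directory (the real command would walk the whole codebase).
	pkgs, err := parser.ParseDir(fset, ".", nil, 0)
	if err != nil {
		log.Fatal(err)
	}
	for _, pkg := range pkgs {
		for _, file := range pkg.Files {
			// Report the first argument of every call to something named HandleFunc.
			ast.Inspect(file, func(n ast.Node) bool {
				call, ok := n.(*ast.CallExpr)
				if !ok {
					return true
				}
				sel, ok := call.Fun.(*ast.SelectorExpr)
				if !ok || sel.Sel.Name != "HandleFunc" || len(call.Args) == 0 {
					return true
				}
				if lit, ok := call.Args[0].(*ast.BasicLit); ok && lit.Kind == token.STRING {
					fmt.Printf("%-30s %s\n", lit.Value, fset.Position(call.Pos()))
				}
				return true
			})
		}
	}
}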
Deployment
Basically, this involved peeking at the state of the database and invoking Docker commands. I plan to host the image on Docker Hub since the project is entirely open source. This is what the deployment output looks like! It's all behind the CLI.
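For the Docker half of that flow, a rough Go sketch could shell out to the docker binary with os/exec, something like the following; the image tag and port mapping are placeholders, not the project's real values.

package main

import (
	"fmt"
	"os"
	"os/exec"
	"strings"
)

// run executes a command and streams its output to the terminal.
func run(args []string) error {
	cmd := exec.Command(args[0], args[1:]...)
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	return cmd.Run()
}

func main() {
	// Placeholder image tag; the real image would live on Docker Hub.
	image := "example/baseball-api:latest"
	steps := [][]string{
		{"docker", "build", "-t", image, "."},
		{"docker", "push", image},
		{"docker", "run", "-d", "-p", "8080:8080", image},
	}
	for _, step := range steps {
		fmt.Println("=>", strings.Join(step, " "))
		if err := run(step); err != nil {
			fmt.Fprintln(os.Stderr, "deploy step failed:", err)
			os.Exit(1)
		}
	}
}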
Testing
The way I've handled testing for this project is not how I usually approach development. Documentation and tests are usually high priorities for me because they help me think through architecting systems, but in this project I tested primarily with curl.
To remedy this I ended up making a test chassis with testcontainers and a toy dataset of Dodgers, Mariners, and Yankees data from 2023.
Honestly, I don't know why I put this off for so long. The whole process is pretty straightforward: build CSVs by querying the database, then use them to seed the test database after running migrations. The tests themselves just validate the API responses. Auth may be a little more complicated, but it's similarly doable.
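Here's a minimal sketch of what a chassis like that can look like with testcontainers-go's generic API and Postgres; the image, database name, DATABASE_URL env var, and package layout are assumptions on my part rather than the project's real setup, and the migration/seeding step is left as a comment.

package api_test

import (
	"context"
	"fmt"
	"os"
	"testing"

	"github.com/testcontainers/testcontainers-go"
	"github.com/testcontainers/testcontainers-go/wait"
)

func TestMain(m *testing.M) {
	ctx := context.Background()

	// Spin up a throwaway Postgres container for the test run.
	container, err := testcontainers.GenericContainer(ctx, testcontainers.GenericContainerRequest{
		ContainerRequest: testcontainers.ContainerRequest{
			Image:        "postgres:16-alpine",
			Env:          map[string]string{"POSTGRES_PASSWORD": "test", "POSTGRES_DB": "baseball_test"},
			ExposedPorts: []string{"5432/tcp"},
			WaitingFor:   wait.ForLog("database system is ready to accept connections").WithOccurrence(2),
		},
		Started: true,
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}

	host, _ := container.Host(ctx)
	port, _ := container.MappedPort(ctx, "5432")
	dsn := fmt.Sprintf("postgres://postgres:test@%s:%s/baseball_test?sslmode=disable", host, port.Port())

	// Point the code under test at the container, then run migrations and
	// load the 2023 Dodgers/Mariners/Yankees CSVs before the suite runs
	// (omitted here).
	os.Setenv("DATABASE_URL", dsn)

	code := m.Run()
	_ = container.Terminate(ctx)
	os.Exit(code)
}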
Tomorrow I'll deploy and add more tests. I'll also have to think through benchmarking, since I like that the testing package provides utilities for it.
Thanks for reading! Any thoughts? Questions? Feel free to reach out here or DM/tag me on bluesky @desertthunder.dev.