alias yaml2json='python3 -c '\''import sys, yaml, json; y=yaml.safe_load(sys.stdin.read()); print(json.dumps(y))'\'''
Because we also started using it in the test harness to validate YAML documents, we made it available in that environment as well. A test creates a YAML report, which is piped into yaml2json, which in turn is piped into
jq --exit-status '[...tests here...]'.
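To make that pipeline concrete, here is a minimal sketch of it - the sample YAML document and the jq filter are made up for illustration, and the alias is inlined as a shell function so the snippet is self-contained (assumes python3 with PyYAML and jq on the PATH):

```shell
#!/bin/sh
# Same one-liner as the alias above, as a function usable in scripts.
yaml2json() { python3 -c 'import sys, yaml, json; print(json.dumps(yaml.safe_load(sys.stdin)))'; }

# A test emits a YAML report; jq --exit-status makes the pipeline's exit
# code reflect whether the filter evaluated to true (0) or false/null (1).
printf 'status: ok\ncount: 3\n' | yaml2json | jq --exit-status '.status == "ok" and .count == 3'
```

With --exit-status the whole pipeline doubles as an assertion: a failing check makes the test step itself fail.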
Fast forward a few months: more tests got added, and eventually I noticed that one of them, run against a 30k-line document, spends about 15 seconds just on the conversion. Hm.
Now, it's not as if there weren't already tons of C, C++, Go, Rust, etc. versions of that tool around - I'm aware of many of them - but unfortunately none are prepackaged for the distro used around here. Still, it nags me. I know that in Rust we have all these convenient Serde libraries; it would be trivial to write the tool using them, and people have done so. Hm. And we could make a little wrapper around jq, like yq (the Python one), which makes it easier to use. And we could also support other formats, like XML or CSV, as yq (the Go one) does - though the latter also replicates jq itself, which seems a step too far and risks running into incompatibilities; jq itself does the job alright.
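The "little wrapper around jq" idea, in the spirit of the Python yq, can be sketched in a few lines of shell - this is a hypothetical illustration, not the actual implementation, and it assumes python3 with PyYAML and jq are available:

```shell
#!/bin/sh
# Hypothetical yq-style wrapper: convert stdin from YAML to JSON,
# then pass all arguments straight through to jq.
yq() {
  python3 -c 'import sys, yaml, json; print(json.dumps(yaml.safe_load(sys.stdin)))' | jq "$@"
}

# Usage: query a YAML document with plain jq syntax.
printf 'a: [1, 2]\n' | yq '.a | length'
```

The nice property is that the wrapper stays dumb: all the query logic remains in jq, so there is nothing to keep compatible.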
So, eventually I put these puzzle pieces together, and we have yet another format-to-JSON converter: https://github.com/simonrupf/convert2json
Unique selling points? Well, I've experimented a bit with GitHub Actions, and it is indeed not too difficult to automate the release and publication of statically linked RPMs (for use on different RPM-based distros) and dynamically linked binaries for Ubuntu, macOS (universal binaries for x86_64 & aarch64) and even Windows (untested - I have no access to that OS, nor do I want to). I'll try to keep it lightweight and have some automation set up to keep it updated when dependencies change.
And yes, that 30k-line YAML file now parses in 0.2s instead of 13s. OCD satisfied.