Baked commenting dev log n°1

Dev Diary

After posting some initial thoughts about baked commenting yesterday, I spent some time making sure the data structure would work. After that, I started on development.

Status

I’m at a point where a comment can be POSTed from the version of this site I have running locally. That comment will then be parsed into JSON by the server, all of the expected metadata will be added to it, and it will be written to the site’s data folder.
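To make that concrete, here's a rough sketch of what a stored comment record and its serialization might look like. The field names and the hand-rolled JSON emitter are my own assumptions for illustration; the post only confirms that server-side metadata is added and that `website` and `email` exist as fields, and a real implementation would more likely use something like serde_json.

```rust
// Hypothetical shape of a stored comment. Field names are guesses,
// not the project's actual data structure.
struct Comment {
    author: String,
    email: String,
    website: String,
    body: String,
    timestamp: String, // assumed server-added metadata
}

// Minimal escaping so quotes and backslashes survive serialization.
fn escape(s: &str) -> String {
    s.replace('\\', "\\\\").replace('"', "\\\"")
}

impl Comment {
    // Hand-rolled JSON emitter, purely for illustration.
    fn to_json(&self) -> String {
        format!(
            r#"{{"author":"{}","email":"{}","website":"{}","body":"{}","timestamp":"{}"}}"#,
            escape(&self.author),
            escape(&self.email),
            escape(&self.website),
            escape(&self.body),
            escape(&self.timestamp),
        )
    }
}

fn main() {
    let comment = Comment {
        author: "Ada".to_string(),
        email: "ada@example.com".to_string(),
        website: "https://example.com".to_string(),
        body: "Nice post!".to_string(),
        timestamp: "2015-01-01T00:00:00Z".to_string(),
    };
    println!("{}", comment.to_json());
}
```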

Up next

As things stand, a post can only have one comment, which gets overwritten whenever a new one is submitted. If the post already has a comments file, I need to parse it and add the new comment to the existing data rather than overwriting everything.
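The shape of that fix can be sketched like so, assuming each post's comments are stored as a JSON array. The real implementation would presumably parse the file properly (e.g. with serde_json); this string-splicing version just illustrates the append-rather-than-overwrite logic:

```rust
// Sketch: append a serialized comment to an existing JSON-array comments
// file instead of overwriting it. `existing` is the current file contents
// ("" if the file doesn't exist yet); in the server this would sit between
// std::fs::read_to_string and std::fs::write.
fn append_comment(existing: &str, new_comment_json: &str) -> String {
    let trimmed = existing.trim();
    if trimmed.is_empty() || trimmed == "[]" {
        // First comment on the post: start a fresh array.
        format!("[{}]", new_comment_json)
    } else {
        // Drop the closing bracket, add a comma and the new entry.
        format!("{},{}]", &trimmed[..trimmed.len() - 1], new_comment_json)
    }
}

fn main() {
    let first = append_comment("", r#"{"author":"Ada","body":"First!"}"#);
    let second = append_comment(&first, r#"{"author":"Brian","body":"Second."}"#);
    println!("{}", second);
}
```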

After that, I’ll start on spam filtering. The original plan was to use Akismet, but to avoid any external dependencies I’m going to write a Rust implementation of the “Snook Algorithm” instead. If it turns out to be ineffective, I’ll look at other options. ‘Why wasn’t that the original plan’ you ask? Well, I only found out about the “Snook Algorithm” this afternoon. Not using Akismet also means that users will have one less thing to do in order to get this running on their own server.
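I haven't started on that yet, but the general shape of a points-based filter like Snook's is easy to sketch: each heuristic nudges a score up or down, and the final score decides whether a comment is kept or rejected. The specific checks and weights below are illustrative placeholders of my own, not the actual rules from Snook's write-up:

```rust
// Illustrative points-based spam scorer. The rules and weights here are
// placeholder assumptions, not the real "Snook Algorithm" values.
fn spam_score(body: &str) -> i32 {
    let mut score = 0;
    // Many links is a strong spam signal.
    let links = body.matches("http").count();
    score -= match links {
        0..=2 => 0,
        _ => links as i32,
    };
    // A reasonable-length, link-free body earns points; a very short one loses them.
    if body.len() > 20 && links == 0 {
        score += 2;
    } else if body.len() < 20 {
        score -= 1;
    }
    // Known spam phrases (placeholder list).
    for word in ["viagra", "casino"] {
        if body.to_lowercase().contains(word) {
            score -= 5;
        }
    }
    score
}

fn is_probably_spam(body: &str) -> bool {
    spam_score(body) < 0
}

fn main() {
    println!("{}", spam_score("Thanks for writing this, it was genuinely helpful."));
    println!("{}", is_probably_spam("casino casino http://x http://y http://z http://w"));
}
```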

More notes

  • The data structure does indeed work: I got dummy comments to render locally for this site after a fair amount of faffing around with Hugo. Not to worry though, the code needed to actually render the comments turned out to be fairly simple.

  • I didn’t mention sanitization in my previous post. Anything that can’t be written in plain Markdown, such as script tags, will be stripped.

  • Speaking of Markdown, users will be able to write the body of a comment in it. The server will parse it into HTML for you, so there’ll be no extra work needed in the template files.

  • I spotted that Hugo has a hugo --watch command which will rebuild a site when anything in its directory changes (without starting a server). A much nicer solution than the initial idea of running a cron job every minute.

  • I forgot two fields in my initial data structure: website and email. They’ve both been added.

  • A long time ago a fake data generator spat out “Magnificent Walrus” as, I seem to remember, a company name. The name stuck instantly. Not knowing what to call this project, I think it’s finally time to put it to use. This project, then, will henceforth be known as “Magnificent Walrus”.
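On the sanitization note above: a real implementation would lean on a proper HTML sanitizer, but the basic idea of stripping markup can be sketched like this. Note that this naive version removes only the tags themselves, not the text between them; handling that properly is exactly why a dedicated sanitizer is the safer choice.

```rust
// Naive tag stripper, for illustration only: drops anything between
// angle brackets, leaving the surrounding text intact.
fn strip_tags(input: &str) -> String {
    let mut out = String::new();
    let mut in_tag = false;
    for ch in input.chars() {
        match ch {
            '<' => in_tag = true,
            '>' => in_tag = false,
            _ if !in_tag => out.push(ch),
            _ => {}
        }
    }
    out
}

fn main() {
    println!("{}", strip_tags("hello <b>world</b>"));
}
```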

Thanks for reading! If you liked this post, you may like others archived in: Dev Diary.