This article has been on the backlog for quite some time, and I dug it back out after reading about Scribus over on HackerNews. It got me thinking about Iris Messenger again and how I should reboot it.
Back in 2017, from the 16th of April through to the 16th of September, I wrote a series of articles under the name of Iris Messenger. As I mentioned in the first article:
Welcome to the Iris Messenger, a free online newsletter aimed at tackling various pieces of interest regarding security, artificial intelligence, software and the greater world. The pieces themselves are aimed to be short, with a maximum of an A5 page being used allowing those with busy schedules to get a brief overview in short time, as easier to write content.
Essentially, it boiled down to collecting a series of links of interest and writing some short text around them. The real beauty of it was the formatting and the seemingly simple approach.
The name itself was derived from Greek mythology, Iris being the messenger between the Gods (those in control) and humans (those without power). The logo, as seen in [1], represents an eye - symbolic of remaining watchful in a world that rewards ignorance. Releases are planned twice a month, the 1st and 16th of each month, meaning 24 issues a year (with volumes increased per year). This may change over time depending on commitment and popularity - the first few months can be considered a trial period.
Unfortunately, I only made it to 7 articles:
I believe there were a few reasons I only managed to complete this number of articles:
There were some other issues too that I never resolved, such as the URL shortening relying on Google. I thought it could always be done better, and I believe that time is likely now! What’s changed?
I have a few new ideas to bring to the article:
Essentially, this will be just a series of things that I personally find interesting and want to share. Hopefully other people also find it interesting.
One thing I will need to do is figure out how to clearly section these ideas from one another and prevent it from appearing as one large opinion piece.
Next, a quick discussion about delivering the content.
One thing I struggled with previously was the timing of the articles. I think one article every two weeks could in theory still be achievable, but for now I will leave this open-ended. Once a month could also be a good option, meaning 12 issues a year.
I want to deliver the content in several formats, something that was not originally offered.
I will use the standard CoffeeSpace formatting for the articles, but this will likely not be the best way to view them. I suspect it will still be the main way in which most of them are consumed.
Alongside the web-based version there will be automatically generated audio, as the other articles also have. At some point I will invest more time into making this experience better.
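As a minimal sketch of how such audio could be generated offline, assuming espeak-ng is installed and the article is available as plain text (the file names here are purely illustrative):

# Generate a WAV file from the plain-text version of an issue.
espeak-ng -v en-gb -s 160 -f issue-08.txt -w issue-08.wav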
I want to maintain the printable PDF version of the article; I really liked this version. The A5 format, I suspect, is still actually quite a nice format - not too much information on each page. That would make for three columns. I would also suggest that one column is a magazine, two columns an engineering paper, making three columns a more distinct format.
In terms of link shortening, I think I will stop bothering with it. Using a third-party service will only make the article less robust in the future. What I think I will do instead is make use of either footnotes or citations. I’ve been a fan of the IEEE citation style for quite some time and may adopt that here.
I will also likely try to archive each of the links in the article with the Internet Archive Wayback Machine. Years ago I also came up with a way of compressing text-based websites into a Base64 Data URI, but I’m not sure this will be useful or portable.
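As a rough sketch of what that archiving step could look like, assuming the Wayback Machine's public save endpoint and standard curl/base64 tools (the URL and file names are just examples):

# Ask the Wayback Machine to take a snapshot of a linked page.
curl -s "https://web.archive.org/save/https://example.com/some-article" > /dev/null

# Pack a small text-based page into a Base64 Data URI for offline embedding.
echo "data:text/html;base64,$(base64 -w0 some-article.html)" > some-article.uri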
For page length, I think I will aim for a multiple of two pages, with two pages being the minimum. This should be relatively easy to achieve with images, larger references and attribution.
I will likely be looking towards pandoc to automate this entire process.
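A minimal sketch of what that automation might look like, assuming pandoc with a LaTeX engine is available (the file names and specific options are just illustrative):

# Web version: standalone HTML for the site.
pandoc issue-08.md --standalone -o issue-08.html

# Printable version: A5 PDF via LaTeX.
pandoc issue-08.md -o issue-08.pdf --pdf-engine=xelatex -V papersize=a5 -V geometry:margin=15mm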
I want to add more rich content to the articles in the form of images. The aim would be to keep this as lightweight and as easily printable on a traditional printer as possible.
I’ve been meaning to explore image dithering for quite some time now as a method for compressing images whilst maintaining understandability 1. Not only is it great for saving on bandwidth, but also it’s great for saving CO2 emissions related to delivering server content 2.
The original title image is 1033187 bytes:
Using the following dithering command we can further reduce the file size:
convert image -colorspace gray +dither -colors 2 -type bilevel result
We can compress this image down further to 57388 bytes (JPG):
And even further down to 4558 bytes (PNG):
This isn’t the first time I’ve worked on this problem either. Applying it to the image in the original image compression article, using the following command:
convert -strip -resize 256x256 gamestop.jpg -monochrome +dither -colors 2 -type bilevel test.png
We compress the image down to 2716 bytes. I believe that it not only looks respectable, but also has a nice print aesthetic to it.
I’ve written the following bash alias to ~/.bash_aliases (you’ll need to reload your bash source and possibly edit your .bashrc file to load the aliases if not already set):
alias imagecompress='function _imagecompress(){ target="${1%.*}-cmp.png"; convert -strip "$1" -monochrome +dither -colors 2 -type bilevel "$target"; };_imagecompress'
I can therefore compress an image by just running something like:
$ ls -la gamestop.jpg
28086 gamestop.jpg
$ imagecompress gamestop.jpg
$ ls -la gamestop-cmp.png
13029 gamestop-cmp.png
As you can see, there is quite a large drop in file size, even without resizing the original files. In the future I will look to somewhat automate the compression of images.
I will soon do some travelling, but scripting this could be a fun low-effort side project.
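As a rough sketch of that automation, reusing the same convert invocation as the alias above and assuming a directory of JPEG source images:

# Compress every JPEG in the current directory to a dithered bilevel PNG.
for f in *.jpg; do
  convert -strip "$f" -monochrome +dither -colors 2 -type bilevel "${f%.*}-cmp.png"
done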
I think that, in order for this project to be viable, writing an article from start to finish should take no longer than 6 hours. I believe the following generally needs to be done:
Watch this space…