I actually briefly mentioned this project when I wrote about moving from TinyTinyRSS to FreshRSS. It has become a bit of an evolving, ongoing project however, so I’ve decided to catalogue it on its own page. This little script worked out much better than I expected; I’ve modified it a bit over time and have ideas for modifying it even more going forward. Starting off, the code can be found here in this GitHub Gist.
I’ve left in a bit of commented-out code that I might use later for troubleshooting or adding features. The general gist of the code: it pulls the last 24 hours’ worth of news stories I have favorited in my FreshRSS install, formats them into a digest, and posts that here, on this blog. The digests get sorted into their own category; you can find them here.
This is basically a thing I’ve seen others do that I’ve wanted to do for a while. It’s also partially just for my own reference, a sort of log of everything I found interesting on a particular day. Others may or may not find it interesting, which is why I also filter that category out of the home page feed.
Originally, it was just a list of URLs and titles. I realized that it might be useful to have SOME idea what the link was about before clicking it, so I have been playing with the summary as well. My first attempt was a bit dodgy because it actually posted the entire article as the summary. Currently, it just arbitrarily chops it off at a few hundred characters. I want to improve it even further at some point by pushing it through some summarizing AI and getting an actual proper summary, but I have not gotten there yet.
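For illustration, a minimal sketch of that kind of truncation, assuming a feedparser-style entry with a summary field and an arbitrary 300-character cut-off (both the field name and the limit are assumptions, not the actual code):

import re

def short_summary(entry, limit=300):
    # Strip HTML tags out of the feed entry's summary, then chop it off
    # at the last whole word under the limit
    text = re.sub(r"<[^>]+>", "", entry.summary)
    if len(text) > limit:
        text = text[:limit].rsplit(" ", 1)[0] + "..."
    return text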
There are a few other things I want to add, but I’m not sure they’re easily possible. Firstly, I would love to be able to parse some sort of categories into the digest, so, say, all the “Video Game” links are together and all the Music links are together. FreshRSS has categories, but they don’t seem to show up in the feed anywhere.
This would also allow me to split these posts between this blog and my other blog, Lameazoid. I do share interesting video game news from FreshRSS, but I mostly don’t share toy-related articles, because they feel a little TOO FAR out there for what I want to post to this blog. If there were a way to get the categories, I could easily have the script split the feed by category and post a digest to each blog.
I also wish there were a way to occasionally add my own notes and commentary. TinyTinyRSS had a notes feature, though I don’t think it showed up in the feed either; I am not sure if FreshRSS has one as well. I probably should at least suggest these features to the creators on GitHub, or maybe get really adventurous and create my own plug-ins for FreshRSS to accomplish these tasks.
It’s kind of funny how one post can lead to another sometimes. This one is pretty basic, but it shows a bit of how useful I find knowing my way around computer systems to be. Yesterday I posted about my little annual music playlists, and as part of that, I wanted to actually post the playlist. I am pretty sure there is a fairly universal playlist file type out there (M3U), and since VLC is open source, I had assumed that VLC on my phone stored its playlists somewhere in playlist files.
That assumption was wrong; it uses a .db file, a little portable database. There is an option to dump this file to the root of the phone, presumably for backup purposes, but it’s also useful for just browsing, like I am doing here. The file itself can be opened and browsed with a SQLite database manager. Inside are standard tables for tracks, artists, and playlists.
Fortunately, I have had some experience dealing with database queries, so I set about building what was needed to get the data I wanted: pull the playlist I want, in this case “2023 Best”, though I could change that to any available playlist. This gives the tracks by ID, but the tracks themselves are stored in a separate media table, so that needs to be joined in. The media table stores track names but not artist names, so an additional join is needed to get those. This complicated things a bit because both the playlist table and the artist table have a “name” column, so the columns had to be qualified with their table names.
The result was this little query that dumps out a basic table of Artist and Song title.
SELECT Artist.name, Media.title
FROM Playlist
INNER JOIN playlistmediarelation ON playlist_id = id_playlist
INNER JOIN Media ON id_media = media_id
INNER JOIN Artist ON Media.artist_id = Artist.id_artist
WHERE Playlist.name = '2023 Best'
ORDER BY Artist.name
Now, I could have done some cute, clever trick to merge the two into a new column with a ” – ” in between, but it was easier to drop it all into a notepad file and do a find/replace on the weird space character that it sticks in between the artist and track title.
The added bonus here is that I can easily use this query again anytime I want to dump a playlist to text.
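In fact, a minimal sketch of reusing the query from Python with the built-in sqlite3 module, doing the ” – ” merge in code instead of a notepad find/replace (the database file name and playlist name here are placeholders):

import sqlite3

def dump_playlist(db_path, playlist_name):
    # Open the exported VLC database and run the same query as above,
    # parameterized on the playlist name
    con = sqlite3.connect(db_path)
    query = """
        SELECT Artist.name, Media.title
        FROM Playlist
        INNER JOIN playlistmediarelation ON playlist_id = id_playlist
        INNER JOIN Media ON id_media = media_id
        INNER JOIN Artist ON Media.artist_id = Artist.id_artist
        WHERE Playlist.name = ?
        ORDER BY Artist.name
    """
    # Print each row as "Artist - Title"
    for artist, title in con.execute(query, (playlist_name,)):
        print(f"{artist} - {title}")
    con.close()

dump_playlist("vlc_media.db", "2023 Best")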
This is one of those quick and kind of dirty projects I’ve been meaning to do for a while. Basically, I wanted a script that would scrape all of the top-level comments from a Reddit post and push them out to a list. Most commonly, to use on /r/AskReddit-style threads like, well, for this example, “What is a song from the 90s that young people should listen to?”
Basically, threads that ask for useful opinions in list form. Sometimes it’s lists of websites or something; often it’s music. The script here is made for music but could be adjusted for any thread. Here is the script; I’ll touch on it in more detail after.
## Create an APP for Secrets here:
## https://www.reddit.com/prefs/apps
import praw

## Thread to scrape goes here, replace the one below
url = "https://www.reddit.com/r/Music/comments/10c4ki0/name_one_90s_song_kids_born_after_2000_should_add/"

## Fill in API Information here
reddit = praw.Reddit(
    client_id="",
    client_secret="",
    user_agent="script by u/",  # Your username, not really required though
    redirect_uri="http://localhost:8080",
)

## Pull the thread, then flatten the "load more comments" stubs so only
## the top level comments remain
submission = reddit.submission(url=url)
submission.comments.replace_more(limit=0)

## Append any comment that looks like "Artist - Song" to the output file
with open("output.txt", mode="a", encoding="UTF-8") as file:
    for x in submission.comments:
        if "-" in x.body:
            file.write(str(x.body) + "\n")
            # print(x.body)
The script uses PRAW, the Python Reddit API Wrapper, a library for working with the Reddit API from Python. It requires free API keys, which can be created here: https://www.reddit.com/prefs/apps. Just create an app; the client ID is the jumble of letters under the app name, and the secret is labeled. The user agent can be whatever, really, but it’s meant to be informative.
The thread URL also needs to be filled in. The script then pulls the thread data and extracts the top-level comments.
I’m mostly interested in text file lists, though for the sake of music-based lists, if I used Spotify, I might combine it with the Spotify playlist maker from my 100 Days of Python course. Like I said before, though, this script is made for pulling music suggestions, with this bit of code:
if "-" in x.body:
file.write(str(x.body)+"\n")
# print(x.body)
It’s simple: if the comment contains a dash, as in “Taylor Swift – Shake It Off” or “AC/DC – Back in Black”, it writes it to the file; otherwise it discards it. There is a chance this means discarding some legitimate submissions, but this isn’t precision work, so I’m OK with that as a way to filter out the chaff. If I were looking for URLs or something, I might look for “http” in the comment instead, as in the variant below. I could also eliminate the “if” statement entirely and just write all the comments to the file.
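For example, a variant of the loop for link-collection threads might look like this (the “links.txt” file name is just a placeholder):

## Collect links instead of "Artist - Song" lines
with open("links.txt", mode="a", encoding="UTF-8") as file:
    for x in submission.comments:
        if "http" in x.body:
            file.write(str(x.body) + "\n")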
Keen observers (ha ha ha, no one reads this) might have noticed that a few posts of links showed up in the feed. These are, basically, stories I read in my RSS reader that I found interesting and wanted to share, or at least keep track of. The posts as of now are a little ugly, and I’ll probably clean up the formatting over time, but I wanted to go ahead and write a bit about the process. I’ll have the code on GitHub at some point.
As for the motivation: firstly, this is something I’ve wanted to have on my blog for a while. Like, a long while. I might even try to see if there are ways to better split up the links by topic later. A fair number of blogs I subscribe to have these sorts of link digest posts, and I’ve always just liked the idea. It’s also good for personal reference as to when I may have read something, though it is limited since it only comes from my RSS reader.
Speaking of my RSS reader: I’ve moved on from TinyTinyRSS, for a few reasons. One, the interface is a little meh, honestly. Maybe the newer version is better, but it’s only available in Docker, and Docker is such a PITA to use. Also, while looking for alternatives, I got the impression that the folks who make TT-RSS are kind of a bunch of gatekeeping jerk types, and I’d rather not support that. I also found the need to keep the update daemon running in Screen to be a pain. So I’ve moved over to FreshRSS, which I just run locally on a Raspberry Pi. I may move it to a publicly accessible machine at some point, but I am not entirely convinced that TT-RSS wasn’t the entry point for my previous server malware woes.
So, like TT-RSS, FreshRSS has a way to get an RSS feed out of your favorited posts. In the past I’ve used tools like IFTTT to automate posting these links around, but I don’t use IFTTT anymore, for reasons I’m not going into. Fortunately, I’ve been working on becoming a pretty good Python coder for the last month or so, so instead I wrote a script.
It’s not even a particularly complicated script. There are only two things it really needs to do: get new articles, then post them to WordPress. Since the script runs locally, on the same Raspberry Pi even, it can easily reach and pull the RSS feed. One nice thing I noticed with FreshRSS: the feed accepts a time interval, so getting just the new posts is super simple, because the interval is just “24” for “24 hours”. The script will eventually run as a cron job at the exact same time daily. Anyway, after pulling the RSS, the entries are already in an easily usable dictionary, which gets fed into the construction of the WordPress post.
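As a rough sketch of that fetch half, assuming the feedparser library and a FreshRSS favorites feed URL (the actual URL and token come from the FreshRSS settings; the one below is made up):

import feedparser

## Hypothetical FreshRSS favorites feed URL; the hours value limits it to
## the last day's worth of starred articles
feed_url = "http://192.168.1.50/i/?a=rss&state=s&token=XXXX&hours=24"
NewsFeed = feedparser.parse(feed_url)

## Each entry arrives as an easily usable dictionary-like object
for entry in NewsFeed.entries:
    print(entry.published, entry.title, entry.links[0].href)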
The posting part was pretty easy as well; WordPress has an API, and Python has a library that can use it. It just needs some login information and a post payload to send.
## Requires the python-wordpress-xmlrpc library; wp_url, wp_user, wp_pass,
## and cur_date are set earlier in the script
from wordpress_xmlrpc import Client, WordPressPost
from wordpress_xmlrpc.methods.posts import NewPost

def make_post(NewsFeed):
    wp = Client(f'https://{wp_url}/xmlrpc.php', wp_user, wp_pass)
    post = WordPressPost()
    post.title = f"{cur_date} - Link List"
    post.terms_names = {'category': ['Link List'], 'post_tag': ['links', 'FreshRSS']}
    post.content = f"<p>Blogging Intensifies Link List for {cur_date}</p>"
    # Add one line per favorited article, with a trimmed-down date
    for each in NewsFeed.entries:
        post.content += f'<p>{each.published[5:-15].replace(" ", "-")} - <a href="{each.links[0].href}">{each.title}</a></p>'
    post.post_status = 'publish'
    wp.call(NewPost(post))
The trickiest part was formatting the date a bit more prettily. I mentioned cleaning up the formatting; I’m thinking maybe a simple invisible table, so the dates and links don’t wrap oddly like they do now. I also added a check so that if there are no new favorited posts, it will skip making a post; otherwise I’d end up with empty posts on days I forget to check my feed reader.
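The check itself is about as simple as it sounds; a sketch of how it might look, given the feed object from earlier:

## Skip making a post on days with no new favorites
if NewsFeed.entries:
    make_post(NewsFeed)
else:
    print("No new favorites in the last 24 hours, skipping.")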
While writing the script, at first I just output a text copy of the post to the console until I was satisfied. Eventually, I pushed out a real post, then verified that things worked. The next day was a straight test: opening the project and running it again. The third day, I copied the files to the Pi, installed the needed libraries, and posted from there. Phase 4 will be to set up cron to run it automatically. If that works, it should “just run” for the foreseeable future.
So, this one isn’t really going to have any code. I might, sometime in the distant future, publish some, but this whole thing is very much an “add things as I go” ongoing project. The base code itself isn’t particularly complicated, though. It’s a pretty simple HTML/PHP/CSS layout that wraps around various modules I’ve been building. I keep mentioning the Dashboard when talking about my various projects, so I figure I should do a quick rundown of what it involves.
I’m actually building a more complex iteration of this project at work as well, to be used internally by my work group. The work one is considerably more complex; for example, it has a much more robust admin area, with growing features to manage locations, users, user teams, etc. The base layout framework is shared between the two dashboards, but the work one has a lot more actual functionality. Because I am the only one using the home version, I generally just code everything in directly, so it’s less modular. I also have to translate any code I write for one version to the other’s database back end: MySQL at home, MS SQL at work.
At its base, it’s just a webpage on my project webserver that displays information. Some of that information is useful; some is just there to fill space and to practice coding something up. As I mentioned above, it’s essentially a header, sidebar, and footer that wrap around a variable content box. On the home page, the content box contains what I have been calling “Quick Cards” with bits of information that sometimes link to larger chunks of data. This is what it looks like, at the moment, on the home page.
I’ll dive in a bit on some of the menus and content, but I am going to start with the Quick Card boxes, in order.
The Weather box seemed like an obvious choice for at-a-glance information. I want to make it link to a sub-page with more forecast data, but for now it just displays the current weather conditions for my location. Unfortunately, it’s built on the Dark Sky API, which it was very recently announced is shutting down, so I’ll have to find a new API to use.
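I haven’t picked a replacement yet, but as one possible example, something like the free Open-Meteo API could fill the same role; a minimal sketch (the coordinates below are placeholders, not my actual location):

import requests

params = {
    "latitude": 40.69,   # placeholder coordinates
    "longitude": -89.59,
    "current_weather": "true",
}
resp = requests.get("https://api.open-meteo.com/v1/forecast", params=params)
current = resp.json()["current_weather"]
print(current["temperature"], current["windspeed"])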
Next is the COVID-19 stats widget. This is the other side of the COVID-19 tracking Python script I posted recently. It just displays the most recent information that the script pulled. I may update this to link to a page with some timeline graphs on it, once I figure out how to put a data graph in a webpage.
Network Devices is the most robust of all the modules I’ve built so far. The Quick Card just shows the current number of active devices on the home network. Clicking it opens the Network Device page I talk about here.
Social Accounts is just a link list to the various Twitter accounts I have. I want to make it a modifiable list eventually, but for now it’s static. It does do a database pull to build the URLs, but I have not added a configuration page yet.
The next box displays how many unread posts are on each of my TT-RSS accounts. After Google killed Google Reader, I set up TinyTiny-RSS on my webserver and started using it for my feeds. I became overwhelmed, so I broke all of my feeds into themed sub-accounts. I would link to each sub-account, but it’s all the same link, just with a different login, so the links would be useless. Normally, I just use container tabs to keep the different login instances open.
Lastly is a tracker of Reddit karma for the several Reddit accounts I have. Like my RSS feeds, I have broken my Reddit subs out into separate themed accounts. I don’t really care that much about Reddit karma, but I wanted to play around with APIs and JSON, so I figured this would be an easy project. I will probably post the script in the future, but it’s essentially identical to the recently posted COVID-19 script; in fact, the COVID-19 script was adapted from the Reddit karma script.
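The karma pull itself is just a JSON fetch; a minimal sketch of that sort of thing using Reddit’s public about.json endpoint (the username here is a placeholder, and this is not necessarily how my script does it):

import requests

## Reddit requires a descriptive User-Agent on API requests
headers = {"User-Agent": "karma tracker script by u/RamenJunkie"}
url = "https://www.reddit.com/user/RamenJunkie/about.json"
about = requests.get(url, headers=headers).json()
print(about["data"]["link_karma"], about["data"]["comment_karma"])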
Along the top navigation bar are some drop-downs with useful links that don’t really have at-a-glance data. The first two, “My Websites” and “My Hosted Apps”, are just drop-downs with links to the blogs I manage and my web-hosted apps for email and TT-RSS.
The next drop-down is similar in nature, in that it’s a list of links, but this one has an admin page so I can maintain the list as it changes. It also hasn’t quite found a home yet; I had it in the sidebar for a while, then in a Quick Card, and now it’s in the navigation menu. It’s a list of links to web services on my internal home network: routers, Raspberry Pis, IP cameras, my NAS, and various things I have set up on my project server.
Next to that is the Gas Tracker, which is very much a WIP page. When I bought my car back in 2014, I decided I wanted to track my gas consumption for the life of the car. Currently this lives in an Excel spreadsheet on OneDrive. I wanted to see about translating it into my own webpage with SQL as the back end. For now it just displays a table of data that I imported from Excel; there isn’t any way to add new data yet, and it doesn’t calculate the price per gallon or total money paid or anything like that.
Lastly is an old project I did called Tweeter that I got up and running again and embedded into the Dashboard. Tweeter itself is fairly self-contained, and I will probably do a couple of detailed write-ups on it in the future and post the code at that time. I also want to update it to use SQL as the back end, so I’ll do a second write-up when that happens.
A while back, I was looking for a way to automate posting tweets, mostly so I could share links to articles but have them spaced out over the course of a day. I couldn’t find a decent free service, and thus Tweeter was born. Tweeter is a two-part solution: a PHP page that writes a text file, and some scripts (Python and Bash) that run on a cron schedule and post the contents of that text file to Twitter, one line at a time. I’m not going to go into it any more here, but I promise to post about it in the future. It’s also a little ugly and probably insecure as hell, but it works well.
The main fun of integrating Tweeter: the box is 140 characters wide, the same as a tweet, so I had to modify my core code framework to have a toggle for pages that don’t display the sidebar. It wasn’t anything complicated, but I hadn’t considered that need before, so I fixed it. That’s part of the fun, and the point, of doing these sorts of code projects.