Friday, February 06, 2026

Added Cockpit to Ubuntu 25.10 on the Pi 500+

Today, I wanted to add Cockpit to the Pi 500+ Ubuntu install, but I didn't want to sit at the desk where I'd placed the keyboard.

When I tried to ssh into the Pi, I kept getting connection refusals, which I thought was odd.  I ended up having to spend some time at the Pi keyboard, investigating why I couldn't connect to it on port 22.

I found out that I'd never installed ssh!  I could've sworn I had, but maybe it was on the earlier Pi OS install that I'd installed it.
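
Fixing that was quick (on Ubuntu, installing the package should also start the daemon, but enabling it explicitly doesn't hurt):

sudo apt install openssh-server    # the ssh server; the client is already present
sudo systemctl enable --now ssh    # make sure the daemon is running and starts at boot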

So, after I installed it, I installed Cockpit (I wanted to try it instead of using Webmin).  After the install, I found that Cockpit was only allowing my user limited administrator access.  It offered the option to gain full admin privileges, but when I clicked it, it threw an error saying that sudo couldn't be leveraged to escalate privileges.  When I googled that error, I found that one of the suggested fixes was to add your user to the wheel group.  My Pi didn't have a wheel group, so I had to create one and then add my user to it.
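
For the record, the whole sequence amounts to something like this (myuser is a placeholder for my actual account name):

sudo apt install cockpit                       # the web console; it listens on port 9090
sudo systemctl enable --now cockpit.socket     # Cockpit is socket-activated
sudo groupadd wheel                            # the group didn't exist on my install
sudo usermod -aG wheel myuser                  # append the user to wheel; takes effect at next login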

I then double-checked my research and found a link to a Cockpit bug report from October 2025 describing this exact issue.  The issue is that Ubuntu's sudo was recently redeveloped in Rust, and that implementation apparently does not support the --askpass flag, which Cockpit uses.  The fix is to run the following, which switches back to the non-Rust sudo implementation (it's still available):

# update-alternatives --set sudo /usr/bin/sudo.ws
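
If you want to confirm which implementation is active before and after the switch, update-alternatives can display the current selection:

# update-alternatives --display sudo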

Now, I've got Ubuntu all over the house on various machines, most of them running Cockpit.  I've not seen this issue before, and I've done a bunch of recent installs of Ubuntu 25.10.  In fact, I installed Cockpit on my docker container host today, and it didn't exhibit this issue/bug.  I'm wondering why I'm only now seeing it.  I'm glad there's a workaround, though.


Saturday, January 24, 2026

Google Search Console - Requirements Overkill!

I've always had a not-so-good experience with Google's Search Console.  

I get it - they're trying to ensure that web content is meaningful.  

I get it, but damn, every single page object appears to have arcane criteria attached to it, and if those criteria aren't met, Google will not crawl the page.

This is problematic for me.  I don't want to end up being a slave to the process of getting Google to index my website, because, in my opinion, they go overboard with things.

For example, there's a 40% chance that I'll include an embedded video when I submit a new post to my Wordpress-powered website.  I've always wondered why my videos aren't being indexed by Google.  I'm now discovering that the videos won't be processed if they don't reside on a "watch page".  A video embedded in a blog post won't be indexed because Google considers it complementary to the rest of the content on the page.  This means that I have to create a dedicated video landing page.  WTF.

That's one of many examples.  I've a large batch of pages that aren't being indexed because of the ridiculous criteria that Google requires.

Creating a video landing page within Wordpress isn't difficult.  Having to go back through all the posts that contain embedded videos so that I can add them to a watch page -- that's a lot of work.  And then what?  Do I have to make the prior posts with embedded videos link to the videos on the newly built landing page?

Bureaucracy overkill, that's what this is.

What I'll do is create the new watch page and add a few videos a day.  Maybe I'll be done in 6 months?  I'm certainly not going to let this overburden me. 

Sunday, January 18, 2026

I Just Set Up a Web Server to Use an SSL Cert, Using Let's Encrypt!

Yesterday, I was bored and had been contemplating setting up one of my public web sites to use an SSL certificate.

While most business websites use SSL certificates, SSL certs aren't really mandatory for just serving web content for reading purposes.  I've been using Apache to serve web pages for a LONG time and never felt the need to enable HTTPS, as it wasn't required.  That changed when I decided I wanted my website to be more noticeable within search engine results.  Search engines treat HTTPS as a ranking signal, so to place higher in the results, the web server serving the content needs to use HTTPS.

As I host my own server, my options were to set up my own SSL certificate or to buy an SSL certificate for use with my server.  I decided to set up and deploy my own.

I used this link's instructions (I used Certbot, which obtains certificates from Let's Encrypt, which I'll reference as LE) to set everything up.  Keep in mind that I'm using Ubuntu 25.10 to host my server, on a Linode instance.
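
The basic shape of it is something like the following - treat this as a sketch, since the exact flags depend on which Certbot method the instructions walk you through, and the webroot path here is an assumption:

sudo apt install certbot                                          # the LE client
sudo certbot certonly --webroot -w /var/www/html -d wigglit.com   # request the cert; -w is the site's webroot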

After I built the certificates, I had a difficult time determining how to leverage them.  I initially tried using a WordPress plugin to import the certificate, but I tried like 5 different plugins and none of them worked.  I then pivoted and tried a different method - I'm running Apache to serve Wordpress, so I set up the Apache config file to use HTTPS and pointed Apache to the LE certs (the relevant chunk of the config is sketched below).  I then used an SSL checker to confirm that everything was working.  It was.
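
The relevant part of the Apache virtual host ends up looking roughly like this - the certificate paths are the standard Let's Encrypt locations, and the DocumentRoot is an assumption on my part:

<VirtualHost *:443>
    ServerName wigglit.com
    DocumentRoot /var/www/html

    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/wigglit.com/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/wigglit.com/privkey.pem
</VirtualHost>

Apache's ssl module also has to be enabled (a2enmod ssl) and the service reloaded before any of this takes effect.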

Afterward, I set up a cron job to renew the certs automatically.
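
The crontab entry is nothing fancy - something along these lines (the schedule is arbitrary, since certbot renew only replaces certs that are close to expiring):

# attempt renewal daily at 3 AM
0 3 * * * certbot renew --quiet

Depending on how Certbot was installed, it may already ship a systemd timer that does the same thing, in which case the cron job is redundant.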

Now, when I check the browser for indications that the website is using SSL, there's no lock icon that I can see.  I researched and saw that I also had to ensure the website's prior content wasn't using HTTP links to internal server content, so I used some Wordpress tools to search for HTTP links pointing to my web server and change them to HTTPS.  I also saw that a lot of my plugins and themes are using HTTP links, and that's supposed to be a no-no for HTTPS compliance (mixed content) - I can't control how plugin providers construct their plugins, so I'm not sure what to do with that.
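
I used plugins for the search-and-replace, but for anyone who'd rather do it from the command line, WP-CLI can do the same thing - roughly like this, with --dry-run to preview before committing to anything:

wp search-replace 'http://wigglit.com' 'https://wigglit.com' --dry-run   # preview the changes
wp search-replace 'http://wigglit.com' 'https://wigglit.com'             # rewrite the links in the database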

I think I'm going to do this with my other domain, as well (unixfool.us).

Eventually, I plan to replace my Wordpress website with a docker instance.  I'd need to research how to use SSL certs within a docker compose YML file, but I'm thinking it should be pretty straightforward (my rough guess is sketched below).  The only thing I can think of that might be an issue is the automatic renewal bit (the bit where I added a cron job to renew the certificate).
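
I haven't tested this yet, but my working assumption is that the certs can just be bind-mounted into whichever container ends up terminating TLS (the Wordpress container itself, or a reverse proxy in front of it) - something roughly like this, with the service name as a placeholder:

services:
  web:                                        # placeholder for whatever container terminates TLS
    image: wordpress
    ports:
      - "443:443"
    volumes:
      - /etc/letsencrypt:/etc/letsencrypt:ro  # read-only view of the host's certs

Since the mount points at the live host directory, the existing cron job could probably keep handling renewals; the container would just need a reload to pick up the new cert.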

UPDATE (1/20/2026):  I just checked again and I can now see that the web page (https://wigglit.com) is showing as secure!  

Friday, January 02, 2026

Raspberry Pi OS, Begone!

Last night, I tried to use a docker container on the new Pi system that I've been able to use on other systems without issue.

This experience was pretty much a nightmare.

I was able to install Docker without issue and the 'hello world' container worked fine.

When I tried to run a Wordpress container, there were cascading issues.  Granted, I know that the Pi runs on the ARM chipset, so I did have to make adjustments for that, which wasn't all that difficult.

The main issue I had related to the Pi OS misconfiguring things.  There were things being blocked by the OS due to bad routing.

While I was able to get the Wordpress container to run, I couldn't connect to it initially.  In fact, I couldn't reach the internet at all, using curl or any browser.  Apparently, curl is kinda weird on the Pi OS, as it expects to be able to use port 80, and I'd mapped port 80 to the Wordpress service.  Since nothing else on the system was configured to use port 80, I initially felt it was safe to use it for the Wordpress container.  NOPE!  When I did, it broke some things relating to curl and routing.  After ChatGPT informed me that it's best to not use port 80 for the Wordpress container, I changed the host port to 8888, with no success.  It ended up taking me like 6 hours to determine the issue.  ChatGPT kept repeating repair steps that weren't working, until I forced it to look for other issues.
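
For reference, the port mapping I ended up with in the compose file amounts to something like this (the service and image names here are the standard ones, not necessarily exactly what's in my file):

services:
  wordpress:
    image: wordpress
    ports:
      - "8888:80"    # host port 8888 forwards to the container's port 80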

At 4 AM this morning, I was finally able to reach the container using curl, Chromium, and Firefox, but was still experiencing connection drops when trying to use DuckDuckGo.  I also noticed several other connection drops (some Wordpress plugins require backend callbacks to 'home' using curl - those started breaking again).

The fix was to remove some default routes that were associated with the containers, remove some iptables rules, remove some IP links, and add additional config to the wp-config.php and compose.yml files.
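
I didn't keep a tidy record of exactly which routes and rules got removed, but these are the sorts of commands I was using to inspect things before deleting anything:

ip route show       # list the routing table, including routes docker added
ip link show        # list interfaces, including docker bridges and veth pairs
sudo iptables -S    # dump the current iptables rules
docker network ls   # list docker-managed networks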

Later in the day, I checked the container again and noticed that the problem routes that I'd removed had been readded by Pi OS, reverting my work.

I got fed up and decided to start from scratch with another OS.  

I chose Ubuntu, since I'm already familiar with it.  The only wildcard is that this Pi system is still powered by ARM, so I might still run into some things that are currently unknown to me...I'll just have to be prepared for any chipset-related issues that may occur, but I trust Ubuntu more than Pi OS at this point.

Ubuntu 25.10 is now installed on the system's internal SSD.  I used the Pi boot options to reinstall the OS...that's a cool option, but I wish it would also give the option to use wireless connections instead of ethernet, as I had to jump through hoops to ensure I could use the ethernet where the Pi system is currently located.

As well, I wasn't prepared for the new OS install to take 45 minutes.

As frustrated as I was, it's all a learning experience for me.  As well, there's less frustration in reinstalling when I'm using a Pi.

I'd post some of my ChatGPT session, but it was messy and an hours-long chat.

I'll keep you all updated on my progress with Ubuntu on the Pi.

UPDATE (1/2/2026):  Yeah, I already deployed a docker instance of Wordpress in Ubuntu, on the Pi.  I had none of the issues I'd had last night deploying the same .yml file on the Pi OS, beyond another round of changes so that the images being pulled supported ARM.  The two experiences were very different.
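
The ARM change is small - in compose terms it amounts to pinning the platform (or picking image tags that have arm64 builds).  A sketch of what I mean:

services:
  wordpress:
    image: wordpress:latest
    platform: linux/arm64    # make sure an arm64 image is pulled on the Pi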

UPDATE (1/3/2026):  I've still not noticed any issues.  All is well, I think!

UPDATE (1/10/2026):  One issue I've noticed - I've lost sound (thru the HDMI connection).  I'm not able to hear system sounds or anything like YouTube audio or music streaming using Audacious.  I've been working on getting it to work but have yet to see success.  With Audacious, I can actually see the music playing, but can't hear it at all (the audio devices are showing as up).
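
For reference, this is the sort of thing I've been checking (wpctl assumes PipeWire, which recent Ubuntu releases use for audio; the sink ID is a placeholder):

aplay -l             # list ALSA playback devices - the HDMI output should show up here
wpctl status         # PipeWire view of sinks and which one is the current default
wpctl set-default 50 # 50 is a placeholder for the HDMI sink's ID from wpctl status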