Friday, March 13, 2026

SSH Local Port Forwarding - Accessing LAN Nextcloud Without VPN Connection

I've been running Nextcloud on my LAN.  It's provisioned to be accessible only from the LAN.

For the first time in a long while, I'm away visiting relatives.  I wanted to access the Nextcloud app, but since it's only accessible from the LAN, I thought I'd not be able to access it from another state without jumping through some hoops.  I was wrong.

Now, I've done this before, but it's been about 15 years and I was initially super rusty:  I wanted to access the Nextcloud console by establishing an SSH tunnel.

I've one machine that has port 22 exposed to the internet (using SSH key authentication - yeah, I'm not totally aloof).  That machine is a Mac Mini, functioning as an SSH jumphost.  The Mac Mini can talk to the machine that hosts a Docker instance of Nextcloud.  The Nextcloud console is mapped to port 1234 and is accessible over HTTP.

How did I establish a connection?

The public address to my LAN is 203.0.113.25.  The Mac Mini's IP is 192.168.1.200.  The Nextcloud IP is 192.168.1.22 (listening on port 1234).  

All of the IP info is fictional for this exercise.  To get this to work with your systems, change the IPs to match your own hosts.

I ran the following:

ssh -L 1234:192.168.1.22:1234 ron@203.0.113.25

The above runs in the foreground (it establishes an active shell connection)

ssh -f -N -L 1234:192.168.1.22:1234 ron@203.0.113.25

The above runs in the background (it prompts for your login creds or key passphrase, then detaches - no shell)
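Since the backgrounded tunnel detaches from your shell, you'll eventually want to find and close it.  A sketch of one way to do that, matching on the exact forwarding arguments used above:

```shell
# Find the backgrounded ssh tunnel by matching its full command line
pgrep -f "ssh -f -N -L 1234:192.168.1.22:1234"

# Kill it by matching the same pattern
pkill -f "ssh -f -N -L 1234:192.168.1.22:1234"
```

Alternatively, `lsof -i :1234` will show whatever process holds the local port, which is handy if you don't remember the exact arguments.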

Then, open a browser and type:  http://localhost:1234

Using the above-mentioned steps, I was able to access the Nextcloud console using a browser client.  Not only that, Nextcloud has a client app.  I pointed that to http://localhost:1234 and it connected!

To better script this process, you can also add the following to your SSH config:

Host home
    HostName YOUR_PUBLIC_IP
    User YOUR_USERNAME
    LocalForward 1234 192.168.1.22:1234

And then run the following command:

ssh home
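The background flags from earlier still compose with the config entry, so a detached tunnel via the config becomes:

```shell
# Background tunnel using the "home" entry from ~/.ssh/config
ssh -f -N home
```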

Needless to say, if you have SSH exposed to the internet, you should use key-based authentication and disable password authentication.  As well, I recommend some type of rate-limiting, as you're going to see a crapload of bots attempting brute-force authentication against your exposed SSH port (I use fail2ban).  Serving SSH on a non-standard port is also an option, as most bots tend to only look for port 22 (note that this is security through obscurity - it doesn't actually make anything more secure).
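For reference, the relevant directives in the jumphost's sshd_config (usually /etc/ssh/sshd_config; restart sshd after editing) look like this - a sketch, not a complete hardening guide:

```
PubkeyAuthentication yes
PasswordAuthentication no
PermitRootLogin no
# Optional: serve SSH on a non-standard port (obscurity, not security)
#Port 2222
```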

This example is for my Nextcloud setup, but the technique works for anything.  For example, I can use it to access the Portainer console on my Docker host.  I can use it to reach other hosts on the LAN besides the Docker host, as well.  Anything goes.

This isn't really anything super revealing...folks have been doing this for years in corporate IT and home.  I just thought it would be cool to share something that could help some folks that have never done such a thing.  Have fun with it!

Wednesday, February 18, 2026

I Probably Shouldn't Have Bought The AirPods Pro Max

I'm starting to hate my AirPods Pro Max headset.

Most of the time, when I need it, I can't get a good connection.  It's buggy as hell.  I'm so tired of having to reset the headset, only for it to still not get a connection.  I shouldn't have to remove the ear cups from the head harness just to clean the connections EVERY TIME I NEED THEM.

And when I do get a connection, it's like it's a half connection.  When this happens, I can't enable ANC or adjust the volume.

And the ear cushions are a pain in the ass, too.  They need to be cleaned.  A LOT.  If you don't clean them, they start smelling funky.  In fact, I bought a new set, as the smell wouldn't come out of the original cushions.

I've also been seeing a lot of moisture inside the ear cups, under the ear cushions.  I'm not sure if that's causing the connectivity issues, but I've removed the cushions and wiped the insides with a Kleenex, and the internals were moist enough that the Kleenex was soaked.

I'm on the edge of deciding to ask for warranty support (I've not owned them a year yet, and I have AppleCare+).

After they fix it, I think I'm going to sell them and get something nice but non-Apple.

It's a pity, because, when they work, they're outstanding.  WHEN THEY WORK.  They're more broken than used.  So many folks complain on subreddits about the exact same issues.

This is disappointing because, prior to this, I'd bought a set of Beats Studio Pros.  They had durability issues - they just started falling apart - but at least they never failed to work.  I still have them, too...last I checked, they were still working.

If you're thinking of buying a Max headset, DON'T DO IT. 

Friday, February 06, 2026

Added Cockpit to Ubuntu 25.10 on the Pi 500+

Today, I wanted to add Cockpit to the Pi 500+ Ubuntu install, but I didn't want to sit at the desk where I'd placed the keyboard.

When I tried to ssh into the Pi, I kept getting connection refusals, which I thought was odd.  I ended up having to spend some time at the Pi keyboard, investigating why I couldn't connect to it on port 22.

I found out that I'd never installed ssh!  I could've sworn I did, but maybe that was on the Pi OS install.

So, after I installed it, I installed Cockpit (I wanted to try it instead of Webmin).  I then found that, after the install, Cockpit was only allowing my user limited administrative access.  It gave the option to gain full admin privileges, but when I clicked it, it gave an error that sudo couldn't be leveraged to escalate privileges.  When I googled that error, I found that one of the suggested fixes was to add your user to the wheel group.  My Pi didn't have a wheel group, so I had to create one, and then add my user to that group.
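The group changes described above amount to the following (run as root; the username is a placeholder):

```shell
# Create the wheel group, since it didn't exist on this install
groupadd wheel

# Add the user to the new group ('ron' is a placeholder username)
usermod -aG wheel ron
```

Note that group membership changes only take effect on the next login.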

I then double-checked my research and found a link to a Cockpit bug report of this exact issue, filed back in October 2025.  The issue is that sudo was recently rewritten in Rust, and the new implementation apparently does not support the --askpass flag, which Cockpit uses.  The fix is to run the following (it switches back to the non-Rust sudo implementation, which is still available):

# update-alternatives --set sudo /usr/bin/sudo.ws

Now, I've Ubuntu all over the house on various machines, most of them running Cockpit.  I've not seen this issue before, and I've done a bunch of recent installs of Ubuntu 25.10.  In fact, I installed Cockpit on my Docker container host today, and it didn't exhibit this bug.  I'm wondering why I'm only now seeing it.  I'm glad there's a workaround, though.
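If you want to check which sudo implementation a given machine is actually using before (or after) applying the fix, update-alternatives can display the link group:

```shell
# Show which sudo alternative is currently selected and what's available
update-alternatives --display sudo
```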


Saturday, January 24, 2026

Google Search Console - Requirements Overkill!

I've always had a not-so-good experience with Google's Search Console.  

I get it - they're trying to ensure that web content is meaningful.  

I get it, but damn, every single page object seems to have arcane criteria that must be met, or Google will not crawl the page.

This is problematic for me because I don't want to become a slave to Google's indexing process - in my opinion, they go overboard with things.

For example, there's a 40% chance that I'll include an embedded video when I submit a new post to my WordPress-powered site.  I've always wondered why my videos aren't being indexed by Google.  I'm now discovering that videos won't be processed if they don't reside on a "watch page".  A blog post that reviews an embedded video will not be indexed because the video is complementary to the rest of the content on the page.  This means I have to create a dedicated video landing page.  WTF.

That's one of many examples.  I've a large batch of pages that aren't being indexed because of the ridiculous criteria that Google requires.

The act of creating a video landing page within WordPress isn't difficult.  Going back through all the prior posts that contain embedded videos so I can add them to a watch page - that's a lot of work.  And then what?  Do I have to make the prior posts with embedded videos link to the newly built landing page's videos?

Bureaucracy overkill, that's what this is.

What I'll do is create the new watch page and add a few videos a day.  Maybe I'll be done in 6 months?  I'm certainly not going to let this overburden me. 

Sunday, January 18, 2026

I Just Set Up a Web Server to Use an SSL Cert, Using Let's Encrypt!

Yesterday, I was bored and had been contemplating setting up one of my public web sites to use an SSL certificate.

While most business websites use SSL certificates, they aren't really mandatory for simply serving web content for reading purposes.  I've been using Apache to serve web pages a LONG time and never felt the need to enable HTTPS, as it wasn't required.  That changed when I decided I wanted my website to be more noticeable within search engine results.  HTTPS is a ranking signal, so sites that serve content over HTTPS tend to place higher in search results.

As I host my own server, my options were to set up my own SSL certificate or to buy an SSL certificate for use with my server.  I decided to set up and deploy my own.

I used this link's instructions (I used Certbot, which uses Let's Encrypt, which I'll reference as LE) to set everything up.  Keep in mind that I'm using Ubuntu 25.10, hosted on a Linode server.
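On Ubuntu with Apache, the Certbot flow boils down to roughly this (the domain is a placeholder; the snap install is what Certbot's own docs currently recommend):

```shell
# Install Certbot via snap and link it onto the PATH
sudo snap install --classic certbot
sudo ln -s /snap/bin/certbot /usr/bin/certbot

# Obtain a cert and let Certbot configure Apache for it
sudo certbot --apache -d example.com
```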

After I built the certificates, I had a difficult time determining how to leverage them.  I initially tried using a WordPress plugin to import the certificate, but I tried like 5 different plugins and none of them worked.  I then pivoted and tried a different method - I'm running Apache to serve WordPress, so I set up the Apache config file to use HTTPS and pointed Apache to the LE certs.  I then used an SSL checker to verify that everything was working.  It was.
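The Apache side amounts to a VirtualHost pointing at the LE cert paths - a sketch with placeholder names (Certbot typically writes certs under /etc/letsencrypt/live/<domain>/, and mod_ssl must be enabled, e.g. via a2enmod ssl):

```apache
<VirtualHost *:443>
    ServerName example.com
    DocumentRoot /var/www/wordpress

    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/example.com/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/example.com/privkey.pem
</VirtualHost>
```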

Afterward, I set up a cron job to renew the certs automatically.
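The renewal cron entry can be as simple as the line below.  certbot renew only renews certs that are nearing expiry, so running it daily is safe (note that the snap install also sets up its own renewal timer, so a cron job may be redundant there):

```
# m h dom mon dow  command - attempt renewal daily at 03:00
0 3 * * * certbot renew --quiet
```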

Now, when I checked the browser for indications that the website was using SSL, there was no lock icon that I could see.  I researched and saw that I also had to ensure the website's prior content wasn't using HTTP links to internal server content, so I used some WordPress tools to find HTTP links pointing to my web server and change them to HTTPS.  I also saw that a lot of my plugins and themes use HTTP links, and that's supposed to be a no-no for HTTPS compliance - I can't control how plugin providers construct their plugins, so I'm not sure what to do about that.
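I used WordPress tools for the rewrite; if you have WP-CLI available instead, the equivalent bulk HTTP-to-HTTPS rewrite looks like this (the domain is a placeholder; --dry-run previews the changes before touching the database):

```shell
# Preview which database rows would change
wp search-replace 'http://example.com' 'https://example.com' --dry-run

# Apply the replacement across the WordPress database
wp search-replace 'http://example.com' 'https://example.com'
```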

I think I'm going to enable SSL with my other domain, as well (unixfool.us).

Eventually, I plan to replace my WordPress website with a Docker instance.  I'd need to research how to use SSL certs within a Docker Compose YAML file.  I'm thinking it should be pretty straightforward.  The only thing I can think of that might be an issue is the automatic renewal bit (the cron job that renews the certificate).

UPDATE (1/20/2026):  I just checked again and I can now see that the web page (https://wigglit.com) is showing as secure!  

Friday, January 02, 2026

Raspberry Pi OS, Begone!

Last night, I tried to use a docker container on the new Pi system that I've been able to use on other systems without issue.

This experience was pretty much a nightmare.

I was able to install Docker without issue and the 'hello world' container worked fine.

When I tried to run a WordPress container, there were cascading issues.  Granted, I know the Pi runs on the ARM architecture, so I did have to make adjustments for that, which wasn't all that difficult.

The main issue I had related to the Pi OS misconfiguring things.  There were things being blocked by the OS due to bad routing.

While I was able to get the WordPress container to run, I couldn't connect to it initially.  In fact, I couldn't reach the internet using curl or any browser client.  Apparently, curl behaves oddly on Pi OS when port 80 is taken, and I'd tried to use port 80 as the WordPress service port.  Since nothing else was configured to use port 80, I initially felt it was safe to use it for the WordPress container.  NOPE!  When I did, it broke some things relating to curl and routing.  After ChatGPT informed me that it's best not to use port 80 for the WordPress container, I changed the port to 8888, with no success.  It ended up taking me something like 6 hours to determine the issue.  ChatGPT kept repeating repair steps that weren't working, until I forced it to look for other issues.

At 4 AM this morning, I was finally able to reach the container using curl, Chromium, and Firefox, but was still experiencing connection drops when trying to use DuckDuckGo.  I also noticed several other connection drops (some WordPress plugins require backend callbacks to 'home' using curl - those started breaking again).

The fix was to remove some default routes that were associated with the containers.  I also had to remove some rules from iptables, remove some IP links, and add additional config context to the wp-config.php and compose.yml files.
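For reference, that cleanup was along these lines - a heavily hedged sketch, since interface names, gateway addresses, and chain names will differ per system (docker0, the route shown, and the DOCKER-USER chain are placeholders, not what my system necessarily used):

```shell
# Remove a stale default route tied to a container bridge (placeholder values)
sudo ip route del default via 172.17.0.1 dev docker0

# Inspect a chain with line numbers, then delete the offending rule by number
sudo iptables -L DOCKER-USER -n --line-numbers
sudo iptables -D DOCKER-USER 1

# Remove a leftover virtual link
sudo ip link delete docker0
```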

Later in the day, I checked the container again and noticed that the problem routes that I'd removed had been re-added by Pi OS, reverting my work.

I got fed up and decided to start from scratch with another OS.  

I chose Ubuntu, since I'm already familiar with it.  The only wildcard is that this Pi is still ARM-based, so I might run into things that are currently unknown to me...I'll just have to be prepared for any architecture-related issues that may occur, but I trust Ubuntu more than Pi OS at this point.

Ubuntu 25.10 is now installed on the system's internal SSD.  I used the Pi boot options to reinstall the OS...that's a cool option, but I wish it would also offer wireless connections instead of just ethernet, as I had to jump through hoops to get ethernet to where the Pi is currently located.

As well, I wasn't prepared for the new OS install to take 45 minutes.

As frustrated as I was, it's all a learning experience for me.  As well, there's less frustration in reinstalling when I'm using a Pi.

I'd post some of my ChatGPT session, but it was messy and an hours-long chat.

I'll keep you all updated on my progress with Ubuntu on the Pi.

UPDATE (1/2/2026):  Yeah, I already deployed a Docker instance of WordPress in Ubuntu on the Pi.  I had none of the issues I had last night deploying the same .yml file on Pi OS, aside from another adjustment to ensure the images being pulled supported ARM.  The two install experiences were very different.

UPDATE (1/3/2026):  I've still not noticed any issues.  All is well, I think!

UPDATE (1/10/2026):  One issue I've noticed - I've lost sound (through the HDMI connection).  I'm not able to hear system sounds or anything like YouTube audio or music streaming using Audacious.  I've been working on getting it to work but have yet to see success.  With Audacious, I can actually see the music playing but can't hear it at all (the audio devices show as up).  UPDATE to this update (3/11/2026):  The sound issue is related to the monitors; I've been trying different monitors, and some have speakers while others do not.

Thursday, December 25, 2025

Christmas Day 2025 - Received Raspberry Pi 500+ As A Gift!

Good day, all!

My family opened Christmas presents last night at 12 AM (most of us didn't want to wake up early to open presents).

I received a Raspberry Pi 500+ for Christmas - it was a gift from my daughter.



What spurred this interest?  I mean, I was never really curious about Raspberry Pi devices, as I've always had very robust computer systems in my household.  Only now am I hating all the systems sprawled about my basement.

My son received a Raspberry Pi 5 from my daughter in November and I got to see it (I'd never held or seen one prior to that).  He was able to configure it as a media server pretty much immediately after he got it.  I loved the small form factor, which spurred my interest.

Once I saw his, I went to the Raspberry Pi site to see what they had.  I was immediately curious about the 500+ and had planned to get it on my own, but as my family uses Elfster.com to gift each other, I added it to my wish list.

The keyboard is NICE!  I love how it soft-clicks (it's a mechanical keyboard)...I've a Royal Kludge S98 and that thing is noisy AF compared to this.  I also love the 500+'s preconfigured RGB lighting profiles.

This Pi seems powerful enough that I'm considering using it as my main Docker host.  My current Docker host is an Alienware M17X R3, which I don't think the 500+ can match across the performance spectrum, but it's a great second choice.  The thing about the Alienware is that it's a laptop with a functional battery, so if power hiccups or I lose power, I can gracefully shut that system down.  Plus, it's quite antiquated as a gaming system, so hosting Docker containers is a good use for it.

Where does this leave me with the 500+?  It means that I can shut down one of my older and less capable systems, which will declutter my office/lab.  

I can actually envision buying several of these to replace old systems.

Everything resides within the keyboard, which is why it's thick.  The SoC itself is small AF, though, so I'm not sure why the keyboard needs to be this thick.  The system uses a heatsink to dissipate heat - there are no fans, so it stays quiet.

As this system comes with a 256 GB SSD, I did not have to muck with microSD cards, although I've the option if I feel the need.  The SSD comes preconfigured with Raspberry Pi OS.  It can also be replaced with something bigger - replacements would need to use the M.2 NVMe format.

The system is Bluetooth- and Wi-Fi-capable.  It has two micro-HDMI ports and three USB-A ports (two USB 3.0 and one USB 2.0).  It also has an ethernet port and 16 GB of memory.

I'll be sure to share my Pi journey here.