Connecting a Psion Series 3a to my Mac
I follow James Weiner on Mastodon (@hypertalking@bitbang.social) because of his beautiful one-pixel art and his work on restoring old computers. Last week he boosted a post that mentioned Psion PDAs, which led me down a rabbit hole that ended at the Psion User Group, maintained by Alex Brown (@thelastpsion@bitbang.social). I remembered I still had a Psion Series 3a myself, stowed away in a closet together with a Palm V and a Tungsten T3.
That journey got me wondering whether it would be possible to use the Psion, which still looks like a great device, in a current setup and have it sync data with my Mac Studio. This sent me down another journey searching for accurate documentation, which I found lacking. Most of it was written for Windows machines (32-bit, and not working on most 64-bit machines); documentation for Apple, which was a somewhat obscure platform back then, was even scarcer. What little existed for Mac OS X targeted Intel machines and definitely not Apple Silicon.
Today I got the connectivity working with the excellent help of Alex Brown and Chris Farrow on the Psion User Discord server. I thought I’d write down the steps I took as a reference for others who might want to follow me down this rabbit hole as well.
The first problem is the physical connection: it is based on a D-sub 9 connector, more commonly known as a serial port. Your Psion Series 3a came with a special cable called the 3-Link, which was its serial connection interface.
The problem is that current computer hardware no longer has the serial (or even parallel) connectors that were once ubiquitous. To resolve this I bought a USB-to-serial interface via Amazon. You can use other adapters, but they must have the PL2303 chip, which takes care of the proper communication.
Next you’ll have to install the driver, which to my surprise is available in the App Store as PL2303 Serial. The next step is optional, but I found it very useful to test whether there is any connectivity between the Mac and the Psion and whether the cable is doing its thing. You’ll use a terminal emulator to connect both devices. I used SerialTools on my Mac because it’s free and available in the App Store. On the Psion you’ll need the Comms tool; if it is not installed, press Psion-I, select the C: drive, and install Comms.app (before you do, make sure you have disabled the 3-Link).
Open the Comms tool on your Psion. On the Mac open SerialTools, select the PL2303 port, set the baud rate to 9600 and press connect. What you type on the Mac should appear on the Psion screen and vice versa. This proves a proper connection between your Psion Series 3a and your Mac.
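If you’d rather stay in the Terminal than install SerialTools, the built-in screen command can do the same sanity check. A minimal sketch, using the device name from my setup (yours may be called something else):
ls /dev/tty.*
# the adapter should show up as something like /dev/tty.PL2303G-USBtoUART8340
screen /dev/tty.PL2303G-USBtoUART8340 9600
# type a few characters, then press Ctrl-A followed by K to end the session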
There are several options for the next phase: installing plptools, running a Windows VM to use the PsiWin program, or using DOSBox Staging to run MCLINK.
I have chosen to start with the DOSBox option as it was the simplest way to get going. I dabbled with the plptools option but haven’t got it working yet, so here are my instructions for the DOSBox route.
First you need to download the MCLINK program from here and unzip it into a separate directory, which you will reference later.
Download the latest version of DOSBox Staging from their download page and install it. Once installed, start the application and at the new Z:\> prompt type the command config -wc to create a new config file called dosbox-staging.conf in ~/Library/Preferences/DOSBox. You’ll need to edit this file to point it at your new serial connection. Mine was located at /dev/tty.PL2303G-USBtoUART8340; check in the Terminal whether yours is called the same. Find the line that starts with serial1 and make it look like: serial1 = direct realport:tty.PL2303G-USBtoUART8340
At the end of the file you can add commands that are executed during startup. The command I added mounts the directory where I extracted my copy of MCLINK: mount c /directory/location/of/mclink
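Pulled together, the relevant pieces of my dosbox-staging.conf end up looking roughly like this (section names as in the generated file; the device name and the MCLINK path are specific to my setup):
[serial]
serial1 = direct realport:tty.PL2303G-USBtoUART8340

[autoexec]
# mount the directory that holds MCLINK as drive C:
mount c /directory/location/of/mclink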
To use the new configuration you’ll need to restart DOSBox. After it has restarted you can issue the command c:\mclink
When the MCLINK program has started you should see its startup screen.
Next, connect your Psion Series 3a to your new USB-to-serial adapter via the 3-Link cable. Start the 3-Link software by pressing Psion-L (the key combination of the bottom-left Psion key and L together) and turn it on at 19200 baud.
If everything is correct you’ll see MCLINK connecting, and three status lines should appear at the top of the window.
As a first test I asked for the directory contents of the Psion’s RAM disk using the command: dir rem::m:\*
For more commands on how to exchange information, see the MCLINK.DOC file, which explains all of them. Have fun!
Dump and back up a database on shutdown
I’m using Multipass as the virtualisation tool for quickly setting up virtual development machines on my Mac Studio, with cloud-init for configuring everything. This works really well and has saved me several times when stuff crashed and burned: it was easy to just tear everything down and re-run the setup scripts. (You can read more on my setup in the repository I use for this.) This works fine as long as my development work is stored in Git and the data lives in a shared MySQL virtual server, but as I recently found out this is not always the case. Sometimes there is local data on the virtual server that you would like to keep.
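For context, spinning up one of these machines is a single command along these lines; the machine name and cloud-init file are just placeholders for whatever your own setup uses:
multipass launch --name devbox --cloud-init dev-config.yaml
# once it is up, "multipass shell devbox" drops you into the freshly configured VM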
The solution I came up with to prevent losing that data is to trigger a script at shutdown of the server that copies the relevant data to a safe location, in my case an S3 bucket. It took some digging, searching and testing, but I got it working. So if you are looking for something similar, here is how I did it:
We use a systemd service that runs at the start of the shutdown process, so that the other services we rely on are still running. I’ve named mine S3shutdown.service, which is also the name of the file you need to create in /etc/systemd/system/ with the following content:
[Unit]
Description=Save database to S3
Before=shutdown.target reboot.target halt.target

[Service]
Type=oneshot
RemainAfterExit=true
ExecStop=/home/ubuntu/projects/dumpandstore.sh

[Install]
WantedBy=multi-user.target
The first line is a descriptive title, which you will see in syslog when the service is executed. The last line ties the service to multi-user mode, so the stop action runs before multi-user mode ends. With ExecStop you reference the shell script that should be run at the moment the server goes down.
My dumpandstore.sh script looks like:
#!/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin

# Dump the database to a fixed filename; this is the "latest" copy used when rebuilding the server
/usr/bin/mysqldump -uuser -ppassword databasename > /home/ubuntu/projects/databasedump.sql

# Make a dated copy for the historical archive and compress it
today=$(date +%Y%m%d)
cp /home/ubuntu/projects/databasedump.sql /home/ubuntu/projects/databasedump$today.sql
/usr/bin/gzip /home/ubuntu/projects/databasedump$today.sql

# Upload both the dated archive and the latest dump to S3
/usr/local/bin/aws s3 cp /home/ubuntu/projects/databasedump$today.sql.gz s3://mybucketname/
/usr/local/bin/aws s3 cp /home/ubuntu/projects/databasedump.sql s3://mybucketname/
I use the dump with a date in the filename to build up some historical perspective; the file without a date is, so to speak, the latest copy and is also referenced in the build script of the server, so that when I rebuild the server the database is filled with the last used dataset.
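Don’t forget to make the script executable, otherwise systemd will refuse to run it. It is also worth a manual test run before relying on it at shutdown:
chmod +x /home/ubuntu/projects/dumpandstore.sh
sudo /home/ubuntu/projects/dumpandstore.sh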
To activate the service you’ll need to run the command: sudo systemctl enable S3shutdown.service
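You can later verify that the stop action really ran by looking at the journal of the previous boot, assuming the journal is persistent (i.e. /var/log/journal exists):
journalctl -b -1 -u S3shutdown.service
# -b -1 selects the previous boot, which is where the shutdown-time run ends up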
Reboot the machine and everything should be working as intended. One problem I struggled with was the AWS configuration. I had set up the AWS configuration, including credentials, as a normal user, but the shutdown service runs as root and therefore the aws command could not locate the proper credentials. This was quickly solved by copying the ~/.aws directory to /root. Not ideal, but it made it work for the moment; I need to do more research for a more elegant and safer solution.
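One direction I still want to explore, untested and only a sketch: instead of copying the credentials, point the service at the existing ones with Environment= lines in the unit file, using the AWS CLI’s AWS_CONFIG_FILE and AWS_SHARED_CREDENTIALS_FILE variables:
[Service]
Type=oneshot
RemainAfterExit=true
# untested idea: let the root-run script pick up the ubuntu user's AWS credentials
Environment=AWS_SHARED_CREDENTIALS_FILE=/home/ubuntu/.aws/credentials
Environment=AWS_CONFIG_FILE=/home/ubuntu/.aws/config
ExecStop=/home/ubuntu/projects/dumpandstore.sh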
The command line is the future of interaction
Yesterday I read this blog post from Lukas Mathis. It kept resonating with me as I recognised some of the scenarios and could even come up with some more. As an avid Apple user who is forced to work on a Windows-based platform, I regularly need to google for instructions on how to perform certain tasks. A special mention goes to the Microsoft Office suite, where the menu structure is unintuitive and the toolbar inconsistent, so I end up using the Help function a lot to find the right option.
This could mean a big change in UI design, moving away from WIMP and touch interfaces, and could be the precursor to voice interaction with the desktop. This is already possible, especially on a Mac with the Accessibility options enabled. But for that to happen in a common setting we first need to enable subvocal interaction: it is socially awkward to talk to your mobile or computer in a public space, and can you imagine an office with everybody talking to their computer? Very noisy. I recently saw this post, which suggests it won’t take long before we see a practical solution to this.
Looking forward to this…
Using a central virtual MySQL server
For all my projects I’ve been using dedicated virtual machines, which I manage and configure using Vagrant. This makes it easy to maintain a dedicated environment without conflicting settings or libraries, one that can be recreated on the fly. Every project gets its own virtual machine with all the components it needs installed. But with at least 5 or 6 virtual machines running on my personal iMac (an older model from 2013) it was getting a bit busy. One common component installed on all my machines was MySQL, which is still my go-to database for simple projects. So I’ve been toying with the idea of creating a single virtual machine that only runs MySQL for all my projects. I could even host this virtual database server on an even older Mac mini (from 2010) which I still keep around. It used to be my generic media machine until an Apple TV took over its role.
At first everything looked great: it all went well when running on the same host (the iMac). But when I hosted the virtual database server on the Mac mini things started to go wrong and I couldn’t make a connection to the database. Locally everything worked; going over the network was the problem. Several things to check. Was my virtual machine accepting remote connections? Yes, I had enabled the option: config.vm.network "public_network"
Next was connectivity to MySQL itself. I learned that the skip-networking option, which was usually used to shield your database from the outside world, has been deprecated. Instead, network connectivity is linked to the network interface of your (virtual) server via the bind-address setting. There are three options:
- Only access from the local host
- Access from all networks
- Access only from one network
Only access from the local host
Here, the bind-address takes a value of 127.0.0.1, the loopback IP address. MySQL can only be accessed by applications running on the same host.
Access from all networks
To have MySQL listen on all networks, set bind-address to 0.0.0.0. With this setting MySQL accepts connections from any network. To permit both IPv4 and IPv6 connections on all server interfaces, you can use :: as the value instead.
Access only from one network
MySQL listens only on a specific network interface. The value in this case is the IP address of that interface, for instance: 192.168.1.1
So I adjusted the settings for mysqld in /etc/mysql/mysql.conf.d/mysqld.cnf, changed bind-address=127.0.0.1 into bind-address=0.0.0.0, restarted mysqld, and everything connected and started working properly!
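For reference, this is roughly what the relevant part of /etc/mysql/mysql.conf.d/mysqld.cnf looks like after the change (the file path and service name are as on my Ubuntu-based box and may differ elsewhere):
[mysqld]
# listen on all interfaces instead of only loopback
bind-address = 0.0.0.0
followed by a restart of the service with sudo systemctl restart mysql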
The next step is migrating all active projects to the central virtual MySQL server and seeing if there are any performance benefits.
Sidenote: I’ve learned that to make sure you can rebuild your database server on the fly, you have to make a backup of your data before you halt or destroy the virtual server. I’ve done this via a trigger configuration in my Vagrantfile that dumps the database to a shared folder. Just add the following lines to your Vagrantfile:
config.trigger.before [:halt, :destroy] do |trigger|
  trigger.warn = "Dumping database to /vagrant/Code/dbserveroutfile.sql"
  trigger.run_remote = {inline: "mysqldump -u<username> -p<password> --all-databases --single-transaction --events > /vagrant/Code/dbserveroutfile.sql"}
end
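With this in place a plain vagrant halt or vagrant destroy first shows the warning and then writes the dump to the shared folder, so the data outlives the machine it came from.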
Celebrating Steve
It’s been ten years, and I still can’t help but wonder what Steve would do. Even with my own work I consider what he would think of it. Is it simple enough?
The DRI of your career
I found this a good read; it puts the focus on your career instead of your current job (DRI meaning Directly Responsible Individual). It also talks about the all-important work/life balance and the focus on growth. I like the idea that your current job is not the end goal but a stepping stone in your career. Look at what value your current job brings to you and your career; if the balance ends up negative, find the next job.
At DuckDuckGo, there’s an expression: “You are the DRI of your career” (DRI: Directly Responsible Individual). I like this, both as an individual who has always felt like the DRI of my own career, and I like it as a manager because I think it makes the boundaries of what you can and can’t do for people clear.
It reminded me of the time I was a manager and scared one of my team members by suggesting that if he wasn’t happy I would happily help him to find an opportunity elsewhere.
No more iPhone mini?
My first iPhone was a 4s and I have loved them ever since. I got a 6 and then an 8, but the next upgrade was difficult. I loved the form factor and size of the 4, 5 and 6; the 8 had become slightly larger but was thinner. Newer iPhones only seemed to get bigger and bigger, which was hard to accept. Luckily the SE stayed small, so if my iPhone should fail I could get that one.
I was over the moon when Apple released the iPhone 12 mini: it was powerful and, above all, it came in my favorite form factor, small! Due to the pandemic it was hard to get my hands on one, and I like to hold and touch phones (and other stuff) before I buy; the look and feel of things is important to me. So when I got the chance to hold an iPhone 12 mini and admire the new blue color, I was sold and bought it directly. It had been a while since I was this happy with a new iPhone; compared to the 8 it was a screamer on speed and the photos were a lot better.
Then news items started appearing saying that the mini wasn’t really selling. I didn’t and still don’t understand it, because I hear a lot of people looking for a smaller iPhone. Now this rumor has appeared in The Register that Apple might discontinue the mini after the 13. I really hope not… otherwise my 12 will have to last for quite some time before I can buy a new iPhone.
20 years of Mac OSX
Twenty years ago the first official release of Mac OSX became available: News article. I still remember my own switch to OSX. Until then I was still using Windows for day-to-day computing and Linux and Solaris for tinkering. OSX brought the best of those two worlds together for me: easy day-to-day computing with usable software in a nice GUI, and all the Unix I could handle. It became just a matter of time before I would switch.
My first encounter with Apple was at work with a classic, the original Macintosh. It was the computer everyone used to make drawings that could be imported into WordPerfect on an MS-DOS PC (this was the pre-Windows era). My parents have always used a Mac: from the first Performa, to the colorful translucent iMac, to every iteration of iMac design that followed, which now seems to end for them in an iPad.
My first Mac was the first Mac mini released with the G4 PowerPC, with a second one following quickly after that. The first Intel version I got was a year later. In that period I was very busy getting my mail server configuration working (see switch.richard5.net and diymacserver.com for more information).