Monday, April 27, 2026

Weather rock uncovering

 After putting off finishing the rest of the environmental (weather) readings from the nearest airport into a Zabbix suite long enough, I standardized on the newest release, 7.4, running on FreeBSD.


You can't use the Zabbix sender tool for random text without it going somewhere structured, so I first set up the variables, named under Ambient and tagged with Environment. A typical METAR weather file pull looks like:


Baltimore / Martin, MD, United States (KMTN) 39-20N 076-25W
Jun 11, 2022 - 06:54 PM EDT / 2022.06.11 2254 UTC
Wind: Calm:0
Visibility: 10 mile(s):0
Sky conditions: partly cloudy
Temperature: 69 F (21 C)
Dew Point: 59 F (15 C)
Relative Humidity: 68%
Pressure (altimeter): 29.96 in. Hg (1014 hPa)
ob: KMTN 112254Z 00000KT 10SM SCT180 21/15 A2996
cycle: 23

In hindsight, I erred in not adding a separator to the pressure column names. Visibility is reported only in miles; a kilometers value could be a derived field in Zabbix.
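Since visibility arrives only in statute miles, a kilometers value could also be derived at send time. A minimal sketch, with `miles_to_km` as a hypothetical helper name:

```shell
# Hypothetical helper: statute miles to kilometers, one decimal place,
# so a Visibility.Km item could be sent alongside the miles value.
miles_to_km() {
    awk -v mi="$1" 'BEGIN { printf "%.1f", mi * 1.609344 }'
}

miles_to_km 10   # -> 16.1
```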

The command line I used, ish:

export ZABBIX_SEND="zabbix_sender -vv -z "${ZABBIX_SERV}" -p 10051 -s "${ZABBIX_HOST}" -k "

Each key comes from a row and column in the weather dump, neatly prefixed line by line, with at most two or three datapoints per row. I have the wind rose in text style, but not "sky conditions":

Wind: from the S (190 degrees) at 7 MPH (6 KT):0



The early-2026 cold snap is quite evident over a span of days, looking back at a 7.0-level Zabbix server.

The shell script has a bunch of grep/sed/awk where Perl or the like would be slicker; it just grew piece by piece. We want it to emit, for instance:

  zabbix_sender -vv -z 127.0.0.1 -p 10051 -s place -k  "enviro[Temperature.Celsius]" -o  8



# echo C
grep  "^Temperature: "           $DATAFILE  | sed -e "s/(//" -e "s/)//"   | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[Temperature.Celsius\]\"    -o ", $4}'
# echo F
grep  "^Temperature: "           $DATAFILE  | sed -e "s/(//" -e "s/)//"   | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[Temperature.Fahrenheit\]\" -o ", $2}'

# echo dew
grep  "^Dew Point: "             $DATAFILE  | sed -e "s/(//" -e "s/)//"   | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[DewPoint.Fahrenheit\]\"    -o " $3}'
grep  "^Dew Point: "             $DATAFILE  | sed -e "s/(//" -e "s/)//"   | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[DewPoint.Celsius\]\"    -o " $5}'

# echo hum
grep  "^Relative Humidity: "     $DATAFILE  | sed -e "s/%//"              | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[Humidity\]\"               -o ", $3}'
grep  "^Pressure (altimeter): "  $DATAFILE  | sed -e "s/(//" -e "s/)//"   | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[PressureHG\]\"             -o ", $3}'
grep  "^Pressure (altimeter): "  $DATAFILE  | sed -e "s/(//g" -e "s/)//g" | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[PressurePA\]\"  -o ", $6}'

grep  "^Visibility: "            $DATAFILE  | sed -e "s/(//g" -e "s/)//g" | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[Visibility\]\"            -o ", $2}'
grep  "^Wind: " $DATAFILE  | sed -e "s/(//g" -e "s/)//g" | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[Wind.Rose\]\"        -o ", $4}'
grep  "^Wind: " $DATAFILE  | sed -e "s/(//g" -e "s/)//g" | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[Wind.Direction\]\"   -o ", $5}'
grep  "^Wind: " $DATAFILE  | sed -e "s/(//g" -e "s/)//g" | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[Wind.Speed\]\"    -o ", $8}'

# eof
grep  "^ob" $DATAFILE  | sed -e "s/(//g" -e "s/)//g" | awk -v zs="$ZABBIX_SEND" '{print zs, "\"enviro\[Metar\]\" -o \"" substr($0,5,99) "\""  }'
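Since each pipeline prints a complete zabbix_sender command, an alternative is the sender's batch mode: write `host key value` lines to one file and send them with a single `--input-file` (`-i`) call, one connection instead of a dozen. A sketch, with a fabricated one-line data file and the host name "place" standing in for the real METAR dump and host:

```shell
# Build a "host key value" batch file from the humidity line and show it;
# the actual zabbix_sender call is left commented out here.
DATAFILE=$(mktemp)
echo "Relative Humidity: 68%" > "$DATAFILE"
BATCH=$(mktemp)

grep "^Relative Humidity: " "$DATAFILE" | sed -e "s/%//" |
    awk -v h="place" '{print h, "enviro[Humidity]", $3}' > "$BATCH"

cat "$BATCH"   # -> place enviro[Humidity] 68
# zabbix_sender -vv -z "$ZABBIX_SERV" -p 10051 -i "$BATCH"
```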

 
After collecting wind direction for a few hours, the results look as expected.

Timestamp                 Value
2026-04-27 05:21:01 PM    S
2026-04-27 04:21:01 PM    S
2026-04-27 03:21:01 PM    S
2026-04-27 02:21:01 PM    S
2026-04-27 01:21:01 PM    S
2026-04-27 12:21:01 PM    S
2026-04-27 11:21:01 AM    S
2026-04-27 03:21:01 AM    NNW
2026-04-26 10:21:01 PM    SE
2026-04-26 07:21:01 PM    ESE
2026-04-26 06:21:01 PM    ESE

Saturday, April 4, 2026

Key flipping from sheet to form

 After I wrangled two spreadsheets (and one Google sheet) into database tables, my first picks for primary keys needed to change, based on how the data actually were and how I wanted things to be.

Whoever did the manual entry into rows and columns used month and year as the key fields, since the events were generally produced every month (for literal decades), so it made sense that the first attempt used those as keys. I don't like using a serial-number key for tasks like this, as part of the outcome should be normalizing the data; an error message about duplicate months in a year is something that needs to be investigated.

Among other things, I found complete cast/crew duplicates for one show that ran for two months (others had only ditto marks, making them easier to spot and tidy up). I needed to switch from a monthly key to a sequence value because, after 2000, shows were scheduled for multi-week runs that didn't stay within a month. And there were weekend shows, plus multi-play productions sold under a single ticket.

 References at the end: one LibreOffice and two DBA Stack Exchange.

I started with this key: 

   primary key (year_str, month_str, production, director, person, role)

 The year and month are strings instead of integers because there were originally a variety of formats, which I fixed after the initial load.

The proposed change:

ALTER TABLE cast_cooked DROP CONSTRAINT cast_cooked_pkey;
ALTER TABLE cast_cooked ADD PRIMARY KEY (season, sequence, production, person, role, band_cast_crew);

 

Using the referenced examples, I deleted duplicate rows as I found them, mainly trial-and-error: the first statement above always works, while the second succeeds only once all the dupes are cleared.

I was overzealous in thinking I should include what turned out to be a non-distinct column:

psql:alter_key.sql:3: ERROR:  constraint "cast_cooked_pkey" of relation "cast_cooked" does not exist

psql:alter_key.sql:4: ERROR:  column "sub_seq" of relation "cast_cooked" contains null values

 

Primary key columns must be NOT NULL, so those nulls had to go. Turns out this key also messed with LibreOffice forms: mismatched tables showed no rows in one (scary!). The column had to be removed from the primary key, or filled somehow; I chose the former as faster.

 

ALTER TABLE
psql:alter_key.sql:4: ERROR:  could not create unique index "cast_cooked_pkey"
DETAIL:  Key (season, sequence, sub_seq, person, role)=(4, 11, C, First Last, Director) is duplicated.


These were usually easy, except for the 44 rows of cast+crew dupes, and then one complete duplicate (don't ask).

db=> delete  from cast_cooked where season = 12 and sequence = 7 and month_str = '5' ;

DELETE 44

 
A simple statement, but it should be sanity-claused: check what it will hit before running it.
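A minimal sanity clause, assuming the same table and predicate: count what the DELETE would remove, and only then run it.

```sql
-- Expect exactly the duplicate count (44 here) before deleting.
SELECT count(*)
  FROM cast_cooked
 WHERE season = 12 AND sequence = 7 AND month_str = '5';
```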

Drop the table constraint and retry:

psql:alter_key.sql:3: ERROR:  constraint "cast_cooked_pkey" of relation "cast_cooked" does not exist

psql:alter_key.sql:4: ERROR:  could not create unique index "cast_cooked_pkey"

DETAIL:  Key (season, sequence, production, person, role, band_cast_crew)=(26, 3, Scrooge (shocking), First Last, Director, crew) is duplicated.


 db=>  select * from cast_cooked where season = 26 and sequence = 3 and role = 'Director' order by month_str, person;
  year_str | month_str |     production     |  director  |   person   |   role   | band_cast_crew | season | sequence | opening | closing | sub_seq
 ----------+-----------+--------------------+------------+------------+----------+----------------+--------+----------+---------+---------+---------
  87       | 12        | Scrooge (shocking) | First Last | First Last | Director | crew           |     26 |        3 |         |         |
  87       | 12        | Scrooge (shocking) | Firss Last | First Last | Director | crew           |     26 |        3 |         |         |

 There was a typo in the director column (names redacted aside); the rows still counted as duplicates because director was not a key value.

To remove one of two otherwise identical rows you need a "row ID"; PostgreSQL's equivalent is the system column ctid:

db => select ctid from cast_cooked where season = 26 and sequence = 3 and role = 'Director' order by month_str, person;
  ctid
--------
 (5,14)
 (5,30)
(2 rows)

db=> delete  from cast_cooked where season = 26 and sequence = 3 and role = 'Director' and ctid = '(5,30)';
DELETE 1
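The ctid trick generalizes. Rather than hunting duplicates pair by pair, a self-join can keep the lowest ctid in each would-be key group and delete the rest. A sketch against the post's table, with the column list assumed from the intended primary key:

```sql
-- Hypothetical one-shot dedupe: drop every row that has a twin with a
-- smaller ctid under the intended primary key columns.
DELETE FROM cast_cooked a
 USING cast_cooked b
 WHERE a.ctid > b.ctid
   AND a.season         = b.season
   AND a.sequence       = b.sequence
   AND a.production     = b.production
   AND a.person         = b.person
   AND a.role           = b.role
   AND a.band_cast_crew = b.band_cast_crew;
```

Worth wrapping in a transaction, with a trial SELECT count first.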
 

Then the anticlimactic "done" success message (along with the "you already did that" warning).

 db=> \i alter_key.sql
psql:alter_key.sql:3: ERROR:  constraint "cast_cooked_pkey" of relation "cast_cooked" does not exist
ALTER TABLE

db=>

 

And now, better queries!


 

Two-step also:


 

~ ~ ~

References:

https://books.libreoffice.org/en/BG72/BG7204-Forms.html#toc41
https://dba.stackexchange.com/questions/138320/duplicate-rows-how-to-remove-one
https://dba.stackexchange.com/questions/159793/how-to-set-a-not-null-column-into-null-in-postgresql

 

Monday, February 2, 2026

Shrinking and wrapping the copyrights to own imagery

The new web site is "responsive," or so they say. Large images are sliced and diced into smaller versions for display in a small device window, saving load time. The smarty-pants part is the site figuring out your screen size across the board.

Checking what was viewable and what the image tags looked like, I grabbed a bunch with wget, with a tip of the sombrero to:

https://stackoverflow.com/questions/4602153/how-do-i-use-wget-to-download-all-images-into-a-single-folder-from-a-url
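The grab itself, roughly per that answer; echoed here rather than run, and example.org stands in for the real site:

```shell
# Sketch: recursive fetch of just the listed image types, flattened into
# one folder (-nd), depth-limited. Drop the leading echo to actually run it.
echo wget -nd -r -l 2 -P ./images \
     -A "ico,svg,jpeg,jpg,bmp,gif,png,heic" \
     https://example.org/
```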

$ ls -l *rocky*
spot lighters   77360 Jun 20  2025 6855cb[*]_rocky horror_9235210_orig-p-500.jpg
spot lighters  211763 Jun 20  2025 6855cb[*]_rocky horror_9235210_orig.jpg

I used the wget -A accept list "ico, SVG, svg, jpeg, jpg, bmp, gif, png, heic" but no SVG came through. And PDF failed in a robotic way.

The p-500 means the file got downsized by the automat to fit a 500-pixel-wide window.

$ exiftool *rocky* | egrep "^File Size|^Image Size"
File Size                       : 76 KiB
Image Size                      : 500x752

File Size                       : 207 KiB
Image Size                      : 532x800

Cut by about two-thirds in file size, for 500 pixels wide versus 532.
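Spelled out from the exiftool sizes above (76 vs 207 KiB):

```shell
# Fraction of the original file size saved by the p-500 resize.
awk 'BEGIN { printf "%.0f%% smaller\n", (1 - 76/207) * 100 }'
# -> 63% smaller
```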

Only one still has the tags I embossed: the largest chunk.

  31698 Dec 21 01:00 69473f12[*]0c_D48a-p-500.jpg
  68603 Dec 21 01:00 69473f12[*]0c_D48a-p-800.jpg
  81388 Dec 21 01:00 69473f12[*]0c_D48a.jpg

PDF tag? sure, why not.

$ strings *D48* | grep -i spath
  <pdf:Author>Jim Spath</pdf:Author>
$ strings *D48*500* | grep -i spath
$ strings *D48*800* | grep -i spath

On the other site, images over 2000 pixels wide get de-tagged.

Spotlighters-Purchase-Tickets-Icon_Ticket-2.jpg
Your image was automatically resized to 2000x1085 pixels. 

Spotlighters-Logo.jpg
Your image was automatically resized to 2000x882 pixels. 

Once that happens, the tell-tale is a different processor. I'm guessing GD beneath, maybe Python middleware?

$ jhead  Spotlighters_Purchase_Tickets_Icon.jpg
File name    : Spotlighters_Purchase_Tickets_Icon.jpg
File size    : 126559 bytes
File date    : 2026:01:31 22:13:11
Resolution   : 2000 x 1085
JPEG Quality : 75
Comment      : CREATOR: gd-jpeg v1.0 (using IJG JPEG v62), default quality

Fixer upper: 

$ time exiftool   -author="Art Ist"   -copyright="CC BY-NC-SA 4.0 Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International This work is licensed under CC BY-NC-SA 4.0. To view a copy of this license, visit https://creativecommons.org/licenses/by-nc-sa/4.0/" -subject="Art"    -title="For Spotlighters"  -datetime="2026-01-31 00:00:00"  Logo-2000x882.jpg
    1 image files updated

What else?
 Oh, yah. Pictures or it didn't happen.



Still looking for where I snipped off the orientation.


Sunday, January 11, 2026

Note to Self, Mainly, on Shrinking Map Colors

 I am helping a theatre redo their website, from a post-Y2K relic to one that promises responsiveness (their phrase). They make a big deal of putting warning flags on images over 1 MB, or over a certain pixel size (like 800x1340). On the "directions and parking" section, the old site had text directions, probably copied from MapQuest in 2002 or so. I kept the narrative and added static maps, since I didn't want to try a "live" Google map image, figuring most folks would put the address in their own GPS/map app.

  The OSMAND app works great, so I figured I'd do screenshots showing the suggested routes, whether they lined up with the old text or not. They looked good: full-screen PNG files with nice colors. Only they ended up way too big for a quick page load.

-rw-rw-rw-  1 user  users  1586661 Dec 14 22:19 map_from_westminster.png

When I saved them as JPEG, they didn't shrink much, just got a little fuzzier, and smaller-dimension images didn't look great either. So I turned to the old standby library of graphics fixer-uppers, NetPBM [ netpbm.sourceforge.net/doc/ ].

The classic set of 216 colors (AKA "Web Safe") made sense as a target palette, as long as the altered image looked decent; I recall doing color reduction in the past and making some pictures that didn't. The docs say to create a palette file, like so:

pamseq 3 5 >websafe.pam

Now I had a three-band palette file with maxval 5, i.e. six levels per band (6³ = 216 colors), just like Netscape liked. Applying that to a large image took almost no time:

pnmremap -map=websafe.pam map_from_westminster.pnm >websafe_west.ppm

Note this skips the PNG-to-PNM and PPM-to-PNG steps, which I added in a shell script that rotates through the 8 compass points of map/directions I wanted (N, NE, E, etc.):

#!/bin/sh

for img in \
  img1 \
  img2
do
  pngtopnm $img.png >$img.pnm
  pnmremap -map=websafe.pam $img.pnm | pnmtopng > ${img}_c216.png
done


Even on a slow drive, this is quick.

$ time pngtopnm map_from_westminster.png | pnmremap -map=websafe.pam - | pnmtopng >west_c26.png
pnmremap: 216 colors found in colormap

real    0m0.246s
user    0m0.260s
sys     0m0.017s

$ ls -l west_c26.png
-rw-r--r--  1 user  users  282243 Jan 12 00:01 west_c26.png


From 1.5 MB to 276 KB: good shrink!


$ perl $EXIFTOOL *west* | egrep "File Size|Image Size|Color Type"
File Size                       : 1549 KiB
Color Type                      : RGB
Image Size                      : 800x1340
File Size                       : 276 KiB
Color Type                      : Palette
Image Size                      : 800x1340

Afterward, I reapplied metadata such as Author and Copyright to the PNG file with the awesome exiftool.

Yet to do: add an overlay showing the OpenStreetMap sources used.




Tuesday, November 4, 2025

Scout Bot What What?

 Different question, same answer:

= =

Scoutly

Hello I am Scoutly, an AI-based chatbot that can help answer general questions about scouting and help you with common tasks. We value your privacy and will keep your information secure. How can I help you today?

User: what is a variable?

Scoutly:

A variable is a container in programming that can store different values or information. It's like a labeled box where you can put data that might change, such as numbers, text, or other types of information. For example, a variable could store your age, a game score, or a person's name.

User: Can a variable be a constant?

Scoutly:

A variable is a container in programming that can store different values or information. It's like a labeled box where you can put data that might change, such as numbers, text, or other types of information. For example, a variable could store your age, a game score, or a person's name. 

= =


Wednesday, October 22, 2025

Zabbix database lift + shift PostgreSQL migration

 I've been running Zabbix a few years now, and typically put new versions on different hardware as I explore how much power and network are needed even for a relatively small set of hosts. Underpowered components show up as dropped connections and lost data cycles.

I decided to move the underlying databases on two instances from PostgreSQL 14 and 15 to both run on PostgreSQL 16. As I have the databases on NetBSD, I could not easily use the pg_upgrade tool: that needs the source and target versions installed on one host (as far as I can see), and the standard NetBSD pkgsrc only permits one version at a time. I didn't have the time or skills to install outside the package source.

So I went with a database export, a reinstall of the database server software, then an import. I didn't rush things, and had less than 2 hours of downtime on a fairly old hardware box (2014 / AMD Family 16h (686-class), 1996.25 MHz).

Much of the time went to upgrading or installing about 700 packages that had gone stale in the several years since I'd touched that box beyond small package installs.

20 minutes for the first try:

---Oct 14 00:20:32: [1/780] upgrading freeglut-3.6.0...

---Oct 14 00:40:31: [780/780] upgrading tex-latex-20241101...

Then, after clearing the deadwood and trying again a bit later, the database and then the PostGIS extension (times in GMT):

---Oct 14 01:43:15: [1/14] installing tiff-4.7.0nb3...
---Oct 14 01:44:36: [14/14] installing postgresql16-postgis-3.6.0...

Times in EST5EDT:



DB dump


Under a minute, with SSD.

$ time pg_dump -U zabbixadmin -d zabbix -h localhost > zabbix_dump.dmp 2>zabbix.err
real    0m52.698s
user    0m12.638s
sys     0m3.256s

A medium-size home net, maybe? Around 1 GB.

-rw-r--r--  1 jim  wheel  1131407116 Oct 13 23:04 zabbix_dump.dmp
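That 1131407116 bytes is just over a binary gigabyte:

```shell
# Dump size in GiB (1024^3 bytes).
awk 'BEGIN { printf "%.2f GiB\n", 1131407116 / 1024^3 }'
# -> 1.05 GiB
```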

Move the older version aside so that, if things go sideways, a drop back to the prior version is the "save point." We hope not to test this.

# mv pgsql pgsql_15
$ ls -l /usr/pkg/pg*
ls: /usr/pkg/pgsql_15: Permission denied

780 packages
pkg_install warnings: 2, errors: 329
pkg_install error log can be found in /var/db/pkgin/pkg_install-err.log
reading local summary...
processing local summary...
(time passes)

NetBSD pkgsrc database boot

bash-5.2# /etc/rc.d/pgsql initdb
Initializing PostgreSQL databases.

(then create the role for zabbixadmin)

Zabbix specific database steps

unix:
createdb -O zabbix -E Unicode -T template0 zabbix

sql:
GRANT ALL ON SCHEMA public TO                   zabbixadmin;
GRANT ALL PRIVILEGES on database  zabbix    to  zabbixadmin;
GRANT CONNECT on database         zabbix    to  zabbixadmin;
ALTER ROLE                                      zabbixadmin WITH LOGIN;

Restore


$ time psql -d zabbix < zabbix_dump.dmp >~/zabbix.log 2>~/zabbix.err

real    2m59.684s
user    0m12.635s
sys     0m2.364s

As the saying goes, 2 minutes 59.

The image above illustrates the somewhat minimal downtime, where the 90 minute gap between the arrows is the entire loss of one data element.



Monday, October 6, 2025

First Contact Scoutly Automaton

 To teach a badge, I'd better know a bit more than the student, or know how to adjust by learning on the fly.

So I asked the Scouting bot cutely named Scoutly, how to connect a Scout to a counselor.

How does a unit leader approve a Scout to start a Merit badge?

The answer imagined only a blue-card universe with signed paper forms. Odd for a new piece of code to go that far back; maybe there's more data from before the Scoutbook digital workflow.

Scoutly:A unit leader approves a Scout to start a Merit Badge by: 1. Providing the Scout with a Merit Badge counselor's name who specializes in the badge's subject 2. Giving the Scout a signed Application for Merit Badge form 3. Encouraging the Scout to contact the counselor to discuss the badge requirements

I "refined" the query to add "in Scoutbook" as we used to add "in ___" jokingly. 

How does a unit leader approve a Scout to start a Merit badge in Scoutbook?

And waited. Minutes of three waving dots. The session reset, so I asked the same question again and got a quicker reply.

The answer was a reply-hazy-try-again style non-answer.

Scoutly:In Scoutbook, the process for a unit leader to approve a Scout starting a Merit Badge typically involves: 1. The Scout discusses their interest in a specific Merit Badge with the unit leader 2. The unit leader helps connect the Scout with an appropriate Merit Badge counselor 3. The Scout obtains a signed Application for Merit Badge 4. The counselor works with the Scout to explain requirements and begin the process While the exact Scoutbook digital approval process isn't detailed in the available information, the key is communication between the Scout, unit leader, and Merit Badge counselor.

Differently wrong. 

And yet another diff on the "tell me three times and it is true":

While the specific Scoutbook approval process isn't detailed, here's the typical Merit Badge approval process:

- The Scout discusses their interest in a specific Merit Badge with their unit leader

- The Scout reviews the requirements for the Merit Badge

- The unit leader provides the name of a qualified Merit Badge counselor

- The Scout obtains a signed Application for Merit Badge from the unit leader

For the exact Scoutbook digital approval process, I recommend contacting your local Scouting America council or checking the Scoutbook help resources directly.

When I went to a Google search with the same query, it knew more about Scoutbook than Scouting's own app did, for now at least.

https://help.scoutbook.scouting.org/knowledge-base/how-do-i-assign-merit-badge-counselors-to-a-scout/

Pictures or it didn't happen. Really. 

First contact--blue card

Revision 2, waiting

Revision 2, still waiting 

Waiting 

A different answer

The first two answers had numbered segments, while the third, answering the same question as the second, used bullet points. Sure, why not.

"Check the Scoutbook help resources directly."

Dude,  give me a link. How hard is it to show that?