For anyone who is interested in processing their data offline, this project may be a good start:
https://github.com/tcgoetz/GarminDB
I tried it out on Debian 12 as follows. This differs a bit from the instructions on the project page, but I had not used Python before, which may explain the difference:
mkdir garmindb
python3 -m venv garmindb
cd garmindb/
source bin/activate
pip install garmindb
mkdir ~/.GarminDb
cp -p ./lib/python3.13/site-packages/garmindb/GarminConnectConfig.json.example ~/.GarminDb/GarminConnectConfig.json # the python3.x part of the path depends on the Python version in your venv
vi ~/.GarminDb/GarminConnectConfig.json # enter your credentials, and for a first test set the download start dates to only a week or so back, as a full download can take very long
pip install fitfile idbutils tcxfile
# (Normally I don't have data in Connect, I uploaded a few activities and monitor files for this test)
# To initialize the directory structure, a one-time download from Garmin Connect was done:
garmindb_cli.py --all --download --import --analyze
I installed DB Browser for SQLite via the package manager; the data was there (not as complete as I'd hoped, unfortunately, but the basics were there).
From there on, I don't need the downloads from Connect anymore.
To import part of my archive of FIT files, I renamed them (with a small bash script) from, for example, 2025-08-21-11-37-58.fit to 20250821113758_ACTIVITY.fit, because the import doesn't like '-' in filenames. I'm not sure the _ACTIVITY suffix is necessary, but it worked.
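The rename script can be little more than a parameter-expansion loop. A sketch (assuming the YYYY-MM-DD-hh-mm-ss.fit naming shown above; it runs in a scratch directory so it's safe to try):

```shell
# Demo in a scratch directory; the loop body is the actual rename logic
demo=/tmp/fit_demo
rm -rf "$demo" && mkdir -p "$demo" && cd "$demo"
touch 2025-08-21-11-37-58.fit       # stand-in for a real FIT file

for f in *.fit; do
    new="${f//-/}"                   # drop the dashes the importer chokes on
    new="${new%.fit}_ACTIVITY.fit"   # append _ACTIVITY before the extension
    mv -n -- "$f" "$new"             # -n: never overwrite an existing file
done
ls                                   # -> 20250821113758_ACTIVITY.fit
```

For the real archive, run the loop in the directory holding the FIT files (or point the glob at it) instead of the scratch directory.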
Placed the files in ~/.HealthData/FitFiles/Activities and imported them:
garmindb_cli.py --import --activities --analyze
Then I ran some SQL SELECT statements on the data in DB Browser for SQLite and made a few graphs; it worked great.
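The queries themselves are ordinary SQL, and the sqlite3 command-line shell works on the same files as DB Browser. A self-contained toy example of the kind of SELECT I ran; the activities table and its columns here are made up, the real table and column names are in GarminDB's database files (I found them next to FitFiles; DB Browser's "Database Structure" tab shows the schema):

```shell
# Toy database standing in for one of GarminDB's SQLite files
rm -f /tmp/demo.db
sqlite3 /tmp/demo.db <<'SQL'
CREATE TABLE activities (start_time TEXT, distance REAL, calories INTEGER);
INSERT INTO activities VALUES ('2025-08-21 11:37:58', 10.2, 540);
SELECT start_time, distance, calories FROM activities WHERE distance > 5;
SQL
```

The same statement pasted into DB Browser's "Execute SQL" tab returns the same rows, which is all the graphing needs.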
Now I'm hoping I can get more of the data out of the FIT files, and that I can get it into Postgres and Grafana.