Compare commits


102 Commits

Author SHA1 Message Date
Gitea Actions Bot
033072d519 Auto-update README with current k8s applications
All checks were successful
Terraform / Terraform (pull_request) Successful in 18s
Generated by CI/CD workflow on 2026-02-04 12:16:26

This PR updates the README.md file with the current list of applications found in the k8s/ directory structure.
2026-02-04 12:16:26 +00:00
Ultradesu
4981fef85d Adjusted searxng deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 9s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 7s
2026-02-04 14:16:00 +02:00
Ultradesu
49515d6081 Adjusted searxng deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2026-02-04 14:14:38 +02:00
Ultradesu
d0895497fb Adjusted searxng deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 8s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2026-02-04 14:10:29 +02:00
Ultradesu
291fafad58 Adjusted searxng deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 4s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-04 14:08:53 +02:00
Ultradesu
48ee1fcd10 Adjusted searxng deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2026-02-04 14:07:11 +02:00
ab
2bcc0f9414 Update k8s/apps/n8n/values-searxng.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-04 02:29:37 +00:00
ab
dd191c1c6e Update k8s/apps/n8n/values-searxng.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2026-02-04 02:24:35 +00:00
ab
0a101d7b98 Update k8s/apps/n8n/values-n8n.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 10s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2026-02-04 00:48:54 +00:00
ab
98d6d53b09 Update k8s/apps/n8n/values-n8n.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2026-02-04 00:38:36 +00:00
ab
0f28d93647 Update k8s/apps/n8n/values-n8n.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 6s
Check with kubeconform / lint (push) Successful in 4s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-04 00:27:37 +00:00
ab
16b3a7fdcb Update k8s/apps/n8n/external-secrets.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-04 00:26:59 +00:00
AB from home.homenet
8952396c9b Fixed n8n permissions
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-04 02:21:39 +02:00
ab
88e5b5f1b7 Update k8s/apps/n8n/external-secrets.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 23s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-04 00:15:46 +00:00
AB from home.homenet
6f8ca40108 Fixed n8n permissions
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 4s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2026-02-04 02:02:28 +02:00
AB from home.homenet
da37ae71be Fixed n8n permissions
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 6s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-04 02:00:54 +02:00
AB from home.homenet
a4a1fecbd1 Added storage for n8n
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 6s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-04 01:54:44 +02:00
AB from home.homenet
3564f5d9c3 Added storage for n8n
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 6s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-04 01:52:16 +02:00
AB from home.homenet
9df0a3c8b7 Added storage for n8n
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-04 01:49:56 +02:00
ab
3157b0c325 Update k8s/apps/n8n/external-secrets.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2026-02-03 23:44:31 +00:00
AB from home.homenet
cbe1c23709 Added searxng
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 4s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2026-02-04 01:27:12 +02:00
AB from home.homenet
70198ca1c2 Added searxng
Some checks failed
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 6s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Failing after 9s
2026-02-04 01:25:31 +02:00
AB from home.homenet
39207fcb39 Added n8n
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 6s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-04 00:07:02 +02:00
AB from home.homenet
bae89f8738 Added n8n
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 6s
Check with kubeconform / lint (push) Successful in 28s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-02-03 23:24:24 +02:00
AB from home.homenet
72950dae83 Added n8n
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2026-02-03 23:22:15 +02:00
ab
9096b4bcf7 Update k8s/core/postgresql/external-secrets.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 14s
Check with kubeconform / lint (push) Successful in 14s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-02-03 21:10:57 +00:00
Ultradesu
8b6b0a0cd6 Upgrade BW CLI image
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-02-02 17:38:26 +02:00
ab
9024a41a4c Update k8s/core/external-secrets/bitwarden-store.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-02-02 15:37:32 +00:00
ab
2b5e76e24d Update k8s/apps/syncthing/kustomization.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-02-02 15:29:53 +00:00
Ultradesu
651acf665e Moved immich to pvc
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 6s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-28 12:04:41 +02:00
ab
7bd482a000 Update k8s/core/kube-system-custom/nfs-storage.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-28 09:39:18 +00:00
ab
6a0be650ea Update k8s/apps/pasarguard/daemonset.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-26 10:23:35 +00:00
AB
b78efdb81a paperless-ai deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-21 16:42:58 +02:00
AB
7bf27506b0 paperless-ai deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-21 16:37:33 +02:00
AB
5e3be29b7a paperless-ai deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-21 16:36:14 +02:00
AB
44ce19b815 paperless-ai deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-21 02:50:25 +02:00
AB
434b947896 paperless-ai deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-21 02:38:36 +02:00
AB
b13107882c paperless-ai deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 6s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-21 02:29:18 +02:00
AB
f76d44ce98 paperless-ai deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 6s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-21 02:27:00 +02:00
AB
43c0fdf2f2 paperless-ai deploy
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-21 02:21:22 +02:00
AB
512eaf842e Fix certmanager
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 3s
2026-01-20 14:44:31 +02:00
AB
91cd967989 Clean up pasarguard
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 3s
2026-01-20 14:27:50 +02:00
ab
d092401cd6 Update k8s/apps/pasarguard/daemonset.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 9s
Check with kubeconform / lint (push) Successful in 9s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-20 07:56:31 +00:00
ab
5b2768ad4f Update k8s/core/external-secrets/bitwarden-store.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 8s
Check with kubeconform / lint (push) Successful in 9s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-18 19:54:43 +00:00
ab
a7d71e40a3 Update k8s/apps/pasarguard/daemonset.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 8s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-14 18:48:17 +00:00
ab
9cbaa23aed Update k8s/apps/paperless/paperless-values.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 9s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-13 08:03:58 +00:00
Ultradesu
0b3fddbd40 Fix desubot storage
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-12 18:31:01 +02:00
Ultradesu
ee0c55dcea Fix desubot storage
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-12 18:12:10 +02:00
Ultradesu
f545a47162 Fix desubot storage
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 8s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-12 18:00:09 +02:00
Ultradesu
fdbeb1a9a7 Fix desubot storage 2026-01-12 18:00:01 +02:00
Ultradesu
0fbeb96a6a Fix desubot storage
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 8s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-12 17:57:42 +02:00
Ultradesu
0eba6143f4 Fix desubot storage
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-12 17:51:32 +02:00
Ultradesu
b71f54f714 Added nfs pvc to desubot
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 10s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-12 17:29:45 +02:00
Ultradesu
9a3bdfaf9c Added nfs pvc to desubot 2026-01-12 17:29:37 +02:00
Ultradesu
cfa275f4fc Added NFS SC
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 8s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-12 17:26:09 +02:00
Ultradesu
4a887b6775 Added NFS SC
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 8s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-12 17:25:30 +02:00
ab
981aa2ba15 Update k8s/core/system-upgrade/plan.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-09 01:37:52 +00:00
872c0f4adf Fixed alert template
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 10s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-09 01:26:33 +00:00
ab
5b1ff26975 Update k8s/apps/pasarguard/daemonset.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 13s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-09 01:02:14 +00:00
ab
93bf782ece Update k8s/apps/pasarguard/daemonset.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 13s
Check with kubeconform / lint (push) Successful in 9s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-09 00:57:42 +00:00
f153bfc0b4 Fixed alert template
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 17:34:28 +00:00
6b60fca39c Fixed alert template
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 3s
2026-01-08 17:32:25 +00:00
abb47a6db0 Fixed alert
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 10s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 17:25:50 +00:00
ab
e008ac3f59 Update k8s/core/argocd/values.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-08 17:18:49 +00:00
c945575ea1 Fixed alert
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 17:17:21 +00:00
01348dd99e Fixed alert
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 17:03:20 +00:00
0b4507a72d Adjust TG ID to var
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 16:58:54 +00:00
5cca64813a Adjust TG ID to var
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 3s
2026-01-08 16:53:18 +00:00
837094944e Adjust TG ID to var
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 12s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 16:49:03 +00:00
7da2fab580 Adjust TG ID to var
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 16:43:35 +00:00
007df29133 Adjust TG ID to var
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-08 16:40:31 +00:00
b25a82ba1e Fix TG ID
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 16:36:02 +00:00
5e7e9031a3 Fix TG ID
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 12s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 16:29:12 +00:00
70ae7c9a50 Disable autosync
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 6s
Auto-update README / Generate README and Create MR (push) Successful in 3s
2026-01-08 16:23:22 +00:00
d95faaf2c1 Configured alerts in grafana and TG endpoint
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 10s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 16:15:24 +00:00
af7e29c063 Configured alerts in grafana and TG endpoint 2026-01-08 16:15:14 +00:00
4ea48f0f94 Moved argocd to CH
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 16:02:22 +00:00
4bfc35d8e2 Added YAML anchors to argocd selectors
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 16:00:55 +00:00
46c0fab78a Configured alerts in grafana and TG endpoint
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 7s
Auto-update README / Generate README and Create MR (push) Successful in 3s
2026-01-08 15:55:37 +00:00
6dc43149f4 Configured Alertmanager TG
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 03:41:42 +00:00
ab
ca1efe6230 Update k8s/apps/pasarguard/daemonset-ingress.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 5s
Auto-update README / Generate README and Create MR (push) Successful in 3s
2026-01-08 03:11:43 +00:00
ab
e90d2c9dc5 Update k8s/core/external-secrets/bitwarden-store.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-08 03:10:07 +00:00
ab
a884c2b969 Update terraform/authentik/main.tf
All checks were successful
Terraform / Terraform (push) Successful in 25s
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
2026-01-07 15:38:25 +00:00
ab
db92976872 Update k8s/apps/pasarguard/daemonset.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 11s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-07 15:27:06 +00:00
Ultradesu
d924ebd3ee Added promtail
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 12s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-07 15:13:54 +00:00
Ultradesu
4b30185655 Added promtail
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 12s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 4s
2026-01-07 15:01:15 +00:00
Ultradesu
a65b37f000 Added loki
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 12s
Check with kubeconform / lint (push) Successful in 8s
Auto-update README / Generate README and Create MR (push) Successful in 5s
2026-01-07 14:55:42 +00:00
Ultradesu
f394b4f9da fix CI
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 12s
2026-01-07 14:10:58 +00:00
ab
5d12fc854a Merge pull request 'Auto-update README with k8s applications' (#63) from auto-update-readme-20260107-140709 into main
Some checks failed
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Failing after 8s
Reviewed-on: #63
2026-01-07 14:08:00 +00:00
Gitea Actions Bot
f415e0711e Auto-update README with current k8s applications
All checks were successful
Terraform / Terraform (pull_request) Successful in 30s
Generated by CI/CD workflow on 2026-01-07 14:07:09

This PR updates the README.md file with the current list of applications found in the k8s/ directory structure.
2026-01-07 14:07:09 +00:00
ab
14dc69904c Merge pull request 'Auto-update README with k8s applications' (#62) from auto-update-readme-20251229-021031 into main
Some checks failed
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Failing after 4s
Reviewed-on: #62
2026-01-07 14:06:51 +00:00
ab
f6dc7aa6e3 Update k8s/apps/gitea/deployment.yaml
Some checks failed
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Failing after 29s
Check with kubeconform / lint (push) Successful in 16s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2026-01-07 14:05:52 +00:00
ab
badd82f9af Update k8s/apps/gitea/deployment.yaml
Some checks failed
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Has been cancelled
Check with kubeconform / lint (push) Has been cancelled
Auto-update README / Generate README and Create MR (push) Has been cancelled
2026-01-07 14:04:04 +00:00
ab
a5cb49471a Update k8s/core/argocd/values.yaml
Some checks failed
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Has been cancelled
Check with kubeconform / lint (push) Has been cancelled
Auto-update README / Generate README and Create MR (push) Has been cancelled
2026-01-07 13:57:04 +00:00
ab
79c23e14b0 Update k8s/apps/k8s-secrets/deployment.yaml
Some checks failed
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Has been cancelled
Check with kubeconform / lint (push) Has been cancelled
Auto-update README / Generate README and Create MR (push) Has been cancelled
2026-01-07 13:43:37 +00:00
ab
5bc44e45b0 Update terraform/authentik/proxy-apps.tfvars
All checks were successful
Terraform / Terraform (push) Successful in 32s
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 7s
2025-12-29 02:15:42 +00:00
Gitea Actions Bot
4a80f2f596 Auto-update README with current k8s applications
All checks were successful
Terraform / Terraform (pull_request) Successful in 30s
Generated by CI/CD workflow on 2025-12-29 02:10:31

This PR updates the README.md file with the current list of applications found in the k8s/ directory structure.
2025-12-29 02:10:31 +00:00
ab
b58461232c Update k8s/apps/k8s-secrets/deployment.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 8s
Check with kubeconform / lint (push) Successful in 11s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2025-12-29 02:09:55 +00:00
ab
be6e601275 Update k8s/apps/k8s-secrets/deployment.yaml
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 9s
Check with kubeconform / lint (push) Successful in 13s
Auto-update README / Generate README and Create MR (push) Successful in 7s
2025-12-29 02:09:03 +00:00
ab
063a4a502b Update terraform/authentik/proxy-apps.tfvars
All checks were successful
Terraform / Terraform (push) Successful in 34s
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 16s
2025-12-29 02:01:52 +00:00
Ultradesu
22382b63a1 Added UK jellyfin
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 9s
Check with kubeconform / lint (push) Successful in 13s
Auto-update README / Generate README and Create MR (push) Successful in 7s
2025-12-28 20:41:27 +00:00
Ultradesu
718709115f Added UK jellyfin
All checks were successful
Update Kubernetes Services Wiki / Generate and Update K8s Wiki (push) Successful in 8s
Check with kubeconform / lint (push) Successful in 11s
Auto-update README / Generate README and Create MR (push) Successful in 6s
2025-12-28 20:35:42 +00:00
44 changed files with 855 additions and 610 deletions

View File

@@ -22,12 +22,13 @@ jobs:
       - name: Install Python dependencies
         run: |
-          pip install pyyaml
+          python3 -m venv .venv
+          .venv/bin/pip install pyyaml
       - name: Generate K8s Services Wiki
         run: |
           echo "📋 Starting K8s wiki generation..."
-          python3 .gitea/scripts/generate-k8s-wiki.py k8s/ Kubernetes-Services.md
+          .venv/bin/python .gitea/scripts/generate-k8s-wiki.py k8s/ Kubernetes-Services.md
           if [ -f "Kubernetes-Services.md" ]; then
             echo "✅ Wiki content generated successfully"
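
The workflow change above swaps a system-wide `pip install` for a per-job virtual environment. A minimal sketch of what that isolation buys, not part of the repo: the venv's interpreter resolves to its own prefix, so installed packages cannot collide with the runner's system Python (`with_pip=False` keeps the sketch offline-friendly; the real workflow installs `pyyaml` with the venv's own pip):

```python
# Illustrative only: show that a venv-created interpreter has its own prefix.
import pathlib
import subprocess
import sys
import tempfile
import venv

tmp = pathlib.Path(tempfile.mkdtemp())
# Create an isolated environment, like `python3 -m venv .venv` in the workflow.
venv.EnvBuilder(with_pip=False).create(tmp / ".venv")
bindir = "Scripts" if sys.platform == "win32" else "bin"
vpython = tmp / ".venv" / bindir / "python"
# Ask the venv interpreter where it thinks it lives.
prefix = subprocess.run(
    [str(vpython), "-c", "import sys; print(sys.prefix)"],
    capture_output=True, text=True,
).stdout.strip()
print(prefix.endswith(".venv"))  # → True: packages land inside .venv, not system-wide
```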

View File

@@ -44,6 +44,7 @@ ArgoCD homelab project
 | **jellyfin** | [![jellyfin](https://ag.hexor.cy/api/badge?name=jellyfin&revision=true)](https://ag.hexor.cy/applications/argocd/jellyfin) |
 | **k8s-secrets** | [![k8s-secrets](https://ag.hexor.cy/api/badge?name=k8s-secrets&revision=true)](https://ag.hexor.cy/applications/argocd/k8s-secrets) |
 | **khm** | [![khm](https://ag.hexor.cy/api/badge?name=khm&revision=true)](https://ag.hexor.cy/applications/argocd/khm) |
+| **n8n** | [![n8n](https://ag.hexor.cy/api/badge?name=n8n&revision=true)](https://ag.hexor.cy/applications/argocd/n8n) |
 | **ollama** | [![ollama](https://ag.hexor.cy/api/badge?name=ollama&revision=true)](https://ag.hexor.cy/applications/argocd/ollama) |
 | **paperless** | [![paperless](https://ag.hexor.cy/api/badge?name=paperless&revision=true)](https://ag.hexor.cy/applications/argocd/paperless) |
 | **pasarguard** | [![pasarguard](https://ag.hexor.cy/api/badge?name=pasarguard&revision=true)](https://ag.hexor.cy/applications/argocd/pasarguard) |

View File

@@ -77,8 +77,8 @@ spec:
labels:
app: gitea-runner
spec:
nodeSelector:
kubernetes.io/hostname: home.homenet
#nodeSelector:
# kubernetes.io/hostname: home.homenet
volumes:
- name: docker-sock
hostPath:
@@ -90,27 +90,30 @@ spec:
affinity:
nodeAffinity:
preferredDuringSchedulingIgnoredDuringExecution:
- weight: 3
preference:
matchExpressions:
- key: kubernetes.io/hostname
operator: In
values:
- home.homenet
- weight: 1
preference:
matchExpressions:
- key: kubernetes.io/hostname
operator: In
values:
- master.tail2fe2d.ts.net
- home.homenet
- weight: 2
preference:
matchExpressions:
- key: kubernetes.io/hostname
operator: In
values:
- nas.homenet
- master.tail2fe2d.ts.net
- weight: 3
preference:
matchExpressions:
- key: kubernetes.io/hostname
operator: In
values:
- it.tail2fe2d.ts.net
- ch.tail2fe2d.ts.net
- us.tail2fe2d.ts.net
requiredDuringSchedulingIgnoredDuringExecution:
nodeSelectorTerms:
- matchExpressions:
@@ -118,7 +121,9 @@ spec:
operator: In
values:
- home.homenet
- nas.homenet
- it.tail2fe2d.ts.net
- ch.tail2fe2d.ts.net
- us.tail2fe2d.ts.net
- master.tail2fe2d.ts.net
containers:
- name: gitea-runner
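The interleaved hunk above reweights the runner's node preferences. Since the diff view strips the +/- markers, here is a minimal, self-contained sketch (with hypothetical node names) of how weighted node affinity behaves: the scheduler adds the weight of every matching preferred term to a node's score, so a weight-3 term dominates a weight-1 term, while the required block is a hard allow-list.

```yaml
# Minimal sketch of weighted node affinity (hypothetical node names).
affinity:
  nodeAffinity:
    preferredDuringSchedulingIgnoredDuringExecution:
    - weight: 3                       # strongly preferred
      preference:
        matchExpressions:
        - key: kubernetes.io/hostname
          operator: In
          values: [preferred-node]
    - weight: 1                       # fallback
      preference:
        matchExpressions:
        - key: kubernetes.io/hostname
          operator: In
          values: [fallback-node]
    requiredDuringSchedulingIgnoredDuringExecution:
      nodeSelectorTerms:
      - matchExpressions:
        - key: kubernetes.io/hostname
          operator: In
          values: [preferred-node, fallback-node]   # hard allow-list
```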

View File

@@ -74,19 +74,14 @@ spec:
- nas.homenet
volumes:
- name: upload-storage
nfs:
server: nas.homenet
path: /mnt/storage/Storage/k8s/immich/library/
readOnly: false
persistentVolumeClaim:
claimName: immich-upload-pvc
- name: gphoto-storage
nfs:
server: nas.homenet
path: /mnt/storage/Storage/k8s/immich/GPHOTO/
readOnly: false
persistentVolumeClaim:
claimName: immich-gphoto-pvc
- name: camera
nfs:
server: nas.homenet
path: /mnt/storage/Storage/Syncthing-repos/PhoneCamera/
persistentVolumeClaim:
claimName: immich-camera-pvc
readOnly: true
- name: localtime
hostPath:

View File

@@ -1,79 +1,52 @@
---
apiVersion: v1
kind: PersistentVolume
metadata:
name: immich-upload-pv
spec:
capacity:
storage: 500Gi
accessModes:
- ReadWriteOnce
hostPath:
path: /mnt/storage/Storage/k8s/immich/library
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
name: immich-upload-pvc
namespace: immich
spec:
storageClassName: ""
accessModes:
- ReadWriteOnce
volumeName: immich-upload-pv
- ReadWriteMany
storageClassName: nfs-csi
resources:
requests:
storage: 500Gi
---
apiVersion: v1
kind: PersistentVolume
metadata:
name: immich-gphoto-pv
spec:
capacity:
storage: 500Gi
accessModes:
- ReadWriteOnce
hostPath:
path: /mnt/storage/Storage/k8s/immich/GPHOTO
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
name: immich-gphoto-pvc
namespace: immich
spec:
storageClassName: ""
accessModes:
- ReadWriteOnce
volumeName: immich-gphoto-pv
- ReadWriteMany
storageClassName: nfs-csi
resources:
requests:
storage: 500Gi
---
apiVersion: v1
kind: PersistentVolume
metadata:
name: immich-db-pv
spec:
capacity:
storage: 10Gi
accessModes:
- ReadWriteOnce
hostPath:
path: /mnt/storage/Storage/k8s/immich/db
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
name: immich-db-pvc
namespace: immich
spec:
storageClassName: ""
accessModes:
- ReadWriteOnce
volumeName: immich-db-pv
- ReadWriteMany
storageClassName: nfs-csi
resources:
requests:
storage: 10Gi
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
name: immich-camera-pvc
namespace: immich
spec:
accessModes:
- ReadOnlyMany
storageClassName: nfs-csi
resources:
requests:
storage: 100Gi
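The rewrite above replaces statically bound hostPath PV/PVC pairs with dynamically provisioned claims. A hedged sketch of the pattern it moves to, assuming the `nfs-csi` StorageClass defined elsewhere in this repo:

```yaml
# Dynamic-provisioning pattern: no PersistentVolume or volumeName
# pairing needed; the CSI driver creates the PV on demand.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: example-pvc          # hypothetical name
  namespace: immich
spec:
  accessModes:
  - ReadWriteMany            # NFS supports mounts from multiple nodes
  storageClassName: nfs-csi
  resources:
    requests:
      storage: 10Gi
```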

View File

@@ -19,7 +19,7 @@ spec:
kubernetes.io/os: linux
containers:
- name: secret-reader
image: ultradesu/k8s-secrets:0.1.1
image: ultradesu/k8s-secrets:0.2.1
imagePullPolicy: Always
args:
- "--secrets"
@@ -28,6 +28,7 @@ spec:
- "k8s-secret"
- "--port"
- "3000"
- "--webhook"
ports:
- containerPort: 3000
name: http

k8s/apps/n8n/app.yaml — new file, 21 lines
View File

@@ -0,0 +1,21 @@
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
name: n8n
namespace: argocd
spec:
project: apps
destination:
namespace: n8n
server: https://kubernetes.default.svc
source:
repoURL: ssh://git@gt.hexor.cy:30022/ab/homelab.git
targetRevision: HEAD
path: k8s/apps/n8n
syncPolicy:
automated:
selfHeal: true
prune: true
syncOptions:
- CreateNamespace=true

View File

@@ -0,0 +1,37 @@
---
apiVersion: external-secrets.io/v1
kind: ExternalSecret
metadata:
name: credentials
spec:
target:
name: credentials
deletionPolicy: Delete
template:
type: Opaque
data:
postgres-password: "{{ .psql | trim }}"
N8N_ENCRYPTION_KEY: "{{ .enc_pass | trim }}"
data:
- secretKey: psql
sourceRef:
storeRef:
name: vaultwarden-login
kind: ClusterSecretStore
remoteRef:
conversionStrategy: Default
decodingStrategy: None
metadataPolicy: None
key: 2a9deb39-ef22-433e-a1be-df1555625e22
property: fields[13].value
- secretKey: enc_pass
sourceRef:
storeRef:
name: vaultwarden-login
kind: ClusterSecretStore
remoteRef:
conversionStrategy: Default
decodingStrategy: None
metadataPolicy: None
key: 18c92d73-9637-4419-8642-7f7b308460cb
property: fields[0].value
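For reference, the template above renders a plain Opaque Secret. A hedged sketch of the expected output, with placeholder values standing in for the Vaultwarden fields:

```yaml
# Hypothetical rendered result of the ExternalSecret above; the real
# values come from the referenced Vaultwarden entries at sync time.
apiVersion: v1
kind: Secret
metadata:
  name: credentials
type: Opaque
stringData:
  postgres-password: "<fields[13] of entry 2a9deb39-ef22-433e-a1be-df1555625e22>"
  N8N_ENCRYPTION_KEY: "<fields[0] of entry 18c92d73-9637-4419-8642-7f7b308460cb>"
```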

View File

@@ -0,0 +1,22 @@
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
- external-secrets.yaml
- storage.yaml
helmCharts:
- name: n8n
repo: https://community-charts.github.io/helm-charts
version: 1.16.28
releaseName: n8n
namespace: n8n
valuesFile: values-n8n.yaml
includeCRDs: true
- name: searxng
repo: https://unknowniq.github.io/helm-charts/
version: 0.1.3
releaseName: searxng
namespace: n8n
valuesFile: values-searxng.yaml
includeCRDs: true
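Kustomize's `helmCharts` field is an alpha plugin feature that only renders when the build is invoked with `--enable-helm`. Since this repo is deployed by Argo CD, the flag has to be set globally; a hedged sketch of the setting (assuming the standard `argocd-cm` ConfigMap):

```yaml
# Sketch: tell Argo CD's repo server to render kustomize helmCharts.
apiVersion: v1
kind: ConfigMap
metadata:
  name: argocd-cm
  namespace: argocd
data:
  kustomize.buildOptions: --enable-helm
```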

k8s/apps/n8n/storage.yaml — new file, 12 lines
View File

@@ -0,0 +1,12 @@
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
name: n8n-home
spec:
accessModes:
- ReadWriteMany
storageClassName: nfs-csi
resources:
requests:
storage: 10Gi

View File

@@ -0,0 +1,53 @@
nodeSelector:
kubernetes.io/hostname: master.tail2fe2d.ts.net
db:
type: postgresdb
main:
resources:
requests:
cpu: 100m
memory: 128Mi
limits:
cpu: 512m
memory: 512Mi
persistence:
enabled: true
existingClaim: n8n-home
mountPath: /home/node/.n8n
worker:
mode: regular
webhook:
url: https://n8n.hexor.cy
redis:
enabled: true
existingEncryptionKeySecret: credentials
externalPostgresql:
existingSecret: credentials
host: "psql.psql.svc"
username: "n8n"
database: "n8n"
ingress:
enabled: true
className: traefik
annotations:
cert-manager.io/cluster-issuer: letsencrypt
traefik.ingress.kubernetes.io/router.middlewares: kube-system-https-redirect@kubernetescrd
hosts:
- host: n8n.hexor.cy
paths:
- path: /
pathType: Prefix
tls:
- secretName: n8n-tls
hosts:
- '*.hexor.cy'

View File

@@ -0,0 +1,24 @@
config:
general:
instance_name: "HexorSearXNG"
debug: true
server:
limiter: false
public_instance: false
method: "POST"
search:
safe_search: 0
extraConfig:
botdetection:
ip_lists:
pass_ip:
- '0.0.0.0/0'
- '::0/0'
ip_limit:
filter_link_local: false
link_token: false
valkey:
enabled: true
nodeSelector:
kubernetes.io/hostname: master.tail2fe2d.ts.net

View File

@@ -4,6 +4,7 @@ kind: Kustomization
resources:
- app.yaml
- external-secrets.yaml
- paperless-ai.yaml
helmCharts:
- name: paperless-ngx

View File

@@ -0,0 +1,101 @@
---
apiVersion: apps/v1
kind: Deployment
metadata:
name: paperless-ai
labels:
app: paperless-ai
spec:
replicas: 1
selector:
matchLabels:
app: paperless-ai
template:
metadata:
labels:
app: paperless-ai
spec:
nodeSelector:
kubernetes.io/hostname: nas.homenet
containers:
- name: paperless-ai
image: clusterzx/paperless-ai:latest
imagePullPolicy: Always
ports:
- containerPort: 3000
name: http
env:
- name: NODE_ENV
value: production
- name: PAPERLESS_AI_PORT
value: "3000"
resources:
requests:
memory: 512Mi
cpu: 500m
limits:
memory: 1024Mi
cpu: 2000m
#livenessProbe:
# httpGet:
# path: /
# port: 8000
# initialDelaySeconds: 30
# periodSeconds: 10
#readinessProbe:
# httpGet:
# path: /
# port: 8000
# initialDelaySeconds: 5
# periodSeconds: 5
volumeMounts:
- name: data
mountPath: /app/data
volumes:
- name: data
hostPath:
path: /mnt/storage/Storage/k8s/paperless/ai-data
type: DirectoryOrCreate
---
apiVersion: v1
kind: Service
metadata:
name: paperless-ai
namespace: paperless
labels:
app: paperless-ai
spec:
type: ClusterIP
ports:
- port: 3000
targetPort: 3000
protocol: TCP
name: http
selector:
app: paperless-ai
---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
name: paperless-ai-ingress
annotations:
cert-manager.io/cluster-issuer: letsencrypt
traefik.ingress.kubernetes.io/router.middlewares: kube-system-https-redirect@kubernetescrd
acme.cert-manager.io/http01-edit-in-place: "true"
spec:
ingressClassName: traefik
rules:
- host: ai-docs.hexor.cy
http:
paths:
- path: /
pathType: Prefix
backend:
service:
name: paperless-ai
port:
number: 3000
tls:
- secretName: docs-tls
hosts:
- '*.hexor.cy'
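Note that the commented-out probes in the Deployment above target port 8000, while the container listens on 3000 (`containerPort: 3000`, `PAPERLESS_AI_PORT=3000`). If they are ever re-enabled, a sketch of probes matching the actual port:

```yaml
# Hedged sketch: probes aligned with the container's actual port (3000).
livenessProbe:
  httpGet:
    path: /
    port: 3000
  initialDelaySeconds: 30
  periodSeconds: 10
readinessProbe:
  httpGet:
    path: /
    port: 3000
  initialDelaySeconds: 5
  periodSeconds: 5
```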

View File

@@ -1,5 +1,5 @@
image:
tag: 2.19.3
tag: 2.20.3
resources:
requests:
memory: "1Gi"

View File

@@ -1,212 +0,0 @@
---
apiVersion: v1
kind: ConfigMap
metadata:
name: pasarguard-scripts-ingress
labels:
app: pasarguard-node-ingress
data:
init-uuid-ingress.sh: |
#!/bin/bash
set -e
echo "Started"
# NODE_NAME is already set via environment variable
NAMESPACE=$(cat /var/run/secrets/kubernetes.io/serviceaccount/namespace)
# Get DNS name from node label xray-public-address
DNS_NAME=$(kubectl get node "${NODE_NAME}" -o jsonpath='{.metadata.labels.xray-public-address}')
if [ -z "${DNS_NAME}" ]; then
echo "ERROR: Node ${NODE_NAME} does not have label 'xray-public-address'"
exit 1
fi
echo "Node: ${NODE_NAME}"
echo "DNS Name from label: ${DNS_NAME}"
# Use DNS name for ConfigMap name to ensure uniqueness
CONFIGMAP_NAME="node-uuid-ingress-${DNS_NAME//./-}"
echo "Checking ConfigMap: ${CONFIGMAP_NAME}"
# Check if ConfigMap exists and get UUID
if kubectl get configmap "${CONFIGMAP_NAME}" -n "${NAMESPACE}" &>/dev/null; then
echo "ConfigMap exists, reading UUID..."
API_KEY=$(kubectl get configmap "${CONFIGMAP_NAME}" -n "${NAMESPACE}" -o jsonpath='{.data.API_KEY}')
if [ -z "${API_KEY}" ]; then
echo "UUID not found in ConfigMap, generating new one..."
API_KEY=$(cat /proc/sys/kernel/random/uuid)
kubectl patch configmap "${CONFIGMAP_NAME}" -n "${NAMESPACE}" --type merge -p "{\"data\":{\"API_KEY\":\"${API_KEY}\"}}"
else
echo "Using existing UUID from ConfigMap"
fi
else
echo "ConfigMap does not exist, creating new one..."
API_KEY=$(cat /proc/sys/kernel/random/uuid)
kubectl create configmap "${CONFIGMAP_NAME}" -n "${NAMESPACE}" \
--from-literal=API_KEY="${API_KEY}" \
--from-literal=NODE_NAME="${NODE_NAME}"
fi
# Save UUID and node info to shared volume for the main container
echo -n "${API_KEY}" > /shared/api-key
echo -n "${NODE_NAME}" > /shared/node-name
echo -n "${CONFIGMAP_NAME}" > /shared/configmap-name
echo "UUID initialized: ${API_KEY}"
echo "Node name: ${NODE_NAME}"
echo "ConfigMap: ${CONFIGMAP_NAME}"
# Create Certificate for this node using DNS name from label
CERT_NAME="pasarguard-node-ingress-${DNS_NAME//./-}"
echo "Creating Certificate: ${CERT_NAME} for ${DNS_NAME}"
# Check if Certificate already exists
if ! kubectl get certificate "${CERT_NAME}" -n "${NAMESPACE}" &>/dev/null; then
echo "Certificate does not exist, creating..."
cat <<EOF | kubectl apply -f -
apiVersion: cert-manager.io/v1
kind: Certificate
metadata:
name: ${CERT_NAME}
namespace: ${NAMESPACE}
spec:
secretName: ${CERT_NAME}-tls
issuerRef:
name: letsencrypt
kind: ClusterIssuer
dnsNames:
- ${DNS_NAME}
EOF
else
echo "Certificate already exists"
fi
# Wait for certificate to be ready
echo "Waiting for certificate to be ready..."
for i in {1..600}; do
if kubectl get secret "${CERT_NAME}-tls" -n "${NAMESPACE}" &>/dev/null; then
echo "Certificate secret is ready!"
break
fi
echo "Waiting for certificate... ($i/600)"
sleep 1
done
if ! kubectl get secret "${CERT_NAME}-tls" -n "${NAMESPACE}" &>/dev/null; then
echo "WARNING: Certificate secret not ready after 600 seconds"
else
# Extract certificate and key from secret to shared volume
echo "Extracting certificate and key..."
kubectl get secret "${CERT_NAME}-tls" -n "${NAMESPACE}" -o jsonpath='{.data.tls\.crt}' | base64 -d > /shared/tls.crt
kubectl get secret "${CERT_NAME}-tls" -n "${NAMESPACE}" -o jsonpath='{.data.tls\.key}' | base64 -d > /shared/tls.key
echo "Certificate and key extracted successfully."
cat /shared/tls.crt
fi
# Create ClusterIP Service for this node (pod selector based)
NODE_SHORT_NAME="${NODE_NAME%%.*}"
SERVICE_NAME="${NODE_SHORT_NAME}-ingress"
echo "Creating Service: ${SERVICE_NAME} for node ${NODE_NAME} (short: ${NODE_SHORT_NAME})"
# Create Service with pod selector including node name
cat <<EOF | kubectl apply -f -
apiVersion: v1
kind: Service
metadata:
name: ${SERVICE_NAME}
namespace: ${NAMESPACE}
labels:
app: pasarguard-node-ingress
node: ${NODE_NAME}
spec:
type: ClusterIP
selector:
app: pasarguard-node-ingress
node-name: ${NODE_SHORT_NAME}
ports:
- name: proxy
port: 443
protocol: TCP
targetPort: 443
- name: api
port: 62050
protocol: TCP
targetPort: 62050
EOF
echo "Service created: ${SERVICE_NAME}.${NAMESPACE}.svc.cluster.local"
# Create IngressRouteTCP for this DNS name with TLS passthrough
INGRESS_NAME="pasarguard-tcp-${DNS_NAME//./-}"
echo "Creating IngressRouteTCP: ${INGRESS_NAME} for ${DNS_NAME}"
cat <<EOF | kubectl apply -f -
apiVersion: traefik.io/v1alpha1
kind: IngressRouteTCP
metadata:
name: ${INGRESS_NAME}
namespace: ${NAMESPACE}
labels:
app: pasarguard-node-ingress
node: ${NODE_NAME}
spec:
entryPoints:
- websecure
routes:
- match: HostSNI(\`${DNS_NAME}\`)
services:
- name: ${SERVICE_NAME}
port: 443
tls:
passthrough: true
EOF
echo "IngressRouteTCP created: ${INGRESS_NAME}"
echo "Traffic to ${DNS_NAME}:443 will be routed to ${SERVICE_NAME}:443"
# Create second IngressRouteTCP for API port 62051
INGRESS_API_NAME="pasarguard-api-${DNS_NAME//./-}"
echo "Creating IngressRouteTCP for API: ${INGRESS_API_NAME} for ${DNS_NAME}:62051"
cat <<EOF | kubectl apply -f -
apiVersion: traefik.io/v1alpha1
kind: IngressRouteTCP
metadata:
name: ${INGRESS_API_NAME}
namespace: ${NAMESPACE}
labels:
app: pasarguard-node-ingress
node: ${NODE_NAME}
spec:
entryPoints:
- pasarguard-api
routes:
- match: HostSNI(\`${DNS_NAME}\`)
services:
- name: ${SERVICE_NAME}
port: 62050
tls:
passthrough: true
EOF
echo "IngressRouteTCP API created: ${INGRESS_API_NAME}"
echo "Traffic to ${DNS_NAME}:62051 will be routed to ${SERVICE_NAME}:62050"
pasarguard-start.sh: |
#!/bin/sh
# Read API_KEY from shared volume created by init container
if [ -f /shared/api-key ]; then
export API_KEY=$(cat /shared/api-key)
echo "Loaded API_KEY from shared volume"
else
echo "WARNING: API_KEY file not found, using default"
fi
cd /app
exec ./main

View File

@@ -1,211 +0,0 @@
---
apiVersion: v1
kind: ServiceAccount
metadata:
name: pasarguard-node-ingress
labels:
app: pasarguard-node-ingress
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
name: pasarguard-node-ingress-configmap
labels:
app: pasarguard-node-ingress
rules:
- apiGroups: [""]
resources: ["configmaps"]
verbs: ["get", "list", "create", "update", "patch"]
- apiGroups: ["cert-manager.io"]
resources: ["certificates"]
verbs: ["get", "list", "create", "update", "patch", "delete"]
- apiGroups: [""]
resources: ["secrets"]
verbs: ["get", "list"]
- apiGroups: [""]
resources: ["services", "endpoints"]
verbs: ["get", "list", "create", "update", "patch", "delete"]
- apiGroups: ["traefik.io", "traefik.containo.us"]
resources: ["ingressroutetcps"]
verbs: ["get", "list", "create", "update", "patch", "delete"]
- apiGroups: [""]
resources: ["pods"]
verbs: ["get", "list", "patch", "update"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
name: pasarguard-node-ingress-configmap
labels:
app: pasarguard-node-ingress
roleRef:
apiGroup: rbac.authorization.k8s.io
kind: Role
name: pasarguard-node-ingress-configmap
subjects:
- kind: ServiceAccount
name: pasarguard-node-ingress
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
name: pasarguard-node-ingress-reader
labels:
app: pasarguard-node-ingress
rules:
- apiGroups: [""]
resources: ["nodes"]
verbs: ["get", "list"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
name: pasarguard-node-ingress-reader
labels:
app: pasarguard-node-ingress
roleRef:
apiGroup: rbac.authorization.k8s.io
kind: ClusterRole
name: pasarguard-node-ingress-reader
subjects:
- kind: ServiceAccount
name: pasarguard-node-ingress
namespace: pasarguard
---
apiVersion: apps/v1
kind: DaemonSet
metadata:
name: pasarguard-node-ingress
labels:
app: pasarguard-node-ingress
spec:
selector:
matchLabels:
app: pasarguard-node-ingress
revisionHistoryLimit: 3
updateStrategy:
type: RollingUpdate
template:
metadata:
labels:
app: pasarguard-node-ingress
spec:
serviceAccountName: pasarguard-node-ingress
affinity:
nodeAffinity:
requiredDuringSchedulingIgnoredDuringExecution:
nodeSelectorTerms:
- matchExpressions:
- key: xray-public-address
operator: Exists
initContainers:
- name: label-pod
image: bitnami/kubectl:latest
env:
- name: POD_NAME
valueFrom:
fieldRef:
fieldPath: metadata.name
- name: POD_NAMESPACE
valueFrom:
fieldRef:
fieldPath: metadata.namespace
- name: NODE_NAME
valueFrom:
fieldRef:
fieldPath: spec.nodeName
command:
- /bin/bash
- -c
- |
# Add node label to pod
NODE_SHORT=$(echo ${NODE_NAME} | cut -d. -f1)
kubectl label pod ${POD_NAME} -n ${POD_NAMESPACE} node-name=${NODE_SHORT} --overwrite
- name: init-uuid
image: bitnami/kubectl:latest
env:
- name: GODEBUG
value: "x509sha1=1"
- name: NODE_NAME
valueFrom:
fieldRef:
fieldPath: spec.nodeName
command:
- /bin/bash
- /scripts/init-uuid-ingress.sh
volumeMounts:
- name: shared-data
mountPath: /shared
- name: scripts
mountPath: /scripts
containers:
- name: pasarguard-node
image: 'pasarguard/node:v0.1.3'
imagePullPolicy: Always
command:
- /bin/sh
- /scripts/pasarguard-start.sh
ports:
- name: api
containerPort: 62050
protocol: TCP
- name: proxy
containerPort: 443
protocol: TCP
env:
- name: NODE_NAME
valueFrom:
fieldRef:
fieldPath: spec.nodeName
- name: NODE_HOST
value: "0.0.0.0"
- name: SERVICE_PORT
value: "62050"
- name: SERVICE_PROTOCOL
value: "grpc"
- name: DEBUG
value: "true"
- name: SSL_CERT_FILE
value: "/shared/tls.crt"
- name: SSL_KEY_FILE
value: "/shared/tls.key"
- name: XRAY_EXECUTABLE_PATH
value: "/usr/local/bin/xray"
- name: XRAY_ASSETS_PATH
value: "/usr/local/share/xray"
- name: API_KEY
value: "change-this-to-a-secure-uuid"
livenessProbe:
tcpSocket:
port: 62050
initialDelaySeconds: 30
periodSeconds: 10
timeoutSeconds: 5
failureThreshold: 3
readinessProbe:
tcpSocket:
port: 62050
initialDelaySeconds: 10
periodSeconds: 5
timeoutSeconds: 3
failureThreshold: 3
resources:
requests:
memory: "128Mi"
cpu: "100m"
limits:
memory: "512Mi"
cpu: "750m"
volumeMounts:
- name: shared-data
mountPath: /shared
readOnly: false
- name: scripts
mountPath: /scripts
volumes:
- name: shared-data
emptyDir: {}
- name: scripts
configMap:
name: pasarguard-scripts-ingress
defaultMode: 0755

View File

@@ -113,7 +113,7 @@ spec:
mountPath: /scripts
containers:
- name: pasarguard-node
image: 'pasarguard/node:v0.1.3'
image: 'pasarguard/node:v0.2.1'
imagePullPolicy: Always
command:
- /bin/sh
@@ -162,10 +162,10 @@ spec:
resources:
requests:
memory: "128Mi"
cpu: "100m"
#cpu: "500m"
limits:
memory: "512Mi"
cpu: "750m"
#cpu: "1200m"
volumeMounts:
- name: shared-data
mountPath: /shared
@@ -205,7 +205,7 @@ spec:
cpu: "50m"
limits:
memory: "128Mi"
cpu: "150m"
cpu: "500m"
volumeMounts:
- name: shared-data
mountPath: /shared

View File

@@ -9,6 +9,3 @@ resources:
- ./certificate.yaml
- ./configmap-scripts.yaml
- ./servicemonitor.yaml
- ./configmap-scripts-ingress.yaml
# - ./daemonset-ingress.yaml
# - ./traefik-pasarguard-entrypoint.yaml

View File

@@ -1,66 +0,0 @@
---
apiVersion: apps/v1
kind: Deployment
metadata:
name: traefik
namespace: kube-system
spec:
template:
spec:
containers:
- name: traefik
args:
- --entryPoints.metrics.address=:9100/tcp
- --entryPoints.traefik.address=:8080/tcp
- --entryPoints.web.address=:8000/tcp
- --entryPoints.websecure.address=:8443/tcp
- --entryPoints.pasarguard-api.address=:62051/tcp
- --api.dashboard=true
- --ping=true
- --metrics.prometheus=true
- --metrics.prometheus.entrypoint=metrics
- --providers.kubernetescrd
- --providers.kubernetescrd.allowEmptyServices=true
- --providers.kubernetesingress
- --providers.kubernetesingress.allowEmptyServices=true
- --providers.kubernetesingress.ingressendpoint.publishedservice=kube-system/traefik
- --entryPoints.websecure.http.tls=true
- --log.level=INFO
- --entryPoints.web.transport.respondingTimeouts.readTimeout=0s
- --entryPoints.websecure.transport.respondingTimeouts.readTimeout=0s
ports:
- containerPort: 9100
name: metrics
protocol: TCP
- containerPort: 8080
name: traefik
protocol: TCP
- containerPort: 8000
name: web
protocol: TCP
- containerPort: 8443
name: websecure
protocol: TCP
- containerPort: 62051
name: pasarguard-api
protocol: TCP
---
apiVersion: v1
kind: Service
metadata:
name: traefik
namespace: kube-system
spec:
ports:
- name: web
port: 80
protocol: TCP
targetPort: web
- name: websecure
port: 443
protocol: TCP
targetPort: websecure
- name: pasarguard-api
port: 62051
protocol: TCP
targetPort: pasarguard-api

View File

@@ -16,18 +16,18 @@ helmCharts:
valuesFile: syncthing-master.yaml
includeCRDs: true
- name: syncthing
repo: https://k8s-home-lab.github.io/helm-charts
version: 4.0.0
releaseName: syncthing-khv
namespace: syncthing
valuesFile: syncthing-khv.yaml
includeCRDs: true
- name: syncthing
repo: https://k8s-home-lab.github.io/helm-charts
version: 4.0.0
releaseName: syncthing-nas
namespace: syncthing
valuesFile: syncthing-nas.yaml
includeCRDs: true
includeCRDs: true
# - name: syncthing
# repo: https://k8s-home-lab.github.io/helm-charts
# version: 4.0.0
# releaseName: syncthing-khv
# namespace: syncthing
# valuesFile: syncthing-khv.yaml
# includeCRDs: true

View File

@@ -1,4 +1,3 @@
---
apiVersion: apps/v1
kind: Deployment
metadata:
@@ -23,7 +22,7 @@ spec:
kubernetes.io/hostname: home.homenet
containers:
- name: desubot
image: 'ultradesu/desubot:latest'
image: "ultradesu/desubot:latest"
imagePullPolicy: Always
envFrom:
- secretRef:
@@ -32,11 +31,11 @@ spec:
- name: RUST_LOG
value: "info"
volumeMounts:
- mountPath: /storage
name: storage
- mountPath: /storage
name: storage
volumes:
- name: storage
nfs:
server: nas.homenet
path: /mnt/storage/Storage/k8s/desubot/
readOnly: false
persistentVolumeClaim:
claimName: desubot-storage
readOnly: false

View File

@@ -8,3 +8,5 @@ resources:
- external-secrets.yaml
- desubot.yaml
- restart-job.yaml
- storage.yaml

View File

@@ -0,0 +1,12 @@
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
name: desubot-storage
spec:
accessModes:
- ReadWriteMany
storageClassName: nfs-csi
resources:
requests:
storage: 200Gi

View File

@@ -2,7 +2,7 @@
global:
domain: ag.hexor.cy
nodeSelector:
nodeSelector: &nodeSelector
kubernetes.io/hostname: master.tail2fe2d.ts.net
logging:
format: text
@@ -55,15 +55,15 @@ configs:
controller:
replicas: 1
nodeSelector:
kubernetes.io/hostname: master.tail2fe2d.ts.net
nodeSelector:
<<: *nodeSelector
# Add resources (requests/limits), PDB etc. if needed
# Dex OIDC provider
dex:
replicas: 1
nodeSelector:
kubernetes.io/hostname: master.tail2fe2d.ts.net
<<: *nodeSelector
enabled: false
# Standard Redis disabled because Redis HA is enabled
@@ -86,7 +86,7 @@ redis-ha:
server:
replicas: 1
nodeSelector:
kubernetes.io/hostname: master.tail2fe2d.ts.net
<<: *nodeSelector
ingress:
enabled: false
@@ -99,8 +99,11 @@ server:
# Repository Server
repoServer:
replicas: 1
livenessProbe:
timeoutSeconds: 10
periodSeconds: 60
nodeSelector:
kubernetes.io/hostname: master.tail2fe2d.ts.net
<<: *nodeSelector
# Add resources (requests/limits), PDB etc. if needed
# ApplicationSet Controller
@@ -108,7 +111,7 @@ applicationSet:
enabled: true # Enabled by default
replicas: 1
nodeSelector:
kubernetes.io/hostname: master.tail2fe2d.ts.net
<<: *nodeSelector
# Add resources (requests/limits), PDB etc. if needed
# Notifications Controller
@@ -116,5 +119,5 @@ notifications:
enabled: true # Enabled by default
replicas: 1
nodeSelector:
kubernetes.io/hostname: master.tail2fe2d.ts.net
<<: *nodeSelector
# Add notifiers, triggers, templates configurations if needed
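The hunk above introduces a YAML anchor so the node pin is defined once and merged everywhere. A minimal self-contained illustration of the anchor/merge-key mechanics (merge keys are a YAML 1.1 feature; support varies by parser, but the Go YAML libraries Helm relies on accept them):

```yaml
# &anchor defines the map once; <<: *alias merges it in place.
# Both components below resolve to the same nodeSelector map.
nodeSelector: &nodeSelector
  kubernetes.io/hostname: master.tail2fe2d.ts.net
server:
  nodeSelector:
    <<: *nodeSelector
repoServer:
  nodeSelector:
    <<: *nodeSelector
```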

View File

@@ -35,5 +35,6 @@ spec:
key: secretKey
selector:
dnsZones:
- "ps.hexor.cy"
- "of.hexor.cy"

View File

@@ -1,6 +1,6 @@
FROM debian:sid
ENV BW_CLI_VERSION=2025.5.0
ENV BW_CLI_VERSION=2025.12.1
RUN apt update && \
apt install -y wget unzip && \

View File

@@ -37,15 +37,15 @@ spec:
kubernetes.io/hostname: master.tail2fe2d.ts.net
containers:
- name: bitwarden-cli
image: ultradesu/bitwarden-client:2025.5.0
image: ultradesu/bitwarden-client:2025.12.1
imagePullPolicy: Always
resources:
requests:
memory: "128Mi"
cpu: "100m"
cpu: "300m"
limits:
memory: "512Mi"
cpu: "500m"
cpu: "1000m"
env:
- name: BW_HOST
valueFrom:

View File

@@ -3,5 +3,15 @@ kind: Kustomization
resources:
- app.yaml
- nfs-storage.yaml
- coredns-internal-resolve.yaml
helmCharts:
- name: csi-driver-nfs
repo: https://raw.githubusercontent.com/kubernetes-csi/csi-driver-nfs/master/charts
version: 4.12.0
releaseName: csi-driver-nfs
namespace: kube-system
#valuesFile: values.yaml
includeCRDs: true

View File

@@ -0,0 +1,14 @@
---
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
name: nfs-csi
provisioner: nfs.csi.k8s.io
parameters:
server: nas.homenet
share: /mnt/storage/Storage/PVC
reclaimPolicy: Retain
volumeBindingMode: Immediate
mountOptions:
- vers=4
- hard

View File

@@ -125,6 +125,8 @@ spec:
{{ .umami }}
USER_mmdl: |-
{{ .mmdl }}
USER_n8n: |-
{{ .n8n }}
data:
- secretKey: authentik
sourceRef:
@@ -258,3 +260,15 @@ spec:
metadataPolicy: None
key: 2a9deb39-ef22-433e-a1be-df1555625e22
property: fields[12].value
- secretKey: n8n
sourceRef:
storeRef:
name: vaultwarden-login
kind: ClusterSecretStore
remoteRef:
conversionStrategy: Default
decodingStrategy: None
metadataPolicy: None
key: 2a9deb39-ef22-433e-a1be-df1555625e22
property: fields[13].value

View File

@@ -13,9 +13,6 @@ spec:
targetRevision: HEAD
path: k8s/core/prom-stack
syncPolicy:
automated:
selfHeal: true
prune: true
syncOptions:
- CreateNamespace=true
- ServerSideApply=true

View File

@@ -79,3 +79,83 @@ spec:
key: 2a9deb39-ef22-433e-a1be-df1555625e22
property: fields[2].value
---
apiVersion: external-secrets.io/v1
kind: ExternalSecret
metadata:
name: alertmanager-telegram
spec:
target:
name: alertmanager-telegram-secret
deletionPolicy: Delete
template:
type: Opaque
data:
TELEGRAM_BOT_TOKEN: |-
{{ .bot_token }}
TELEGRAM_CHAT_ID: |-
{{ .chat_id }}
data:
- secretKey: bot_token
sourceRef:
storeRef:
name: vaultwarden-login
kind: ClusterSecretStore
remoteRef:
conversionStrategy: Default
decodingStrategy: None
metadataPolicy: None
key: eca0fb0b-3939-40a8-890a-6294863e5a65
property: fields[0].value
- secretKey: chat_id
sourceRef:
storeRef:
name: vaultwarden-login
kind: ClusterSecretStore
remoteRef:
conversionStrategy: Default
decodingStrategy: None
metadataPolicy: None
key: eca0fb0b-3939-40a8-890a-6294863e5a65
property: fields[1].value
---
apiVersion: external-secrets.io/v1
kind: ExternalSecret
metadata:
name: grafana-telegram
spec:
target:
name: grafana-telegram
deletionPolicy: Delete
template:
type: Opaque
data:
bot-token: |-
{{ .bot_token }}
chat-id: |-
{{ .chat_id }}
data:
- secretKey: bot_token
sourceRef:
storeRef:
name: vaultwarden-login
kind: ClusterSecretStore
remoteRef:
conversionStrategy: Default
decodingStrategy: None
metadataPolicy: None
key: eca0fb0b-3939-40a8-890a-6294863e5a65
property: fields[0].value
- secretKey: chat_id
sourceRef:
storeRef:
name: vaultwarden-login
kind: ClusterSecretStore
remoteRef:
conversionStrategy: Default
decodingStrategy: None
metadataPolicy: None
key: eca0fb0b-3939-40a8-890a-6294863e5a65
property: fields[1].value

View File

@@ -0,0 +1,152 @@
apiVersion: v1
kind: ConfigMap
metadata:
name: grafana-alerting
namespace: prometheus
data:
rules.yaml: |
apiVersion: 1
groups:
- orgId: 1
name: pasarguard_alerts
folder: Kubernetes
interval: 1m
rules:
- uid: pasarguard_cpu_throttling
title: VPN CPU Throttle
condition: B
data:
- refId: A
relativeTimeRange:
from: 600
to: 0
datasourceUid: P76F38748CEC837F0
model:
expr: 'rate(container_cpu_cfs_throttled_periods_total{container="pasarguard-node"}[5m])'
refId: A
intervalMs: 1000
maxDataPoints: 43200
- refId: B
relativeTimeRange:
from: 600
to: 0
datasourceUid: __expr__
model:
conditions:
- evaluator:
params:
- 0.1
type: gt
operator:
type: and
query:
params: []
datasource:
type: __expr__
uid: __expr__
expression: A
reducer: last
refId: B
type: reduce
noDataState: NoData
execErrState: Alerting
for: 5m
annotations:
pod: '{{ $labels.pod }}'
node: '{{ $labels.node }}'
namespace: '{{ $labels.namespace }}'
throttle_rate: '{{ printf "%.2f" $values.A }}'
summary: 'VPN node throttling CPU'
labels:
severity: warning
- orgId: 1
name: kubernetes_alerts
folder: Kubernetes
interval: 30s
rules:
- uid: node_not_ready
title: Kubernetes Node Not Ready
condition: B
data:
- refId: A
relativeTimeRange:
from: 300
to: 0
datasourceUid: P76F38748CEC837F0
model:
expr: 'kube_node_status_condition{condition="Ready",status="true"} == 0'
refId: A
intervalMs: 1000
maxDataPoints: 43200
- refId: B
relativeTimeRange:
from: 300
to: 0
datasourceUid: __expr__
model:
conditions:
- evaluator:
params:
- 0
type: gt
operator:
type: and
query:
params: []
datasource:
type: __expr__
uid: __expr__
expression: A
reducer: last
refId: B
type: reduce
noDataState: Alerting
execErrState: Alerting
for: 0s
annotations:
node: '{{ $labels.node }}'
condition: '{{ $labels.condition }}'
summary: 'Kubernetes node is not ready'
labels:
severity: critical
contactpoints.yaml: |
apiVersion: 1
contactPoints:
- orgId: 1
name: telegram
receivers:
- uid: telegram_default
type: telegram
disableResolveMessage: false
settings:
bottoken: $TELEGRAM_BOT_TOKEN
chatid: "124317807"
message: |
{{ if eq .Status "firing" }}🔥 FIRING{{ else }}✅ RESOLVED{{ end }}
{{ range .Alerts }}
📊 <b>{{ .Labels.alertname }}</b>
{{ .Annotations.summary }}
{{ if .Annotations.node }}🖥 <b>Node:</b> <code>{{ .Annotations.node }}</code>{{ end }}
{{ if .Annotations.pod }}📦 <b>Pod:</b> <code>{{ .Annotations.pod }}</code>{{ end }}
{{ if .Annotations.namespace }}📁 <b>Namespace:</b> <code>{{ .Annotations.namespace }}</code>{{ end }}
{{ if .Annotations.throttle_rate }}⚠️ <b>Throttling rate:</b> {{ .Annotations.throttle_rate }}{{ end }}
🔗 <a href="{{ .GeneratorURL }}">View in Grafana</a>
{{ end }}
parse_mode: HTML
policies.yaml: |
apiVersion: 1
policies:
- orgId: 1
receiver: telegram
group_by:
- grafana_folder
- alertname
group_wait: 10s
group_interval: 5m
repeat_interval: 4h

View File

@@ -38,6 +38,10 @@ datasources:
url: http://prometheus-kube-prometheus-prometheus.prometheus.svc:9090
access: proxy
isDefault: true
- name: Loki
type: loki
url: http://loki-gateway.prometheus.svc:80
access: proxy
ingress:
enabled: true
@@ -52,3 +56,30 @@ ingress:
hosts:
- '*.hexor.cy'
extraConfigmapMounts:
- name: grafana-alerting-rules
mountPath: /etc/grafana/provisioning/alerting/rules.yaml
configMap: grafana-alerting
subPath: rules.yaml
readOnly: true
- name: grafana-alerting-contactpoints
mountPath: /etc/grafana/provisioning/alerting/contactpoints.yaml
configMap: grafana-alerting
subPath: contactpoints.yaml
readOnly: true
- name: grafana-alerting-policies
mountPath: /etc/grafana/provisioning/alerting/policies.yaml
configMap: grafana-alerting
subPath: policies.yaml
readOnly: true
envValueFrom:
TELEGRAM_BOT_TOKEN:
secretKeyRef:
name: grafana-telegram
key: bot-token
TELEGRAM_CHAT_ID:
secretKeyRef:
name: grafana-telegram
key: chat-id

View File

@@ -2,9 +2,9 @@ apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - app.yaml
  - persistentVolume.yaml
  - external-secrets.yaml
  - grafana-alerting-configmap.yaml
helmCharts:
  - name: kube-prometheus-stack
@@ -23,3 +23,18 @@ helmCharts:
    valuesFile: grafana-values.yaml
    includeCRDs: true
  - name: loki
    repo: https://grafana.github.io/helm-charts
    version: 6.29.0
    releaseName: loki
    namespace: prometheus
    valuesFile: loki-values.yaml
    includeCRDs: true
  - name: promtail
    repo: https://grafana.github.io/helm-charts
    version: 6.16.6
    releaseName: promtail
    namespace: prometheus
    valuesFile: promtail-values.yaml


@@ -0,0 +1,75 @@
# Loki SingleBinary mode - optimal for homelab
deploymentMode: SingleBinary

loki:
  auth_enabled: false
  commonConfig:
    replication_factor: 1
    path_prefix: /var/loki
  schemaConfig:
    configs:
      - from: 2024-01-01
        store: tsdb
        object_store: filesystem
        schema: v13
        index:
          prefix: index_
          period: 24h
  storage:
    type: filesystem
    filesystem:
      chunks_directory: /var/loki/chunks
      rules_directory: /var/loki/rules
  limits_config:
    reject_old_samples: false
    ingestion_rate_mb: 16
    ingestion_burst_size_mb: 32
    max_query_parallelism: 32
    volume_enabled: true

singleBinary:
  replicas: 1
  nodeSelector:
    kubernetes.io/hostname: master.tail2fe2d.ts.net
  persistence:
    enabled: true
    size: 50Gi
    storageClass: ""

# Disable distributed mode components
read:
  replicas: 0
write:
  replicas: 0
backend:
  replicas: 0

# Disable memcached (not needed for SingleBinary)
chunksCache:
  enabled: false
resultsCache:
  enabled: false

# Gateway for Loki access
gateway:
  enabled: true
  replicas: 1
  service:
    type: ClusterIP

# Disable tests and canary
test:
  enabled: false
lokiCanary:
  enabled: false

# Monitoring
monitoring:
  dashboards:
    enabled: false
  rules:
    enabled: false
  serviceMonitor:
    enabled: false
  selfMonitoring:
    enabled: false
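With the gateway exposed on ClusterIP, anything in-cluster can push logs over Loki's HTTP API at `/loki/api/v1/push`. A sketch in Python that builds the push request body (the `streams`/`values` JSON shape is Loki's documented push format; the gateway URL matches the service above; the POST itself is left as a comment since it needs cluster access):

```python
import json
import time

def loki_push_payload(labels: dict, lines: list) -> str:
    """Build a Loki /loki/api/v1/push request body.

    Loki expects {"streams": [{"stream": <labels>, "values": [[ts_ns, line], ...]}]},
    with timestamps as nanosecond strings.
    """
    now_ns = str(time.time_ns())
    return json.dumps({
        "streams": [{
            "stream": labels,
            "values": [[now_ns, line] for line in lines],
        }]
    })

payload = loki_push_payload({"app": "smoke-test"}, ["hello loki"])
# In-cluster, POST this with Content-Type: application/json to:
#   http://loki-gateway.prometheus.svc:80/loki/api/v1/push
print(payload)
```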


@@ -1,5 +1,35 @@
grafana:
  enabled: false

alertmanager:
  config:
    global:
      telegram_api_url: "https://api.telegram.org"
    route:
      group_by: ['alertname', 'cluster', 'service']
      group_wait: 10s
      group_interval: 10s
      repeat_interval: 12h
      receiver: 'telegram'
    receivers:
      - name: 'telegram'
        telegram_configs:
          - bot_token: '${TELEGRAM_BOT_TOKEN}'
            chat_id: ${TELEGRAM_CHAT_ID}
            parse_mode: 'HTML'
            message: |
              {{ range .Alerts }}
              <b>{{ .Labels.alertname }}</b>
              {{ if .Labels.severity }}<b>Severity:</b> {{ .Labels.severity }}{{ end }}
              <b>Status:</b> {{ .Status }}
              {{ if .Annotations.summary }}<b>Summary:</b> {{ .Annotations.summary }}{{ end }}
              {{ if .Annotations.description }}<b>Description:</b> {{ .Annotations.description }}{{ end }}
              {{ end }}
  alertmanagerSpec:
    secrets:
      - alertmanager-telegram-secret

prometheus:
  prometheusSpec:
    enableRemoteWriteReceiver: true
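Under the hood, Alertmanager's `telegram_configs` delivers notifications via the Telegram Bot API's `sendMessage` method. A Python sketch of the equivalent raw request, handy for verifying a bot token and chat id pair by hand (the token and alert text below are placeholders):

```python
from urllib.parse import urlencode

def telegram_send_message_url(api_url: str, bot_token: str) -> str:
    """Build the Bot API sendMessage endpoint, as Alertmanager does
    from global.telegram_api_url plus the configured bot_token."""
    return f"{api_url}/bot{bot_token}/sendMessage"

params = {
    "chat_id": "124317807",               # same chat id as the Grafana config
    "text": "<b>TestAlert</b>\nStatus: firing",
    "parse_mode": "HTML",                 # matches parse_mode above
}
url = telegram_send_message_url("https://api.telegram.org", "123456:dummy-token")
print(url)        # https://api.telegram.org/bot123456:dummy-token/sendMessage
print(urlencode(params))
```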


@@ -0,0 +1,37 @@
# Promtail - log collection agent for all cluster pods
config:
  clients:
    - url: http://loki-gateway.prometheus.svc:80/loki/api/v1/push

# DaemonSet - runs on every node
daemonset:
  enabled: true

# Tolerations for master/control-plane nodes
tolerations:
  - key: node-role.kubernetes.io/master
    operator: Exists
    effect: NoSchedule
  - key: node-role.kubernetes.io/control-plane
    operator: Exists
    effect: NoSchedule

# Init container to increase inotify limits
initContainer:
  - name: init-inotify
    image: docker.io/busybox:1.36
    imagePullPolicy: IfNotPresent
    command:
      - sh
      - -c
      - sysctl -w fs.inotify.max_user_instances=512
    securityContext:
      privileged: true

resources:
  requests:
    cpu: 50m
    memory: 64Mi
  limits:
    cpu: 200m
    memory: 128Mi
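The privileged init container raises `fs.inotify.max_user_instances` because each file promtail tails consumes an inotify instance, and the common default of 128 can be exhausted on busy nodes. A small Python sketch for checking the effective limit via procfs (the `proc_root` parameter is an illustrative addition so the lookup can be pointed at a test fixture instead of the real `/proc`):

```python
from pathlib import Path

def inotify_max_user_instances(proc_root: str = "/proc") -> int:
    """Return fs.inotify.max_user_instances as exposed via procfs (Linux)."""
    path = Path(proc_root) / "sys/fs/inotify/max_user_instances"
    return int(path.read_text().strip())

# On a node where the init container ran, this should report 512:
#   inotify_max_user_instances()
```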


@@ -16,7 +16,7 @@ spec:
  serviceAccountName: system-upgrade
  upgrade:
    image: rancher/k3s-upgrade
  version: v1.34.2+k3s1
  version: v1.34.3+k3s1
---
# Agent plan
apiVersion: upgrade.cattle.io/v1
@@ -39,5 +39,5 @@ spec:
  serviceAccountName: system-upgrade
  upgrade:
    image: rancher/k3s-upgrade
  version: v1.34.2+k3s1
  version: v1.34.3+k3s1


@@ -102,3 +102,22 @@ spec:
      port: 80
      targetPort: 8080
---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: jf-local-ingress
spec:
  # ingressClassName is a spec field, not an annotation
  ingressClassName: traefik
  rules:
    - host: tr.uk
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: qbittorrent
                port:
                  number: 80


@@ -33,11 +33,8 @@ persistence:
ingress:
  enabled: true
  className: traefik
  annotations:
    cert-manager.io/cluster-issuer: letsencrypt
    traefik.ingress.kubernetes.io/router.middlewares: kube-system-https-redirect@kubernetescrd
  hosts:
    - host: uk-desktop.uk
    - host: jf.uk
      paths:
        - path: /
          pathType: Prefix


@@ -1,4 +1,3 @@
data "authentik_flow" "default_authorization_flow" {
  slug = var.default_authorization_flow
}
@@ -299,7 +298,7 @@ resource "authentik_outpost" "outposts" {
    kubernetes_ingress_class_name  = null
    kubernetes_disabled_components = []
    kubernetes_ingress_annotations = {}
    kubernetes_ingress_secret_name = "authentik-outpost-tls"
    kubernetes_ingress_secret_name = "idm-tls"
  })
  depends_on = [

@@ -51,6 +51,9 @@ proxy_applications = {
    internal_host                = "http://secret-reader.k8s-secret.svc:80"
    internal_host_ssl_validation = false
    meta_description             = ""
    skip_path_regex              = <<-EOT
      /webhook
    EOT
    meta_icon                    = "https://img.icons8.com/ios-filled/50/password.png"
    mode                         = "proxy"
    outpost                      = "kubernetes-outpost"
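`skip_path_regex` tells the authentik proxy outpost to pass requests whose path matches one of the listed patterns through without authentication, so here `/webhook` calls reach the backing service directly. A Python sketch of the matching idea, assuming the usual unanchored regex search against the request path (the function name and anchoring behaviour are illustrative assumptions, not authentik's actual implementation):

```python
import re

# One pattern per line, as in the heredoc above.
SKIP_PATH_REGEXES = [re.compile(r"/webhook")]

def skips_auth(path: str) -> bool:
    """True if the request path matches any skip pattern (assumption:
    unanchored search, mirroring typical skip_path_regex behaviour)."""
    return any(rx.search(path) for rx in SKIP_PATH_REGEXES)

print(skips_auth("/webhook/gitea"))  # True
print(skips_auth("/admin"))          # False
```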