
feat: add pvcCleaner for events #3634

Closed · wants to merge 3 commits

Conversation

sumo-drosiek (Contributor):

Checklist

  • Changelog updated or skip changelog label added
  • Documentation updated
  • Template tests added for new features
  • Integration tests added or modified for major features

Signed-off-by: Dominik Rosiek <[email protected]>
@sumo-drosiek sumo-drosiek requested a review from a team as a code owner April 3, 2024 07:16
Review comment on the new CronJob's container args:

```yaml
- "bash"
- "/pvc-cleaner/pvc-cleaner.sh"
- "{{ template "sumologic.namespace" . }}"
- "app={{ template "sumologic.labels.app.events.statefulset" . }}"
```
Reviewer:
I don't like adding a whole separate CronJob manifest just to change this one parameter. Can we use a tpl function instead?
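For context, a `tpl`-style refactor would mean one shared named template parameterized per component instead of near-identical manifest files. A hypothetical sketch (the `sumologic.pvcCleaner.cronjob` template name and the `root`/`component` dict keys are made up for illustration; they are not the chart's actual helpers):

```yaml
{{/* _helpers.tpl: hypothetical shared CronJob definition */}}
{{- define "sumologic.pvcCleaner.cronjob" -}}
{{- $root := .root -}}
apiVersion: batch/v1
kind: CronJob
metadata:
  name: {{ $root.Release.Name }}-pvc-cleaner-{{ .component }}
  namespace: {{ template "sumologic.namespace" $root }}
spec:
  schedule: {{ $root.Values.pvcCleaner.job.schedule | quote }}
  # ...rest of the spec, with .component selecting labels and script args...
{{- end -}}

{{/* Each per-component file would then reduce to a guarded include, e.g.: */}}
{{- if .Values.pvcCleaner.events.enabled }}
{{ include "sumologic.pvcCleaner.cronjob" (dict "root" . "component" "events") }}
{{- end }}
```

The remaining wrinkle, as discussed below, is that the components are not fully symmetric (e.g. events has no HPA argument), so the shared template would still need per-component branches.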

sumo-drosiek (author):
Actually, the events variant differs more: it does not have an HPA.

```diff
diff --git a/deploy/helm/sumologic/templates/pvc-cleaner/cron-job-metrics.yaml b/deploy/helm/sumologic/templates/pvc-cleaner/cron-job-events.yaml
index 4b37d360..6e929967 100644
--- a/deploy/helm/sumologic/templates/pvc-cleaner/cron-job-metrics.yaml
+++ b/deploy/helm/sumologic/templates/pvc-cleaner/cron-job-events.yaml
@@ -1,11 +1,11 @@
-{{- if eq .Values.pvcCleaner.metrics.enabled true }}
+{{- if eq .Values.pvcCleaner.events.enabled true }}
 apiVersion: batch/v1
 kind: CronJob
 metadata:
-  name: {{ template "sumologic.metadata.name.pvcCleaner.metrics" . }}
+  name: {{ template "sumologic.metadata.name.pvcCleaner.events" . }}
   namespace: {{ template "sumologic.namespace"  . }}
   labels:
-    app: {{ template "sumologic.labels.app.pvcCleaner.metrics" . }}
+    app: {{ template "sumologic.labels.app.pvcCleaner.events" . }}
     {{- include "sumologic.labels.common" . | nindent 4 }}
 spec:
   schedule: {{ .Values.pvcCleaner.job.schedule | quote }}
@@ -13,9 +13,9 @@ spec:
     spec:
       template:
         metadata:
-          name: {{ template "sumologic.metadata.name.pvcCleaner.metrics" . }}
+          name: {{ template "sumologic.metadata.name.pvcCleaner.events" . }}
           labels:
-            app: {{ template "sumologic.labels.app.pvcCleaner.metrics" . }}
+            app: {{ template "sumologic.labels.app.pvcCleaner.events" . }}
 {{- include "sumologic.labels.common" . | nindent 12 }}
 {{- with .Values.sumologic.podLabels }}
 {{ toYaml . | indent 12 }}
@@ -54,8 +54,7 @@ spec:
             - "bash"
             - "/pvc-cleaner/pvc-cleaner.sh"
             - "{{ template "sumologic.namespace" . }}"
-            - "app={{ template "sumologic.labels.app.metrics.statefulset" . }}"
-            - "{{ template "sumologic.metadata.name.metrics.hpa" . }}"
+            - "app={{ template "sumologic.labels.app.events.statefulset" . }}"
             imagePullPolicy: {{ .Values.pvcCleaner.job.image.pullPolicy }}
             resources:
               {{- toYaml .Values.pvcCleaner.job.resources | nindent 14 }}
```

Reviewer (Contributor):

Worth mentioning that we already have separate manifests for metrics and logs: https://github.com/SumoLogic/sumologic-kubernetes-collection/tree/main/deploy/helm/sumologic/templates/pvc-cleaner

Reviewer:

I know, and I think 3 copies is enough to start making them DRY. It'd be best if we could have just one, with different parameters depending on config, but I'm not sure how realistic that is.

sumo-drosiek (author):

I can create an issue for that. Right now I'm on customer support, so I'd like to get this done quickly with low effort.

Reviewer:

If that happens, the Pod will fail to start in the new AZ, because the PVC name is the same, so the previous PVC would need to be manually deleted anyway.

sumo-drosiek (author):

This is what pvcCleaner is for: removing the PVC that can no longer be attached to the pod.

Reviewer:

pvcCleaner was originally for cleaning up unused PVCs after metadata downscaling. It was definitely not for fixing AZ-related volume mounting problems. And it doesn't run frequently enough by default to be able to do this effectively anyway.
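The downscaling cleanup described above can be sketched as a small selection loop: keep PVCs whose StatefulSet ordinal is below the current replica count, flag the rest for deletion. This is a hypothetical illustration only; the StatefulSet name, PVC naming scheme, and replica count below are made up, and the chart's real `pvc-cleaner.sh` may select PVCs differently (e.g. by label selector and pod attachment).

```shell
#!/usr/bin/env bash
# Hypothetical sketch of PVC cleanup after downscaling.
# All names and counts here are illustrative, not taken from the chart.
set -euo pipefail

statefulset="sumologic-otelcol-events"
replicas=1   # events metadata has no HPA, so a fixed replica count

# PVCs as a StatefulSet names them: <claim>-<statefulset>-<ordinal>
pvcs=(
  "file-storage-${statefulset}-0"
  "file-storage-${statefulset}-1"   # leftover from an earlier, larger scale
)

for pvc in "${pvcs[@]}"; do
  ordinal="${pvc##*-}"              # trailing ordinal from the PVC name
  if (( ordinal >= replicas )); then
    # The real script would delete the PVC here (via kubectl).
    echo "would delete: ${pvc}"
  fi
done
```

Note that this only handles the "unused after downscale" case; it does nothing for a PVC pinned to the wrong availability zone while its pod still wants it, which matches the point made above.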

sumo-drosiek (author):

Actually, I disagree: AFAIR the original problem was an AZ problem, but it was the result of not cleaning up unused PVCs. I think we can discuss it internally 😅

sumo-drosiek (author):

The outcome after testing is that this PR doesn't resolve the issue. Closing in favor of #3639.

sumo-drosiek (author):

Closing in favor of #3639
