#helmfile (2023-08)

https://github.com/helmfile/helmfile

Questions and discussion around helmfile https://github.com/roboll/helmfile and https://github.com/cloudposse/helmfiles

Archive: https://archive.sweetops.com/helmfile/

2023-08-02

2023-08-04

Pablo Silveira avatar
Pablo Silveira
#960 Additional values not working in Nested State

Operating system

Ubuntu 21.10 impish

Helmfile Version

0.155.0

Helm Version

version.BuildInfo{Version:"v3.12.1", GitCommit:"f32a527a060157990e2aa86bf45010dfb3cc8b8d", GitTreeState:"clean", GoVersion:"go1.20.4"}

Bug description

General description

I am using this feature: "Advanced Configuration: Nested States", specifically "Values files merged into the nested state's values", as explained in the helmfile documentation.

Scenario

Remote repository structure:

    sre-tools-helm-centralized
    ├── helmfiles
    │   └── monitoring
    │       └── global
    │           ├── kube-prometheus-stack.yaml
    │           └── loki-stack.yaml
    └── values
        ├── kube-prometheus-stack
        │   └── global
        │       ├── kube-prometheus-stack-alertmanager.yaml
        │       ├── kube-prometheus-stack-alertmanager.yaml.gotmpl
        │       ├── kube-prometheus-stack-grafana.yaml.gotmpl
        │       ├── kube-prometheus-stack-prometheus.yaml.gotmpl

And I am working in this folder:

    01_monitoring
    ├── helmfile.d
    │   └── helmfile-v3.yaml
    └── values
        └── kube-prometheus-stack
            └── kube-prometheus-stack-alertmanager.yaml

File contents are detailed below.

Result:

I am not getting values merged in this way.

Additional attempts at resolving and further debugging

• I tried to reference an absolute path in this additional values file, but it did not work.
• In the debug output I can see that this file was rendered.
• A possibility could be that this file's values are overwritten by the values of the remote repository, but that is only a supposition.

Example helmfile.yaml

    # This is the yaml file I am deploying; in the same relative location I have the
    # additional values file referenced:
    # ../values/kube-prometheus-stack/kube-prometheus-stack-alertmanager.yaml
    helmBinary: helm3
    missingFileHandler: Error
    namespace: {{ requiredEnv "NAMESPACE" }}
    helmfiles:
      - path: "git:[email protected]/sre-prod/sre-tools/sre-tools-helm-centralized@/helmfiles/monitoring/global/kube-prometheus-stack.yaml?ref=v1.1.1"
        selectors:
          - name=kube-prometheus-stack
        values:
          - ../values/kube-prometheus-stack/kube-prometheus-stack-alertmanager.yaml

The following yaml is located in the other repo: git:[email protected]/sre-prod/sre-tools/sre-tools-helm-centralized@/helmfiles/monitoring/global/kube-prometheus-stack.yaml?ref=v1.1.1


    helmBinary: helm3
    namespace: monitoring
    repositories:
      - name: prometheus-community
        url: https://prometheus-community.github.io/helm-charts
    releases:
      - name: kube-prometheus-stack
        chart: prometheus-community/kube-prometheus-stack
        version: 41.7.3
        values:
          - ../../../values/kube-prometheus-stack/global/kube-prometheus-stack-alertmanager.yaml
          - ../../../values/kube-prometheus-stack/global/kube-prometheus-stack-alertmanager.yaml.gotmpl
          - ../../../values/kube-prometheus-stack/global/kube-prometheus-stack-grafana.yaml.gotmpl
          - ../../../values/kube-prometheus-stack/global/kube-prometheus-stack-prometheus.yaml.gotmpl
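For reference, the nested-state values feature used here is documented to merge the listed files into the nested state's values (the child helmfile's .Values), rather than directly into each release's chart values, so the child state normally has to reference those values itself (for example from a .gotmpl values template) before they reach the chart. A minimal sketch of that pattern, with illustrative paths and file names:

    # parent helmfile.yaml: pass a values file down to the nested state
    helmfiles:
      - path: child/helmfile.yaml
        values:
          - overrides.yaml   # merged into the child state's .Values

and in the child state:

    # child/helmfile.yaml: consume the inherited state values explicitly,
    # e.g. via a .gotmpl values template that references .Values.someKey
    releases:
      - name: kube-prometheus-stack
        chart: prometheus-community/kube-prometheus-stack
        values:
          - values.yaml.gotmpl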

Error message you’ve seen (if any)

DEBUG

Steps to reproduce

https://github.com/Pela2silveira/misc/blob/main/helmfile/helmfile-nestated-state-add-values

Working Helmfile Version

no knowledge of this working

Relevant discussion

No response

2023-08-22

Xu Pengfei avatar
Xu Pengfei

Hi folks! https://kcl-lang.io/docs/user_docs/guides/working-with-k8s/mutate-manifests/helmfile-kcl-plugin Let me introduce you to the helmfile KCL plugin, which allows you to write KCL code in helmfile.yaml to edit or validate existing Helm charts. In addition, KCL is a configuration language mainly targeting Kubernetes scenarios; see its documentation for more details. Feedback is welcome!

Brandon avatar
Brandon

Helmfile releases define a chart and a version. For those of you who use a single helmfile across multiple environments (dev/prod for example), how are you managing your chart between environments? As an example, dev and prod might both use chartA at version 0.1.0. When chartA is updated by the chart maintainer to 0.1.1, you may want to test 0.1.1 on dev but not on prod. How do you handle this case? One possible solution is using go templates in the helmfile.yaml, i.e.

  - name: my-release
    namespace: my-namespace
    chart: chartA
    {{ if prod }}
    version: 0.1.0
    {{ else }}
    version: 0.1.1
    {{ end }}

(it is not valid syntax, but you get the idea)

Does anyone use a solution other than that? If so, can you describe your solution?

yxxhero avatar
yxxhero

You can try to set a value in the environment values, for example chart_version: <version>:

  - name: my-release
    namespace: my-namespace
    chart: chartA
    version: {{ .Values.chart_version }}
Sebastian Ponovescu avatar
Sebastian Ponovescu

That’s how we do it as well; I find it cleaner. It’s just that we add a default value there, in case it is not defined, so that on prod you get the “well established” default chart while in dev you can experiment; then you just change the default.
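Putting the two suggestions together, a minimal sketch of the environment-values approach (the environment names, file paths, and 0.1.x versions below are illustrative):

    # helmfile.yaml
    environments:
      dev:
        values:
          - env/dev.yaml    # contains e.g. chart_version: 0.1.1
      prod:
        values:
          - env/prod.yaml   # omits chart_version, so the default below applies
    ---
    releases:
      - name: my-release
        namespace: my-namespace
        chart: chartA
        # fall back to the well-established default when the environment does not pin a version
        version: {{ .Values.chart_version | default "0.1.0" }}

The --- separator ensures the releases part is rendered after the environment values are loaded; running helmfile with -e dev then picks up 0.1.1 while -e prod stays on the default.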

Brandon avatar
Brandon

Yes, syntactically that is cleaner. Thanks both.

javierlgac avatar
javierlgac

have you tried with a ternary similar to:

{{ ternary .Values.stable_version .Values.dev_version (eq .Values.env "prod")}}

2023-08-23

2023-08-25

Nituica Radu avatar
Nituica Radu

Hello, we are trying to understand how we can override the helm values. For example, we have a helmfile like this declared in the components:

    releases:
      - name: kyverno
        namespace: kyverno
        labels:
          kubernetes.io/metadata.name: kyverno
        chart: kyverno/kyverno
        version: 2.6.5
        values:
          - kyverno_values.yaml

And then in kyverno_values.yaml:

    podLabels:
      domain: appli
      productname: kyverno

How can we modify the labels with stacks? We tried like this, but with no effect:

    components:
      helmfile:
        kyverno:
          vars:
            podLabels:
              domain: test
              productname: kyvernotest

We would like to apply the modified labels when we install kyverno for that stack: "atmos helmfile apply kyverno -s dienv"
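One pattern that may help here (a sketch, not a verified fix: it assumes atmos passes the component's stack vars to helmfile as state values, e.g. via a generated varfile and helmfile's --state-values-file flag) is to have the component read those values from a templated values file instead of the static kyverno_values.yaml, keeping the current labels as defaults via helmfile's get template function:

    # kyverno component helmfile (sketch): reference a templated values file
    releases:
      - name: kyverno
        namespace: kyverno
        chart: kyverno/kyverno
        version: 2.6.5
        values:
          - kyverno_values.yaml.gotmpl

and in kyverno_values.yaml.gotmpl (hypothetical file name):

    # stack-provided labels win; the previous static values remain the defaults
    podLabels:
      domain: {{ .Values | get "podLabels.domain" "appli" }}
      productname: {{ .Values | get "podLabels.productname" "kyverno" }}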

yxxhero avatar
yxxhero

could you post a demo? thanks.

2023-08-26

2023-08-29

josephgardner avatar
josephgardner

Is there a reason the roboll/helmfile repo isn’t archived? That would fix SEO and prevent people creating new issues in the wrong repo

z0rc3r avatar

Yes, it’s a long one; read the linked ticket in the roboll/helmfile readme.

z0rc3r avatar

As for why, you can ping roboll, who isn’t very responsive.

josephgardner avatar
josephgardner

Nothing conclusive in the linked ticket. Seems like the new maintainers are asking for it to be archived.

z0rc3r avatar


Nothing conclusive
That’s your answer

Erik Osterman (Cloud Posse) avatar
Erik Osterman (Cloud Posse)

Yeah, it’s odd to me that roboll hasn’t taken any active interest in this, given that he created the project but is no longer actively involved. Anyway, it’s of course his choice. Also, no one else has the rights to update the repo settings; he’s the only admin.

josephgardner avatar
josephgardner

I don’t understand the rationale for updating the readme to say the repo is dead and link to the new repo, while refusing to archive it.

z0rc3r avatar

Code changes, including the readme, can be done by maintainers with push access. Archival can only be done by the owner. There is a single owner of this repo, and he isn’t willing to do so.

josephgardner avatar
josephgardner


who isn’t willing to do so
Is that opinion stated publicly anywhere, or just in back-channel communication? Obviously he can do whatever he wants; I was just looking for the specific reason.

z0rc3r avatar
Comment on #1824 Transfer this repository to a dedicated GitHub organization

I don’t follow the project much anymore. If there are permissions I can grant for the repo i am happy to do so. I don’t have any interest in moving the repo, so otherwise please feel free to fork.

Thanks again to @mumoshu for all of the maintainership and hard work, and good luck with the future of the project.

josephgardner avatar
josephgardner

No mention of archive. Perhaps that point wasn’t understood. I’ll try contacting him.

josephgardner avatar
josephgardner
Comment on #2178 Please archive this repo if it's no longer being used

Hi @roboll,

I hope you’re doing well. I wanted to chime in on the request to archive the original Helmfile repository, roboll/helmfile. Your contributions to the Helm ecosystem have been invaluable, and I appreciate the work you’ve put into Helmfile.

Considering the transition to the new repository, helmfile/helmfile, I believe that archiving the original repository could bring several benefits to the Helmfile community:

Clear Communication: Archiving the original repository would provide clear communication that development has shifted to the new repository, reducing confusion and misdirection.

Searchable Archive: GitHub’s archive feature would enable users to search and access historical conversations and problem-solving discussions, ensuring that valuable knowledge is retained.

Enhanced Focus: Directing users to the active repository, helmfile/helmfile, allows the new maintainers and contributors to focus their efforts on advancing the project, rather than managing outdated issues and pull requests.

Unarchiving Flexibility: GitHub allows easy unarchiving of repositories. Should the need arise in the future, the original repository can be revived.

While I understand that you may have your reasons for keeping the original repository accessible, I hope you’d consider the collective benefits of archiving it. The steps to archive a repository are straightforward:

  1. Under your repository name, click on “Settings.”
  2. Under the “Danger Zone” section, select “Archive this repository.”
  3. Enter the repository name and click “I understand the consequences, archive this repository.”

By archiving the original repository, you’d contribute to a more streamlined and focused community experience. Users will be able to find the active repository more easily, and the maintainers can dedicate their energies to improving Helmfile without the distraction of issues being posted in the old repository.

Thank you for your time, your contributions to Helmfile, and your consideration of this proposal. Your continued support of the Helm ecosystem is greatly appreciated.


2023-08-30
