Can't use my kubernetes provider if it's defined inside the module #51

Open
g-iannelli opened this issue Mar 1, 2023 · 0 comments
Labels: bug (Something isn't working)


@g-iannelli (Contributor)

cat main.tf
/**
 * Copyright (c) 2017-present SIGHUP s.r.l All rights reserved.
 * Use of this source code is governed by a BSD-style
 * license that can be found in the LICENSE file.
 */

terraform {
  experiments = [module_variable_optional_attrs]
  backend "s3" {
    bucket = "furyctl-issue-196"
    key    = "barebone/use1/cluster.json"
    region = "us-east-1"
  }

  required_providers {
    aws = {
      source = "hashicorp/aws"
    }
    kubernetes = {
      source = "hashicorp/kubernetes"
    }
  }
}

provider "aws" {
  region = "us-east-1"
  default_tags {
    tags = {
      env = "product-day-qa"
      githubIssue = "303"
      k8s = "eks-barebone"
    }
  }
}

provider "kubernetes" {
  host                   = data.aws_eks_cluster.fury.endpoint
  cluster_ca_certificate = base64decode(data.aws_eks_cluster.fury.certificate_authority[0].data)
  token                  = data.aws_eks_cluster_auth.fury.token
}
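(For context: the root configuration must also define the `fury` data sources referenced by the provider block above; they are not shown in this snippet. A minimal sketch of what they would look like, assuming the cluster name comes from `var.cluster_name` — names and inputs here are assumptions, not taken from the original files:)

```hcl
# Hypothetical companion data sources for the root kubernetes provider
# above (not shown in the original main.tf); inputs are assumed.
data "aws_eks_cluster" "fury" {
  name = var.cluster_name
}

data "aws_eks_cluster_auth" "fury" {
  name = var.cluster_name
}
```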

module "fury" {
  source = "~/.furyctl/eks-barebone/vendor/installers/eks/modules/eks"

  cluster_name               = var.cluster_name
  cluster_version            = var.cluster_version
  cluster_log_retention_days = var.cluster_log_retention_days
  network                    = var.network
  subnetworks                = var.subnetworks
  dmz_cidr_range             = var.dmz_cidr_range
  ssh_public_key             = var.ssh_public_key
  node_pools                 = var.node_pools
  node_pools_launch_kind     = var.node_pools_launch_kind
  tags                       = var.tags

  # Specific AWS variables.
  # Enables managing auth using these variables
  eks_map_users    = var.eks_map_users
  eks_map_roles    = var.eks_map_roles
  eks_map_accounts = var.eks_map_accounts
}
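Since Terraform 0.13, shared modules that contain their own `provider` blocks are considered a legacy pattern; the documented approach is to configure providers only in the root module and, if needed, pass a configuration down explicitly with the `providers` meta-argument. A sketch of that pattern for this module (it only works once the module itself no longer declares a `provider "kubernetes"` block):

```hcl
# Sketch: hand the root kubernetes provider configuration to the child
# module explicitly instead of configuring it inside the module.
module "fury" {
  source = "~/.furyctl/eks-barebone/vendor/installers/eks/modules/eks"

  providers = {
    kubernetes = kubernetes # the root provider configured above
  }

  # ... same variables as in the module block above ...
}
```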

cat .terraform/modules/fury/kubernetes.tf
data "aws_eks_cluster" "cluster" {
  name = module.cluster.cluster_id
}

data "aws_eks_cluster_auth" "cluster" {
  name = module.cluster.cluster_id
}

provider "kubernetes" {
  host                   = data.aws_eks_cluster.cluster.endpoint
  cluster_ca_certificate = base64decode(data.aws_eks_cluster.cluster.certificate_authority.0.data)
  token                  = data.aws_eks_cluster_auth.cluster.token
  load_config_file       = false
}
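For reference, the usual fix on the module side is to declare the provider *requirement* without configuring it, so that the caller's configuration is used. A minimal sketch:

```hcl
# Module-side sketch: declare which provider the module needs, but leave
# the configuration (host, token, CA) to the root module. Provider
# blocks inside shared modules are a legacy pattern in Terraform >= 0.13.
terraform {
  required_providers {
    kubernetes = {
      source = "hashicorp/kubernetes"
    }
  }
}
```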
terraform apply
module.fury.module.cluster.aws_cloudwatch_log_group.this[0]: Refreshing state... [id=/aws/eks/eks-barebone/cluster]
module.fury.aws_key_pair.nodes: Refreshing state... [id=eks-barebone-20230301083258319400000001]
module.fury.module.cluster.aws_security_group.cluster[0]: Refreshing state... [id=sg-0f0046bd94eebbe6d]
module.fury.aws_security_group.nodes: Refreshing state... [id=sg-0905fc5bc2b95d013]
module.fury.module.cluster.aws_security_group.workers[0]: Refreshing state... [id=sg-0e4e1c5b331e9c995]
module.fury.aws_security_group.node_pool[2]: Refreshing state... [id=sg-096878a7e921d4246]
module.fury.aws_security_group.node_pool[1]: Refreshing state... [id=sg-020ae1b0b10ac4ea5]
module.fury.aws_security_group.node_pool[0]: Refreshing state... [id=sg-0976063af499fbac7]
module.fury.module.cluster.aws_iam_policy.cluster_elb_sl_role_creation[0]: Refreshing state... [id=arn:aws:iam::492816857163:policy/eks-barebone/eks-barebone-elb-sl-role-creation20230301083258320600000002]
module.fury.module.cluster.aws_iam_role.cluster[0]: Refreshing state... [id=eks-barebone20230301083258320800000003]
module.fury.aws_security_group_rule.ssh_from_dmz_to_nodes: Refreshing state... [id=sgrule-2968004512]
module.fury.module.cluster.aws_security_group_rule.workers_ingress_self[0]: Refreshing state... [id=sgrule-2573013755]
module.fury.module.cluster.aws_security_group_rule.workers_egress_internet[0]: Refreshing state... [id=sgrule-2558094647]
module.fury.module.cluster.aws_security_group_rule.cluster_https_worker_ingress[0]: Refreshing state... [id=sgrule-1596022399]
module.fury.module.cluster.aws_security_group_rule.workers_ingress_cluster_https[0]: Refreshing state... [id=sgrule-4230566281]
module.fury.module.cluster.aws_security_group_rule.workers_ingress_cluster[0]: Refreshing state... [id=sgrule-2751700965]
module.fury.module.cluster.aws_security_group_rule.cluster_egress_internet[0]: Refreshing state... [id=sgrule-4246159717]
module.fury.module.cluster.aws_iam_role_policy_attachment.cluster_AmazonEKSVPCResourceControllerPolicy[0]: Refreshing state... [id=eks-barebone20230301083258320800000003-2023030108330150460000000a]
module.fury.module.cluster.aws_iam_role_policy_attachment.cluster_elb_sl_role_creation[0]: Refreshing state... [id=eks-barebone20230301083258320800000003-20230301083301493800000008]
module.fury.module.cluster.aws_iam_role_policy_attachment.cluster_AmazonEKSClusterPolicy[0]: Refreshing state... [id=eks-barebone20230301083258320800000003-20230301083301481900000007]
module.fury.module.cluster.aws_iam_role_policy_attachment.cluster_AmazonEKSServicePolicy[0]: Refreshing state... [id=eks-barebone20230301083258320800000003-20230301083301500700000009]
module.fury.module.cluster.aws_eks_cluster.this[0]: Refreshing state... [id=eks-barebone]
module.fury.module.cluster.aws_iam_role.workers[0]: Refreshing state... [id=eks-barebone2023030108451346050000000b]
module.fury.module.cluster.aws_iam_role_policy_attachment.workers_AmazonEKS_CNI_Policy[0]: Refreshing state... [id=eks-barebone2023030108451346050000000b-20230301084516827400000011]
module.fury.module.cluster.aws_iam_role_policy_attachment.workers_AmazonEKSWorkerNodePolicy[0]: Refreshing state... [id=eks-barebone2023030108451346050000000b-2023030108451680890000000f]
module.fury.module.cluster.aws_iam_role_policy_attachment.workers_AmazonEC2ContainerRegistryReadOnly[0]: Refreshing state... [id=eks-barebone2023030108451346050000000b-20230301084516808900000010]
module.fury.module.cluster.aws_iam_instance_profile.workers_launch_template[0]: Refreshing state... [id=eks-barebone2023030108451615630000000d]
module.fury.module.cluster.aws_iam_instance_profile.workers_launch_template[2]: Refreshing state... [id=eks-barebone2023030108451615640000000e]
module.fury.module.cluster.aws_iam_instance_profile.workers_launch_template[1]: Refreshing state... [id=eks-barebone2023030108451615610000000c]
module.fury.module.cluster.aws_launch_template.workers_launch_template[1]: Refreshing state... [id=lt-059ae5d3ea75f17ba]
module.fury.module.cluster.aws_launch_template.workers_launch_template[0]: Refreshing state... [id=lt-06a755105d5409348]
module.fury.module.cluster.kubernetes_config_map.aws_auth[0]: Refreshing state... [id=kube-system/aws-auth]
module.fury.module.cluster.aws_launch_template.workers_launch_template[2]: Refreshing state... [id=lt-0292a97d1426825e2]
module.fury.module.cluster.aws_autoscaling_group.workers_launch_template[0]: Refreshing state... [id=eks-barebone-infra20230301084520960500000019]
module.fury.module.cluster.aws_autoscaling_group.workers_launch_template[2]: Refreshing state... [id=eks-barebone-app20230301084520960500000018]
module.fury.module.cluster.aws_autoscaling_group.workers_launch_template[1]: Refreshing state... [id=eks-barebone-ingress2023030108452096060000001a]
╷
│ Warning: Experimental feature "module_variable_optional_attrs" is active
│
│   on main.tf line 8, in terraform:
│    8:   experiments = [module_variable_optional_attrs]
│
│ Experimental features are subject to breaking changes in future minor or patch releases, based on feedback.
│
│ If you have feedback on the design of this feature, please open a GitHub issue to discuss it.
│
│ (and one more similar warning elsewhere)
╵
╷
│ Error: Get "http://localhost/api/v1/namespaces/kube-system/configmaps/aws-auth": dial tcp [::1]:80: connect: connection refused
│
│   with module.fury.module.cluster.kubernetes_config_map.aws_auth[0],
│   on .terraform/modules/fury.cluster/aws_auth.tf line 65, in resource "kubernetes_config_map" "aws_auth":
│   65: resource "kubernetes_config_map" "aws_auth" {
│
╵

After removing the provider block from the module, Terraform uses the correct cluster endpoint:

cat .terraform/modules/fury/kubernetes.tf
data "aws_eks_cluster" "cluster" {
  name = module.cluster.cluster_id
}

terraform apply
module.fury.module.cluster.aws_cloudwatch_log_group.this[0]: Refreshing state... [id=/aws/eks/eks-barebone/cluster]
module.fury.module.cluster.aws_security_group.cluster[0]: Refreshing state... [id=sg-0f0046bd94eebbe6d]
module.fury.aws_security_group.nodes: Refreshing state... [id=sg-0905fc5bc2b95d013]
module.fury.aws_key_pair.nodes: Refreshing state... [id=eks-barebone-20230301083258319400000001]
module.fury.aws_security_group.node_pool[0]: Refreshing state... [id=sg-0976063af499fbac7]
module.fury.module.cluster.aws_security_group.workers[0]: Refreshing state... [id=sg-0e4e1c5b331e9c995]
module.fury.aws_security_group.node_pool[1]: Refreshing state... [id=sg-020ae1b0b10ac4ea5]
module.fury.aws_security_group.node_pool[2]: Refreshing state... [id=sg-096878a7e921d4246]
module.fury.module.cluster.aws_iam_policy.cluster_elb_sl_role_creation[0]: Refreshing state... [id=arn:aws:iam::492816857163:policy/eks-barebone/eks-barebone-elb-sl-role-creation20230301083258320600000002]
module.fury.module.cluster.aws_iam_role.cluster[0]: Refreshing state... [id=eks-barebone20230301083258320800000003]
module.fury.aws_security_group_rule.ssh_from_dmz_to_nodes: Refreshing state... [id=sgrule-2968004512]
module.fury.module.cluster.aws_security_group_rule.cluster_egress_internet[0]: Refreshing state... [id=sgrule-4246159717]
module.fury.module.cluster.aws_security_group_rule.cluster_https_worker_ingress[0]: Refreshing state... [id=sgrule-1596022399]
module.fury.module.cluster.aws_security_group_rule.workers_ingress_self[0]: Refreshing state... [id=sgrule-2573013755]
module.fury.module.cluster.aws_security_group_rule.workers_ingress_cluster_https[0]: Refreshing state... [id=sgrule-4230566281]
module.fury.module.cluster.aws_security_group_rule.workers_ingress_cluster[0]: Refreshing state... [id=sgrule-2751700965]
module.fury.module.cluster.aws_security_group_rule.workers_egress_internet[0]: Refreshing state... [id=sgrule-2558094647]
module.fury.module.cluster.aws_iam_role_policy_attachment.cluster_AmazonEKSServicePolicy[0]: Refreshing state... [id=eks-barebone20230301083258320800000003-20230301083301500700000009]
module.fury.module.cluster.aws_iam_role_policy_attachment.cluster_AmazonEKSVPCResourceControllerPolicy[0]: Refreshing state... [id=eks-barebone20230301083258320800000003-2023030108330150460000000a]
module.fury.module.cluster.aws_iam_role_policy_attachment.cluster_AmazonEKSClusterPolicy[0]: Refreshing state... [id=eks-barebone20230301083258320800000003-20230301083301481900000007]
module.fury.module.cluster.aws_iam_role_policy_attachment.cluster_elb_sl_role_creation[0]: Refreshing state... [id=eks-barebone20230301083258320800000003-20230301083301493800000008]
module.fury.module.cluster.aws_eks_cluster.this[0]: Refreshing state... [id=eks-barebone]
module.fury.module.cluster.aws_iam_role.workers[0]: Refreshing state... [id=eks-barebone2023030108451346050000000b]
module.fury.module.cluster.aws_iam_role_policy_attachment.workers_AmazonEKS_CNI_Policy[0]: Refreshing state... [id=eks-barebone2023030108451346050000000b-20230301084516827400000011]
module.fury.module.cluster.aws_iam_role_policy_attachment.workers_AmazonEKSWorkerNodePolicy[0]: Refreshing state... [id=eks-barebone2023030108451346050000000b-2023030108451680890000000f]
module.fury.module.cluster.aws_iam_role_policy_attachment.workers_AmazonEC2ContainerRegistryReadOnly[0]: Refreshing state... [id=eks-barebone2023030108451346050000000b-20230301084516808900000010]
module.fury.module.cluster.aws_iam_instance_profile.workers_launch_template[0]: Refreshing state... [id=eks-barebone2023030108451615630000000d]
module.fury.module.cluster.aws_iam_instance_profile.workers_launch_template[2]: Refreshing state... [id=eks-barebone2023030108451615640000000e]
module.fury.module.cluster.aws_iam_instance_profile.workers_launch_template[1]: Refreshing state... [id=eks-barebone2023030108451615610000000c]
module.fury.module.cluster.aws_launch_template.workers_launch_template[1]: Refreshing state... [id=lt-059ae5d3ea75f17ba]
module.fury.module.cluster.aws_launch_template.workers_launch_template[2]: Refreshing state... [id=lt-0292a97d1426825e2]
module.fury.module.cluster.aws_launch_template.workers_launch_template[0]: Refreshing state... [id=lt-06a755105d5409348]
module.fury.module.cluster.kubernetes_config_map.aws_auth[0]: Refreshing state... [id=kube-system/aws-auth]
module.fury.module.cluster.aws_autoscaling_group.workers_launch_template[0]: Refreshing state... [id=eks-barebone-infra20230301084520960500000019]
module.fury.module.cluster.aws_autoscaling_group.workers_launch_template[2]: Refreshing state... [id=eks-barebone-app20230301084520960500000018]
module.fury.module.cluster.aws_autoscaling_group.workers_launch_template[1]: Refreshing state... [id=eks-barebone-ingress2023030108452096060000001a]
╷
│ Warning: Experimental feature "module_variable_optional_attrs" is active
│
│   on main.tf line 8, in terraform:
│    8:   experiments = [module_variable_optional_attrs]
│
│ Experimental features are subject to breaking changes in future minor or patch releases, based on feedback.
│
│ If you have feedback on the design of this feature, please open a GitHub issue to discuss it.
│
│ (and one more similar warning elsewhere)
╵
╷
│ Error: Get "https://BEF030853566ED2382C53403F9949892.gr7.us-east-1.eks.amazonaws.com/api/v1/namespaces/kube-system/configmaps/aws-auth": dial tcp 10.10.9.152:443: i/o timeout
│
│   with module.fury.module.cluster.kubernetes_config_map.aws_auth[0],
│   on .terraform/modules/fury.cluster/aws_auth.tf line 65, in resource "kubernetes_config_map" "aws_auth":
│   65: resource "kubernetes_config_map" "aws_auth" {
│
╵
@g-iannelli added the bug label on Mar 1, 2023