Monday, December 16, 2024

Terraform

Terraform

  1. Create "n" numbers of cloud apps / online apps, then link and configure them
  2. Infrastructure as Code (IaC)

File Structure:

Pre-Req:

  • main.tf - Main configuration file
  • dependencies.tf (depends_on) - Pre-requisites between resources
  • required_providers.tf - Declares the provider plugins used to communicate with the target platform
  • providers.tf - Configures the providers declared in required_providers

Declaration:

  • variables.tf - Declares input parameters

  • terraform.auto.tfvars - Passes values to variables.tf

  • locals.tf - Local variables for internal calculations

  • data.tf - Reads data from existing resources or files

  • outputs.tf - Prints outputs

Operations:

  1. data : gets information from existing resources
  2. resource : the smallest unit of infrastructure that can be managed
  3. module : a reusable group of resources, similar to a function call

Sample Code

Pre-Req: Docker is running locally

# Terraform Interface for Docker
terraform {
  required_providers {
    docker = { source = "kreuzwerker/docker" }
  }
}

# Use Default Config for above docker
provider "docker" {}

# Download nginx Image
resource "docker_image" "nginx" {
  name = "nginx"
}

# Define a map of container configurations
locals {
  containers = {
    c1 = { external_port = 8001 },
    c2 = { external_port = 8002 }
  }
}

# Create Nginx containers using a map
resource "docker_container" "nginx" {
  for_each = local.containers

  image = docker_image.nginx.image_id
  name  = each.key  # Unique name for each container

  ports {
    internal = 80
    external = each.value.external_port  # Use the external port from the map
  }
}

# Outputs
output "container_names" {
  value = [for c in docker_container.nginx : c.name]
}

Sunday, July 28, 2024

Kubernetes, Prometheus and Grafana

Kubernetes with Prometheus and Grafana

  • Helm Chart : Package manager for Kubernetes

  • Prometheus : Open-source monitoring and alerting toolkit (a Prometheus deployment needs dedicated storage space to store scraped data)

  • Grafana : Visualization and observability platform, here used on top of Prometheus

1. Pre-Req

minikube start 		# minikube start --memory='12g' --cpus='4' -n 3
kubectl config view | grep namespace #get current namespace
kubectl cluster-info
minikube addons list

# Extra
minikube addons enable csi-hostpath-driver
# Enable dedicated storage space to store scraped data
kubectl patch storageclass csi-hostpath-sc -p '{"metadata": {"annotations":{"storageclass.kubernetes.io/is-default-class":"true"}}}'
kubectl patch storageclass standard -p '{"metadata": {"annotations":{"storageclass.kubernetes.io/is-default-class":"false"}}}'

2. Helm

	Install helm - https://helm.sh/docs/intro/install/
	kubectl get ns
	kubectl create namespace monitoring
	kubectl get deployments --namespace=monitoring
	kubectl get pods --namespace=monitoring
	kubectl get configmap -n monitoring
	helm repo add prometheus-community https://prometheus-community.github.io/helm-charts
	helm repo list
	helm repo update

3. Prometheus

	helm install prometheus prometheus-community/prometheus --namespace monitoring
	kubectl get pods -n monitoring
	kubectl expose service prometheus-server --namespace monitoring --type=NodePort --target-port=9090 --name=prometheus-server-ext
	minikube ip
	kubectl get svc -n monitoring
	# Prometheus Server UI = minikube IP + NodePort of prometheus-server-ext, e.g. 192.168.49.2:30333

4. Grafana

	helm repo add grafana https://grafana.github.io/helm-charts
	helm install grafana grafana/grafana --namespace monitoring
	# run the command shown in the install output to get the admin password, e.g. SCaxM0KE4GRzxxxx
	
	kubectl expose service grafana --namespace monitoring --type=NodePort --target-port=3000 --name=grafana-ext
	kubectl get svc -n monitoring
	# Grafana UI = minikube IP + NodePort of grafana-ext, e.g. 192.168.49.2:32084

5. Integrate Prometheus with Grafana

	kubectl get svc -n monitoring # copy the grafana-ext PORT
	Open Grafana URL = minikube IP + 32084 [grafana-ext port]
	username = admin, password from the command output during installation
	Home > Add first Datasource > Prometheus > Add URL = http://prometheusURL > Save and test
	Home > Dashboard > New > Import > Enter 3662 > Load

Sunday, July 21, 2024

Multithreading in Python

'''
threading
1. No return value
2. Can run separate functions in separate threads
concurrent.futures
1. Returns results
2. map runs only one function per call
'''
from concurrent.futures import ThreadPoolExecutor
import threading
import time

l = []

def fn(i):
    print(f"{threading.get_native_id()}")
    time.sleep(i)
    l.append(i)
    return i

t1 = threading.Thread(target=fn, args=(1,))
t1.start()
t1.join()

with ThreadPoolExecutor(2) as e:
    res = e.map(fn, [2, 3])
    for i in res:
        print(i)
print(l)
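The note above says `map` can run only one function per call; `ThreadPoolExecutor.submit` removes that restriction while still returning results through futures. A minimal sketch (`square` and `cube` are illustrative names, not from the original):

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

def cube(x):
    return x * x * x

# submit schedules one call per future, so different functions can be mixed
with ThreadPoolExecutor(max_workers=2) as e:
    f1 = e.submit(square, 3)
    f2 = e.submit(cube, 2)
    print(f1.result(), f2.result())  # -> 9 8
```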

Async in Python

'''
Calling a normal (blocking) function from async code
1. to_thread
2. gather + to_thread
Calling an async function from async code
1. No need for to_thread
2. gather alone
'''
import asyncio
import threading
import time

# Normal (blocking) function
def normal(a):
    print("Running normal")
    time.sleep(a)
    print(f"{threading.get_native_id()}")
    print(f"done wait {a}")

# Async function
async def asyncFn(i):
    print("Running asyncFn")
    await asyncio.sleep(i)
    print(i)
    return i

async def main():
    t1 = asyncio.create_task(asyncFn(3))
    result1 = await t1
    await asyncio.gather(asyncFn(3), asyncFn(1))
    await asyncio.to_thread(normal, 1)
    await asyncio.gather(asyncio.to_thread(normal, 3), asyncio.to_thread(normal, 1))

asyncio.run(main())
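As a quick illustration of why `gather` matters: two awaited sleeps run concurrently, so the total wall time is close to the longest sleep rather than the sum. A minimal sketch (`nap` is an illustrative name, not from the original):

```python
import asyncio
import time

async def nap(i):
    # Each nap yields to the event loop while sleeping
    await asyncio.sleep(i)
    return i

async def main():
    start = time.monotonic()
    # Both naps are awaited concurrently
    results = await asyncio.gather(nap(0.2), nap(0.1))
    elapsed = time.monotonic() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(results)  # gather preserves argument order: [0.2, 0.1]
```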

Wednesday, May 22, 2024

Versioning

Versioning

  1. Calendar Versioning
  2. Semantic Versioning

Calendar Versioning

https://calver.org/

  • Ubuntu 16.04 = Ubuntu, April 2016 (YY.MM)
  • PyCharm 2022.3.2

Semantic Versioning

https://semver.org/

MAJOR_VERSION.MINOR_VERSION.PATCH_VERSION, e.g. 5.4.2 = major version 5, minor version 4, patch version 2.

  • Patch version = bug fixes
  • Minor version = new backward-compatible functionality
  • Major version = breaking changes

Notes:

  1. Initial development versions start at 0.1.0 (early development).
  2. Once the public API is stable, release version 1.0.0.

Pre-Release / Beta

  1. If a library is at version 2.8.9
  2. and there is a plan to release a beta for 3.0.0,
  3. the beta version will be 3.0.0-beta.1
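The precedence implied above (2.8.9 < 3.0.0-beta.1 < 3.0.0) can be sketched in Python. `parse_semver` is a hypothetical helper, not part of any library, and it simplifies the full spec (numeric pre-release identifiers like beta.10 compare lexically here):

```python
def parse_semver(v):
    # Split off an optional pre-release tag, e.g. "3.0.0-beta.1"
    core, _, pre = v.partition("-")
    major, minor, patch = (int(x) for x in core.split("."))
    # A version without a pre-release tag sorts after one with it
    return (major, minor, patch, pre == "", pre)

assert parse_semver("2.8.9") < parse_semver("3.0.0-beta.1")
assert parse_semver("3.0.0-beta.1") < parse_semver("3.0.0")
print(sorted(["3.0.0", "2.8.9", "3.0.0-beta.1"], key=parse_semver))
# -> ['2.8.9', '3.0.0-beta.1', '3.0.0']
```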

Sunday, January 7, 2024

java.lang.NoSuchMethodError: org.apache.spark.sql.AnalysisException.(Ljava/lang/String;Lscala/Option;Lscala/Option

"""
PYSPARK
Py4JJavaError: An error occurred while calling o560.save.
: java.lang.NoSuchMethodError: org.apache.spark.sql.AnalysisException.<init>(Ljava/lang/String;Lscala/Option;Lscala/Option;Lscala/Option;Lscala/Option;)V
at org.apac
"""
df.withColumn("key", lit(1))    # Wrong
# the lit value should be a string
df.withColumn("key", lit("a"))  # Correct