trait A
class B extends A
class C extends B

object LowerBoundGeneric extends App {
  class Test[A >: B](val x: A) // A can be B or a supertype of B, not C
  val temp = new B() // new C() would fail
  val test: Test[B] = new Test[B](temp)
}

object CovariantGeneric extends App {
  class Test2[+A] {
    def run[B >: A](element: B) = print("working")
  }
  val temp2 = new C() // a C is accepted here, since C <: B and run takes B >: A
  new Test2[B]().run(temp2)
}
Apply
// The compiler converts f(a) into f.apply(a)
object ApplyTest extends App {
  class Foo(x: Int) {
    def apply(y: Int) = x + y
  }
  val f = new Foo(3)
  println(f(4)) // prints 7
}
Partial Function
/* A function is f: X -> Y. A partial function does not force f to map every
   element of X to an element of Y, i.e., several partial functions can handle
   different elements of the same data set.
   new PartialFunction[Input, Output]: if isDefinedAt is true, then apply is
   executed. Combinators: orElse, andThen */
object PartialTest extends App {
  val sample = 1 to 5
  val isEven = new PartialFunction[Int, String] {
    def apply(x: Int) = x + " is even"
    def isDefinedAt(x: Int) = x != 0 && x % 2 == 0
  }
  val isOdd: PartialFunction[Int, String] = {
    case x if x % 2 == 1 => x + " is odd"
  }
  val numbers = sample map (isEven orElse isOdd)
  print(numbers)
}
Companion Object
/* A companion object and its class can access each other's private members
   (fields and methods). They have the same name and live in the same file. */
object CompanionTest extends App {
  class Person { var name = "" }
  object Person {
    def apply(name: String): Person = {
      val p = new Person()
      p.name = name
      p
    }
  }
  print(Person("Fred Flintstone").name) // expands to Person.apply("Fred Flintstone")
}
Future
/* Anything inside Future {} runs in a different thread.
   The application's main thread does not stop for the Future to complete.
   The result of a Future is always a Try type: Success or Failure.
   To make the main thread wait, use scala.concurrent.Await.result(future, 15.seconds).
   Useful members: isCompleted, value, map, collect */
object FutureTest extends App {
  import scala.concurrent.Future
  import scala.concurrent.ExecutionContext.Implicits.global
  import scala.util.{Failure, Success}

  val f1: Future[Int] = Future { Thread.sleep(1000); 21 + 21 }
  while (!f1.isCompleted) { println("future operation completed ?? - " + f1.isCompleted) }
  println(f1.value)
  val f2: Future[Int] = f1.map(i => i + 1)
  f2.onComplete {
    case Success(value) => println(s"Got the callback, value = $value")
    case Failure(e)     => e.printStackTrace()
  }
}
Implicit
object ImplicitTest extends App {
  case class Person(name: String) {
    def greet = println(s"Hi, my name is $name")
  }
  implicit def fromStringToPerson(name: String): Person = Person(name)
  "Peter".greet
}
Copy the api-key and url under: Language Translator > Service Credentials
Replace api-key and url (more REST calls: Language Translator > Getting Started)

curl -X POST --user "apikey:{apikey}" \
  --header "Content-Type: text/plain" \
  --data "Language Translator translates text from one language to another" \
  "{url}/v3/identify?version=2018-05-01"

Open IBM Cloud Shell from the IBM Cloud toolbar
Run the new command
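The same identify call can be assembled in Python with the standard library. This is a sketch only: no request is sent, and the apikey value and the instance URL below are placeholders you must replace with your own Service Credentials, just as in the curl command.

```python
# Build (but do not send) the POST request shown in the curl command above.
import base64
import urllib.request

apikey = "PASTE-YOUR-APIKEY"  # placeholder
url = "https://api.eu-gb.language-translator.watson.cloud.ibm.com"  # placeholder instance url

text = "Language Translator translates text from one language to another"
# curl's --user "apikey:{apikey}" is HTTP Basic auth: base64("apikey:<key>")
auth = base64.b64encode(f"apikey:{apikey}".encode()).decode()

req = urllib.request.Request(
    f"{url}/v3/identify?version=2018-05-01",
    data=text.encode("utf-8"),
    headers={"Content-Type": "text/plain", "Authorization": f"Basic {auth}"},
    method="POST",
)
# urllib.request.urlopen(req) would send it once real credentials are in place.
```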
Cloudant Database
Login to IBM Cloud
Go to Catalog
Select and create a Cloudant instance
Open the Cloudant instance provisioned from Resource List > Services and Software > Cloudant
Click on Manage > Launch Dashboard
Create Database > test > Click on Create
Open test DB > Design Document > New Doc > add a new JSON key-value
eg:
{
"_id": "ce9575de70477c932e222bf5b6bd7fea",
"name": "deepak"
}
Click on Create Document
Let's fetch this document from the API
Under the Cloudant page > Service Credentials > Create New Role > Manager > Add
Open the new Service Credentials created, note down apikey, url
Open IBM Cloud Shell from the IBM Cloud toolbar
(https://cloud.ibm.com/docs/account?topic=account-iamtoken_from_apikey&interface=api)
$curl -X POST 'https://iam.cloud.ibm.com/identity/token' -H
'Content-Type: application/x-www-form-urlencoded' -d
'grant_type=urn:ibm:params:oauth:grant-type:apikey&apikey=<MY_APIKEY>'
Copy the Token generated
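The token endpoint replies with JSON whose access_token field holds the bearer token. A minimal sketch of extracting it (the response body here is a shortened, made-up example of that shape):

```python
# Parse a (fabricated, truncated) IAM token response and pull out the token.
import json

response_body = '{"access_token": "eyJraWQiOi-example", "token_type": "Bearer", "expires_in": 3600}'
token = json.loads(response_body)["access_token"]
print("Authorization: Bearer " + token)
```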
Run below commands
API_BEARER_TOKEN=<paste token here>
curl -H "Authorization: Bearer $API_BEARER_TOKEN" -X GET "{url}/test/{_id from cloudant}"
Other APIs:
curl -H "Authorization: Bearer $API_BEARER_TOKEN" -X PUT "{url}/{db}" # Create DB
curl -H "Authorization: Bearer $API_BEARER_TOKEN" -X PUT "{url}/{db}/{doc_id}" # Create Document
curl -H "Authorization: Bearer $API_BEARER_TOKEN" -X GET "{url}/test/{_id from cloudant}" # Read Document
You can create a proxy link for "https://eu-gb.functions.cloud.ibm.com/api/v1/namespaces/j.thepac%40gmail.com_dev/actions/hello-world/helloworld?blocking=true" by creating an API Gateway and providing the above URL.
Simple ETL from COS to DB2
Pre- Req:
DB2:
Make sure you have created a DB2 instance in IBM Cloud
Create a table in DB2 (do not insert any records):
CREATE TABLE table_name (col1 INT, col2 VARCHAR(255)); -- successfully created
In the Db2 UI > Data icon > Tables
Click on the schema
Check if the table is created
Test it
Syntax: SELECT * FROM schema.table;
Example: SELECT * FROM DXC02390.table_name;
Note down the schema name and table name
Click on the About icon in the DB2 UI
Note down the CRN from "<crn ..........::>"
COS:
Create a Cloud Object Storage (COS) instance in IBM Cloud
Create a bucket
Add a Parquet file with a schema similar to the table created above (use Apache Spark to create the file locally, then drag and drop)
Run the below command to copy the data from COS to DB2
Syntax:
SELECT * FROM <Object SQL URL> STORED AS PARQUET
INTO crn:xxxxxxx:/schema.table PARALLELISM 2
Example:
SELECT * FROM
cos://jp-tok/cloud-object-storage-7d-cos-standard-gsi/test2Cols.parquet
STORED AS PARQUET INTO crn:v1:bluemix:public:dashdb-for-transactions:eu-gb:a/e31b7085afca4ab8b6ac9b1077cd8af9:9257e5bc-49f0-43a1-b776-f7a0ff41b2b6::/DXC02390.MONOREPO_POC
PARALLELISM 2
Copy ETL using REST
Pre-Req: Simple ETL from COS to DB2
curl -X POST 'https://iam.cloud.ibm.com/identity/token' \
  -d '{"statement":"SELECT * FROM cos://jp-tok/cloud-object-storage-7d-cos-standard-gsi/test2Cols.parquet STORED AS PARQUET INTO cos://jp-tok/cloud-object-storage-7d-cos-standard-gsi/test2Cols_result"}'
Run Spark Job on COS Data
Login to IBM Cloud
Go to Catalog > Search for Watson Studio
Agree to terms and conditions > Click on Create
Click on Next > Next > Click Create Watson Studio
Click on Projects > New Project > Empty Project
Add to Project > Notebook
Select Runtime > Python (smallest configuration)
!pip -q install ibmcloudsql
import ibmcloudsql

cloud_api_key = "Create api key from Manage"
sql_crn = "crn of SQL Query Instance"
sql_cos_endpoint = "cosendpoint of bucket/result_prefix"
query = "right click on the COS parquet file and click on SQL Query"
1. Make sure Bazel is installed on your computer
2. Create a new folder as the project
3. cd inside the project folder
4. Create a new "WORKSPACE" file
5. Create a python/Program folder
6. cd to Program
7. Create a new BUILD file:

package(default_visibility = ["//visibility:public"])

py_binary(
    name = 'hello',       # any name
    main = 'hello.py',    # reference path, e.g. parentfolder.file
    srcs = ['hello.py'],  # filename
)

8. $ echo "print('hi')" > hello.py
9. Make sure you are in the folder containing the BUILD file
10. $ bazel run hello
Bazel has default settings for Python and Java, i.e., you can start with an empty WORKSPACE and run Python/Java source files.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")  # load http_archive before using it

skylib_version = "0.8.0"
http_archive(
    name = "bazel_skylib",
    type = "tar.gz",
    url = "https://github.com/bazelbuild/bazel-skylib/releases/download/{}/bazel-skylib.{}.tar.gz".format(skylib_version, skylib_version),
    sha256 = "2ef429f5d7ce7111263289644d233707dba35e39696377ebab8b0bc701f7818e",
)
Load
rules_scala: rules like scala_binary, scala_test, etc., to use in BUILD files
scala_config: configure the Scala version
scala_register_toolchains: for using the jar file built from one language as input to another
scala_repositories: to download the default libraries for Scala
Set Maven as a third-party repo
IntelliJ Setup
1. Make sure IntelliJ has the Bazel plugin installed
2. Import the above project as a Bazel project
3. Create new
4. Next (if you already have a .ijwb/ folder created, make sure it is deleted)
5. Done
Common Commands :
bazel build target #target can be name of build or //path of package:target
In this project WORKSPACE is empty because Native rules ship with the Bazel binary and do not require a load statement. Native rules are available globally in BUILD files.
But for Scala, Python, etc., you need to include load statements in the WORKSPACE and use them in BUILD files.
Steps:
Open link https://github.com/bazelbuild
Select the repos you need for creating your project
Example: if you want to add "bazel-skylib" (provides functions, file paths, and data types for BUILD files)
If you're just doing personal projects where nobody else will use the code, then you can use any name.
Don't make up something that starts with com. or net. or another top-level domain though, because that would imply you own the domain name (i.e., using com.john as your package name just because your name happens to be John is not a good idea).
The domain-name-backwards convention is there to prevent name collisions. Two different companies with the same product name will have different namespaces so everything works fine.
import com.google.gson.{Gson, JsonParser}

val json = """{"hello": "world", "age": 42}"""
val parser: JsonParser = new JsonParser()
val res = parser.parse(json).getAsJsonObject()
println(res.get("hello")) // "world"

// read from file
// val path = "/path/file.json"
// val lines = scala.io.Source.fromFile(path, "utf-8").getLines.mkString
Code 2 (Requires Lift Web - Maven):

import net.liftweb.json._
import net.liftweb.json.Serialization.write

case class Address(city: String, country: String)
case class Person(name: String, address: Address)

implicit val formats: Formats = DefaultFormats
print(write(Person("Sam", Address("NY", "US"))))
- Cluster: A collection of nodes (computers) represents a cluster.
- Pod:
  - Runs a container image
  - Has a unique IP address
  - Pods are the smallest unit in Kubernetes
  - Private, isolated network
- Container image: the code
- kubectl: the kubectl proxy command creates a proxy, forwarding communications into the cluster using the API.
LoadBalancer
- Provides load balancer functionality
- Only used in cloud services (AWS, GCP, Azure, ...)
- Each service needs a new LoadBalancer and a new IP address.
Ingress (similar to port-forward / a simple Service)
- Ingress is an object that allows access to your Kubernetes services from outside the cluster
- Needs an ingress controller running
- Ingress controller (will not run by default)
- Ingress resource
kubectl get pods --all-namespaces | grep ingress
kubectl get service --all-namespaces | grep ingress
kubectl get ingress ingress-name
kubectl delete ingress ingress-name
apiVersion: v1
kind: Pod
# Pod / Job
metadata:
name: hi
spec:
containers:
- name: hi
image: ubuntu
command: ['sh', '-c', 'echo "Hello, Kubernetes!" && sleep 3600']
# imagePullPolicy: Never
kubectl create -f hi.yml
kubectl exec -it hi -- /bin/bash
Activities
Activity : Pull local Docker image into minikube
Create a file with name Dockerfile
Add below lines :
FROM alpine:3.4
RUN apk update
RUN apk add vim
RUN apk add curl
open new terminal
minikube start
eval $(minikube docker-env)
docker build -t foo:1.0 .
docker images #Check if foo is created
kubectl run foo -it --image=foo:1.0
- $cat /proc/version
- $exit
kubectl get pods
kubectl get deployments
kubectl delete deployment foo
Activity: Create a "Hello world" Python program, push it to a remote Docker registry, and pull it
Pre-Req:
FROM python:3.7
RUN mkdir /app
WORKDIR /app
ADD . /app/
EXPOSE 5000
CMD ["python","-u", "/app/main.py"]
Steps:
- cd into apps
- sudo docker images #empty table
- sudo docker build -t any_name:v1 . # note there is '.' at the end
- sudo docker images
- sudo docker run -p 4000:80 any_name:v1
- sudo docker images # note down the image id/name; the tag is usually latest
- sudo docker login
- sudo docker tag TAG_id usn/repo_name:TAG_NAME_of_image
#docker tag 3a4677d31cde usn/test_repo:latest
- sudo docker push usn/repo_name:TAG_NAME_of_image
#docker push usn/repo:latest
- kubectl apply -f deployment.yaml #pull image from docker hub & create pod
- kubectl logs -f foo #hi
- kubectl get pods #shows all pods
- kubectl delete pod pod_name #to delete pod
# Status = CrashLoopBackOff: because we just have one print statement, the application exits right after printing "hi". The pod keeps trying to restart the container and has done so multiple times.
Activity : Send arguments from different terminal
kubectl attach redis_container -i
Activity :Forward ports from Pods to your local machine
SubQuery: A simple subquery doesn't use values from the outer query and is calculated only once:
SELECT id, first_name
FROM student_details
WHERE id IN (SELECT student_id FROM student_subjects WHERE subject = 'Science');

Correlated subquery - query to find all employees whose salary is above the average for their department:
SELECT employee_number, name
FROM employees emp
WHERE salary > (SELECT AVG(salary) FROM employees
                WHERE department = emp.department);
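Both subquery styles above can be tried locally with Python's built-in sqlite3. The table follows the employees example; the rows are made up for illustration:

```python
# Demonstrate the correlated subquery on an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE employees (employee_number INT, name TEXT, department TEXT, salary REAL)")
cur.executemany(
    "INSERT INTO employees VALUES (?, ?, ?, ?)",
    [(1, "Ann", "IT", 9000), (2, "Bob", "IT", 5000),
     (3, "Eve", "HR", 4000), (4, "Sam", "HR", 6000)],
)

# The inner query is re-evaluated for every outer row via emp.department
rows = cur.execute("""
    SELECT employee_number, name
    FROM employees emp
    WHERE salary > (SELECT AVG(salary) FROM employees
                    WHERE department = emp.department)
""").fetchall()
print(rows)  # Ann beats the IT average (7000), Sam beats the HR average (5000)
```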
Hold the Raspberry Pi Pico on-board BOOTSEL button (button on the board)
Plug it into USB (or pull down the RUN/Reset pin to ground)
It will appear as a USB disk drive
You can copy-paste the firmware onto the drive
For CircuitPython the drive reappears as CIRCUITPY; for MicroPython the drive simply disappears as the board reboots
Micro Python :
Install and open thonny IDE
IDE should automatically detect pico
Shell
from machine import Pin
led = Pin("LED", Pin.OUT)
led.value(1)
led.value(0)
Code - save as main.py on the "MicroPython" device

from machine import Pin
import time

led = Pin("LED", Pin.OUT)  # equivalent to Pin(25, Pin.OUT) on a non-W Pico
for i in range(1, 5):
    print(i)
    led.value(1)
    time.sleep(1)
    led.value(0)
    time.sleep(1)
Circuit Python:
Configure the Mu Editor so that code can be seen running in real time, i.e., as soon as the code is saved, the result is reflected in the LEDs directly.
sudo apt-get update
sudo apt-get -f upgrade
sudo apt install libfuse2
Download and install Mu Editor
Run it; it should automatically recognise the PICO and open code.py
Copy paste the below program into code.py
Save the file to the device
Blink Program

import board
import time
from digitalio import DigitalInOut, Direction

led = DigitalInOut(board.LED)
led.direction = Direction.OUTPUT

# Connect an LED between Pin 1 (GP0) and Pin 2
op = DigitalInOut(board.GP0)
op.direction = Direction.OUTPUT

while True:
    led.value = not led.value
    time.sleep(0.5)
    op.value = not op.value
    time.sleep(0.5)
Input Switch
import time
import board
import digitalio
button = digitalio.DigitalInOut(board.GP0)
button.switch_to_input(pull=digitalio.Pull.UP)
while True:
print(button.value)
time.sleep(0.5)
1. Connect the SEEEDuino XIAO to the PC using a Type-C cable
2. Quickly short the RST pins with a cable, 2 times
3. Once done successfully, the Arduino drive appears
4. Go to the website -
5. Download the latest .UF2 file
6. Copy and paste it inside the drive
7. Now the drive will be converted to CIRCUITPY
8. Create a file code.py
9. Copy paste the code below into code.py (same for all CircuitPython ICs)

import time
import board
from digitalio import DigitalInOut, Direction

led = DigitalInOut(board.D13)  # D13 is a built-in LED
# A1 - A10 can be used as well if you use a separate LED and a
# 100 - 400 ohm resistor; refer below for calculations
led.direction = Direction.OUTPUT

while True:
    led.value = True
    time.sleep(1)
    led.value = False
    time.sleep(1)

10. Save the file
11. The LED should start blinking
A simple LED circuit consists of an LED and a resistor. The resistor is used to limit the current being drawn and is called a current-limiting resistor. Without the resistor the LED would run at too high a voltage, resulting in too much current being drawn, which in turn would instantly burn out the LED, and likely also the GPIO port.
To calculate the resistor value we need to examine the specifications of the LED. Specifically, we need to find the forward voltage (VF) and the forward current (IF).
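As a worked example of that calculation, R = (V_supply - VF) / IF. The VF and IF figures below are typical red-LED values assumed for illustration, not from any specific datasheet:

```python
# Current-limiting resistor: R = (V_supply - VF) / IF
v_supply = 3.3   # Pico GPIO high level, volts
vf = 2.0         # assumed LED forward voltage, volts
if_amps = 0.010  # assumed LED forward current, 10 mA

r = (v_supply - vf) / if_amps
print(f"Minimum resistor: {r:.0f} ohms")  # 130 ohms; round up to the next standard value
```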