
23.AWS-Redshift:


Database::
RDS ---------- DB - OLTP (Online Transaction Processing) - the application writing to the DB (regular DB transactions)
DynamoDB
ElastiCache
Neptune
Amazon Redshift ----- DB - OLAP (Online Analytical Processing).. analytics/reporting, like Flipkart, Amazon.. (a rough SQL contrast is sketched below)
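To make the OLTP vs OLAP distinction concrete, here is a minimal sketch, assuming a hypothetical orders table (the table and column names are made up for illustration only):

-- OLTP style (RDS): look up or update a single row by key
select * from orders where order_id = 1001;

-- OLAP style (Redshift): aggregate over the whole table for reporting
select order_city, count(*) as total_orders, sum(order_amount) as total_revenue
from orders
group by order_city
order by total_revenue desc;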
.........................
Amazon Redshift:::
google - aws redshift use case philips, google - transferring MySQL to Redshift,
.Clusters - (create a Subnet group first) - Launch cluster - cluster identifier, DB name, username, password - Continue - select node type - Continue - (once a machine is started it cannot be terminated, only stopped)
 - select VPC - subnet group (created earlier in the dashboard) - Publicly accessible: yes here, but in real time no - security group - Continue - Launch cluster - View all clusters - wait ~10 minutes..
 - getting the Endpoint - (17.07) -

.google - SQL Workbench for Redshift - Java required - open SQL Workbench - Manage Drivers - (in AWS, go to the cluster's Connect client page) - SQL Workbench can also be downloaded from here,
- JDBC driver (JDBC 4.2), download - select the cluster -
.- then go to SQL Workbench - File - Connect window - Manage Drivers - select the JAR file, pointing it to the downloaded JDBC driver location - give it a name - OK - OK,
.- give a profile name - URL (the JDBC URL), give the Redshift username and password, connect scripts - (select version(); idle time 15) - OK
 - autocommit - (a quick connection check is sketched below)
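A minimal sanity check to run in SQL Workbench once the profile connects (nothing is assumed beyond a working session):

-- quick post-connection checks
select version();                         -- Redshift engine version
select current_database(), current_user;  -- which database and user the profile is using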
.google - redshift sample data -

https://s3.amazonaws.com/awssampledb/LoadingDataSampleFiles.zip
https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html
https://docs.aws.amazon.com/redshift/latest/dg/tutorial-loading-run-copy.html#tutorial-loading-run-copy-replaceables
...
.....................................
..go to S3 - Create bucket - inside the bucket create a folder - download the sample data from the link above and upload it to this folder -
..go to IAM - Create user - attach S3 full access - note the credentials (access key / secret key) -

..if Redshift is up - open SQL Workbench - enter the JDBC URL - username, password - Test - connection is successful -

 - OK - run a script to check whether any tables already exist in the DB, run - see the message - add a tab - test similarly (33.35)
-..now creating a table - Run script - the other tables are created the same way...
  ..now run a script to list the tables created.. yes, tables exist.. i.e. the tables are now created in the DB (see the query below)
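One way to list what has been created so far, assuming the tables were created in the public schema, is to query Redshift's pg_table_def system view:

-- list the user tables created in the public schema
select distinct tablename
from pg_table_def
where schemaname = 'public'
order by tablename;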
 

.. now loading data from S3 into the part table in the DB...... via SQL Workbench..
 connect - using the S3 bucket path and the IAM user credentials, write the COPY script (shown at the end of these notes) and execute it in SQL Workbench...
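The script at the end of these notes passes the IAM user's access keys directly inside the COPY command. As a hedged alternative sketch, if an IAM role with S3 read access is attached to the cluster, the keys can be left out (the role name and account ID below are placeholders):

-- COPY using an IAM role attached to the cluster instead of access keys
copy part from 's3://<your-bucket-name>/load/part-csv.tbl'
iam_role 'arn:aws:iam::<account-id>:role/<redshift-s3-read-role>'
csv
null as '\000';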
.....................
Now creating reports.............
google - Tableau (42.51), Power BI Desktop,

.Power BI Desktop - download - open - Get Data - Amazon Redshift - Server (the cluster endpoint), Database (name) - DirectQuery - OK - username and password.. then Connect..
 - select your table - Load -....
Clusters - delete cluster - no final snapshot - delete...
.....video..51..25........

..ElastiCache:-- Redis, Memcached; (e.g. for the Ranveer and Deepika marriage.. sites like Instagram and Facebook cache that data for their websites.. IRCTC Tatkal tickets.. keeping it in cache beforehand..) google - instagram redis,


..Neptune::-- Graph database.

-----------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------------------






-- sample table created from SQL Workbench to test the connection
create table myevent(
eventid int,
eventname varchar(200),
eventcity varchar(30));

Output:
Table myevent created

Execution time: 0.28s



Loading data from S3 into the Redshift DB:


-- load the part table from the sample file in S3 using the IAM user's access keys
copy part from 's3://<your-bucket-name>/load/part-csv.tbl'
credentials 'aws_access_key_id=<Your-Access-Key-ID>;aws_secret_access_key=<Your-Secret-Access-Key>'
csv
null as '\000';
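A quick check after the COPY completes, assuming the part table was the one loaded from the sample file:

-- verify the load
select count(*) from part;
select * from part limit 10;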
------------------------------------------------------------------------------------------------------------------------------------