
23. AWS Redshift


Databases:
- RDS ---------- OLTP (Online Transaction Processing): applications writing to the DB (regular DB transactions)
- DynamoDB
- ElastiCache
- Neptune
- Amazon Redshift ----- OLAP (Online Analytical Processing), e.g. analytics at Flipkart, Amazon
.........................
Amazon Redshift:
Google: "aws redshift use case philips", "transferring MySQL to Redshift".
- Clusters - (create a subnet group first) - Launch cluster - cluster identifier, DB name, master username, password - Continue - select node type - Continue - (once the machine is started it cannot be terminated individually, only stopped)
- select VPC - subnet group (created earlier from the dashboard) - Publicly accessible: yes for the demo, but no in real time - security group - Continue - Launch cluster - View all clusters - wait about 10 minutes
- get the endpoint - (17.07)
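The console steps above can be sketched with boto3; all identifiers and values here (cluster name, node type, subnet group) are assumptions for illustration, and the actual API calls are commented out so the snippet runs without AWS credentials:

```python
# Sketch of the "Launch cluster" console steps as boto3 parameters.
# Every value below is a placeholder, not from the original notes.
cluster_params = {
    "ClusterIdentifier": "demo-cluster",       # cluster identifier
    "DBName": "dev",                           # database name
    "MasterUsername": "awsuser",               # master username
    "MasterUserPassword": "<your-password>",   # placeholder password
    "NodeType": "dc2.large",                   # "select node type"
    "NumberOfNodes": 1,
    "ClusterSubnetGroupName": "demo-subnet-group",
    "PubliclyAccessible": True,                # demo only; keep False in real time
}

# import boto3
# redshift = boto3.client("redshift")
# redshift.create_cluster(**cluster_params)
# # "Getting Endpoint" after the cluster becomes available:
# endpoint = redshift.describe_clusters(
#     ClusterIdentifier="demo-cluster"
# )["Clusters"][0]["Endpoint"]["Address"]
```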

- Google "sql workbench for redshift" - Java is required - open SQL Workbench - Manage Drivers - (in AWS, open the cluster's "Connect client" page; SQL Workbench can also be downloaded from there)
- download the JDBC driver (JDBC 4.2) - select the cluster
- then in SQL Workbench: File - Connect window - Manage Drivers - select the downloaded JDBC driver .jar file location - give it a name - OK - OK
- give a profile name - URL (the JDBC URL) - enter the Redshift username and password - connect scripts: (select version(); idle time 15) - OK
- enable autocommit
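The JDBC URL pasted into the connection profile is built from the cluster endpoint; a minimal sketch (the endpoint below is a made-up example, and 5439 is Redshift's default port):

```python
# Assemble the SQL Workbench JDBC URL from the cluster endpoint.
def jdbc_url(endpoint: str, port: int = 5439, database: str = "dev") -> str:
    return f"jdbc:redshift://{endpoint}:{port}/{database}"

# Hypothetical endpoint copied from the cluster's details page:
url = jdbc_url("demo-cluster.abc123.us-east-1.redshift.amazonaws.com")
print(url)
```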
- Google "redshift sample data":

https://s3.amazonaws.com/awssampledb/LoadingDataSampleFiles.zip
https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html
https://docs.aws.amazon.com/redshift/latest/dg/tutorial-loading-run-copy.html#tutorial-loading-run-copy-replaceables
...
.....................................
- Go to S3 - Create bucket - create a folder in the bucket - download the sample data from the link above and upload it to this folder
- Go to IAM - create a user - attach S3 full access - note the credentials
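The upload step can be sketched with boto3 as well; the bucket and folder names are assumptions, and the call itself is commented out so the snippet runs without credentials:

```python
# Sketch of uploading the sample file into a "load" folder in the bucket.
# Bucket name is a placeholder, not from the original notes.
bucket = "my-redshift-demo-bucket"
key = "load/part-csv.tbl"          # folder "load" inside the bucket
s3_uri = f"s3://{bucket}/{key}"    # the path later used by COPY

# import boto3
# boto3.client("s3").upload_file("part-csv.tbl", bucket, key)
print(s3_uri)
```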

- If Redshift is up: open SQL Workbench - enter the JDBC URL - username, password - Test - connection is successful

- OK - run a script to check which tables exist in the DB - run - see the message - add a tab - similarly test (33.35)
- now create a table - Run script - similarly create the other tables
- now run the table-listing script again - yes, the tables are there - i.e. we have created the tables in the DB

- Now load data from S3 into the part table of the DB, using SQL Workbench.
- Connect - then, using the S3 bucket path and the IAM user credentials, write the COPY script and execute it in SQL Workbench.
.....................
Now creating reports:
Google: tableau (42.51), Power BI Desktop.
- Power BI Desktop - download - open - Get Data - Amazon Redshift - Server (endpoint), Database (name) - DirectQuery - OK - username and password - then Connect
- select your table - Load
- Clusters - delete cluster - no snapshot - delete
- video 51.25
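The "delete cluster, no snapshot" cleanup step can also be sketched in boto3 (the identifier is the same assumed demo name; the call is commented out so the snippet runs offline):

```python
# Cleanup sketch matching "delete cluster - no snapshot - delete".
delete_params = {
    "ClusterIdentifier": "demo-cluster",   # assumed demo cluster name
    "SkipFinalClusterSnapshot": True,      # "no snapshot"
}

# import boto3
# boto3.client("redshift").delete_cluster(**delete_params)
```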

- ElastiCache: Redis, Memcached. (Example: during the Ranveer-Deepika wedding, sites like Instagram and Facebook cached that data for their websites; IRCTC Tatkal tickets are another case where the data is cached beforehand.) Google "instagram redis".


- Neptune: graph database.

-----------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------------------





create table myevent(
eventid int,
eventname varchar(200),
eventcity varchar(30));

Output:
Table myevent created

Execution time: 0.28s



Loading data from S3 into the Redshift DB:


copy part from 's3://<your-bucket-name>/load/part-csv.tbl'
credentials 'aws_access_key_id=<Your-Access-Key-ID>;aws_secret_access_key=<Your-Secret-Access-Key>'
csv
null as '\000';
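The COPY statement above embeds raw access keys; Redshift also accepts an IAM role ARN instead, which avoids putting secrets in the script. A small helper that assembles that variant (the bucket path and role ARN are placeholders, not values from these notes):

```python
# Build a Redshift COPY statement using IAM_ROLE instead of access keys.
def copy_stmt(table: str, s3_path: str, iam_role_arn: str) -> str:
    return (
        f"copy {table} from '{s3_path}'\n"
        f"iam_role '{iam_role_arn}'\n"
        "csv\n"
        "null as '\\000';"
    )

# Hypothetical bucket and role ARN for illustration:
sql = copy_stmt(
    "part",
    "s3://my-redshift-demo-bucket/load/part-csv.tbl",
    "arn:aws:iam::123456789012:role/myRedshiftRole",
)
print(sql)
```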
------------------------------------------------------------------------------------------------------------------------------------























































































