Transferring Files from a Local System to Google Cloud Platform
Google Cloud Platform is a suite of cloud computing services that runs on the same infrastructure that Google uses internally for its end-user products, such as Google Search, Gmail, and YouTube (Source: [Wikipedia]). It is written in Java, C++, Python, Go and Ruby. Have a look at the screenshot below to see what it looks like.
Creating Google Cloud Platform Account
- This is a very simple process. Visit https://cloud.google.com/ and click on Console in the top right corner. If you face any problem, have a look at the screenshot below.
- Once you click on Console, you will be asked to sign in. Use your Google account to sign in and add billing details. You will also get some credit when you create the account for the first time. Have a look at the screenshot below to get a better idea.
Transferring Files from a Local System into Google Cloud Platform
Step 1: Install the gcloud SDK on the local system. Please follow the article below to install the SDK.
Step 2: Type in the terminal
gcloud init
You will be prompted to visit a link that will present you with a key to copy and paste into the command line. To do it successfully, you must be logged in to your Google Cloud Platform account on the browser where the key is retrieved. Once you’ve entered the key into the command line, you can then select the project you want to work with or create a new project.
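The initialization flow above can be sketched as follows. This is a minimal sketch; `my-project-id` is a placeholder for your own project ID, not a value from this article.

```shell
# Initialize the SDK: prints a link to visit in the browser,
# then prompts for the verification code you copy back here
gcloud init

# Alternatively, authenticate and select a project step by step
gcloud auth login                     # browser-based sign-in
gcloud config set project my-project-id  # placeholder project ID
gcloud config list                    # verify the active account and project
```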
Step 3: Please type the below command
gcloud compute ssh instance-1
where instance-1 is the name of the instance. You can find the instance name in the Google Cloud Platform console under the selected project. Have a look at the screenshot below.
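If the default zone has not been set in your gcloud configuration, the SSH command will prompt for it; it can also be passed explicitly. The instance name and zone below are placeholders:

```shell
# List your instances to confirm the exact name and zone
gcloud compute instances list

# SSH into the VM, passing the zone explicitly
# (us-central1-a is an example zone; use the one shown by the list command)
gcloud compute ssh instance-1 --zone=us-central1-a
```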
Step 4: On successful completion of step 3, GCP keys are stored on the local system in the .ssh folder; the private key, in PEM format, is what gets added to FileZilla. FileZilla is software for transferring data from a local system to cloud platforms such as GCP and Amazon. You can read about and download FileZilla using the link below. Also have a look at the files inside the .ssh folder.
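You can inspect the generated key pair from the terminal. The file names below are the ones gcloud typically creates; they may vary by SDK version:

```shell
# The first `gcloud compute ssh` run generates a key pair here
ls -l ~/.ssh
# Typical contents:
#   google_compute_engine      <- private key (load this into FileZilla)
#   google_compute_engine.pub  <- public key pushed to the instance
#   known_hosts
```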
Note: This article was written with Ubuntu in mind; some steps may differ on other operating systems.
Step 5: Open FileZilla and go to Edit > Settings, then to SFTP under Connection. Add the .pem key obtained in step 4 here. Have a look at the screenshot below to understand things better.
Step 6: Everything is set now. On the FileZilla main screen, add the host as below:
Host: sftp://instance_ipaddress
[To get the instance IP address, go to the Google Cloud Platform console, click on Compute Engine, and then click on VM instances.] You will find the external IP there. If you face any problem, please see the screenshot below.
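The external IP can also be fetched from the terminal instead of the console. The instance name and zone below are placeholders:

```shell
# List VMs together with their external IPs
gcloud compute instances list

# Or print just the external IP of one instance
gcloud compute instances describe instance-1 --zone=us-central1-a \
  --format='get(networkInterfaces[0].accessConfigs[0].natIP)'
```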
Username: This is the username on the instance. In step 4, on success, you get connected to the instance, and you can get the username by running the command below in that terminal:
whoami
Great. When you click on Connect, the connection is established and you can transfer the data.
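As an aside, files can also be copied without FileZilla using gcloud's built-in scp wrapper. This is only a sketch; the file paths, instance name, and zone are placeholders:

```shell
# Copy a local file to the home directory on the VM
gcloud compute scp ./data.csv instance-1:~/ --zone=us-central1-a

# Copy a whole directory recursively
gcloud compute scp --recurse ./my-folder instance-1:~/ --zone=us-central1-a

# Copy a file from the VM back to the local system
gcloud compute scp instance-1:~/results.txt ./ --zone=us-central1-a
```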
Conclusion
I hope readers have enjoyed reading this article. You can also refer to my article on connecting Facebook Messenger with Google Dialogflow via the link below:
Connecting DialogFlow with Facebook Application | Messenger Chatbot
You can also ask questions in the comment section or email me at successindeed358@gmail.com.
In a nutshell, I would like to conclude that readers who practice along with this written tutorial will understand it better, because reading alone is not sufficient. Here are some additional links where readers can find and reach me.
Some Additional Notes
Sometimes users need to access Google Cloud Shell from the local terminal. This can be achieved using SSH. Please see the screenshot below for how to use this command.
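With an initialized SDK, the gcloud CLI provides a command for this. Depending on your SDK version it may live under the alpha or beta component:

```shell
# Open an interactive Cloud Shell session from the local terminal
gcloud cloud-shell ssh
```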
Creating a Key for the Compute Engine Default Service Account
I am attaching the screenshot below to show readers how to create a key for the Compute Engine default service account.
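The same can be done from the command line. The service account email below is a placeholder; the Compute Engine default service account usually has the form `PROJECT_NUMBER-compute@developer.gserviceaccount.com`:

```shell
# Find the Compute Engine default service account's email
gcloud iam service-accounts list

# Create a JSON key for it (email below is a placeholder)
gcloud iam service-accounts keys create key.json \
  --iam-account=123456789-compute@developer.gserviceaccount.com
```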
For a Google Cloud cheatsheet, please visit the link Google Cloud Cheatsheet. Regarding project names and IDs, note that "one can change the project name, but not the project ID or project number". Have a look at the screenshot below to understand how to download a file from Cloud Shell to the local system.
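Two command-line routes for getting a file out of Cloud Shell are sketched below; file names are placeholders, and availability may depend on your SDK version:

```shell
# Run inside Cloud Shell: trigger a browser download of a file
cloudshell download myfile.txt

# Or, from the local terminal, copy a file out of Cloud Shell over scp
gcloud cloud-shell scp cloudshell:~/myfile.txt localhost:~/myfile.txt
```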
Google Gcloud command line basic syntax
Saving log in bucket from virtual machine | Google Cloud Platform
GCP Identity and Access Management can be understood easily from the image below.
Learn about IP address and subnet masking
PODS Communicating with each other
Persistent Volume and Persistent Volume Claim
Deploy Code to Kubernetes
Deployment and Rolling Updates
Let us look at some analogies between Amazon Web Services and Google Cloud.
Case 1: Storage
Case 2: Instances Clusters and Nodes | GCP vs Amazon Web Services
Case 3: AI/ML Services Amazon vs GCP
There is some important information I would like to share with readers. Whenever you need to analyze logs, the GCP flow is as below:
- Logs are first ingested using Stackdriver Logging.
- These are then stored in Cloud Storage.
- These are passed to Cloud Pub/Sub.
- These are then passed to Dataflow, which uses the Apache Beam framework to process streaming data.
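The first hops of that pipeline can be wired up from the CLI. This is a hedged sketch: the topic name, sink name, and project ID below are placeholders, and the log filter is just one example:

```shell
# Create a Pub/Sub topic to receive exported log entries
gcloud pubsub topics create vm-logs

# Create a logging sink that routes matching log entries to the topic
# (my-project-id is a placeholder; the filter selects Compute Engine logs)
gcloud logging sinks create vm-log-sink \
  pubsub.googleapis.com/projects/my-project-id/topics/vm-logs \
  --log-filter='resource.type="gce_instance"'
```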
I mention this point because today's main way to enhance a business is to analyze data, and Google, via GCP, does this both smartly and securely.
Call to Action
I would like to thank readers for reading this article and hope that they gained some useful insights into Google Cloud. I would like to mention some names because I took their lectures on Udemy, which helped me in making some additional notes: Mattias Andersson, Nigel Poulton, and Ryan Kroonenburg. I would also urge readers to take their valuable Google Associate Cloud Engineer course on Udemy.
If you need to ask any question about this article, you may send your query to successindeed358@gmail.com. Again, thanks a lot for sparing some time to read this, and feel free to ask in the comment section.
Originally published at http://ersanpreet.wordpress.com on July 13, 2020.