google cloud platform - Data missing from my new CloudSQL read replica following restore on the master

I just had a colleague blow away a table by accidentally running a unit test in an environment with a real database (which is a good opportunity to add a sanity check to that particular piece of code ;-)). No problem, I restored my database from backup using instructions here. The problem is that my newly created read replica is lacking the restored data. The data IS present in the master; it's just missing in the new read replica (yes, I deleted the read replica before restoring on the master)...Read more
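A minimal sketch of recreating the read replica programmatically through the Cloud SQL Admin API with the google-api-python-client library, assuming application default credentials; the project, instance, region, and tier names below are hypothetical, not taken from the question.

from googleapiclient import discovery

PROJECT = "my-project"        # hypothetical project ID
MASTER = "prod-master"        # master instance that was restored from backup
REPLICA = "prod-replica-1"    # name for the freshly created read replica

sqladmin = discovery.build("sqladmin", "v1beta4")

# A read replica is created with instances().insert by pointing
# masterInstanceName at the restored master.
body = {
    "name": REPLICA,
    "masterInstanceName": MASTER,
    "region": "us-central1",
    "settings": {"tier": "db-n1-standard-1"},
}
operation = sqladmin.instances().insert(project=PROJECT, body=body).execute()
print("Started operation:", operation["name"])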

Why IN_CLOSE_NOWRITE on Google Compute Engine w/Debian Stretch and pyinotify?

I tried setting up Fail2Ban on Google Cloud Compute Engine virtual machines based on stock Google images. Fail2Ban installs with apt-get and runs fine, but the rules never fire. When I verify the rules with fail2ban-regex, the rules do match lines in the actual log files. After some investigation I found out that if I run pyinotify on a file that I change myself I see:
<Event dir=False mask=0x20 maskname=IN_OPEN name='' path=/tmp/test pathname=/tmp/test wd=1 >
<Event dir=False mask=0x2 maskname=IN_MODIFY name='' path=/tmp/test pathname=/tmp...Read more
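A minimal sketch of the kind of pyinotify watcher that produces output like the above, assuming pyinotify is installed and /tmp/test exists; the watched path and handler name are illustrative.

import pyinotify

class PrintEvents(pyinotify.ProcessEvent):
    def process_default(self, event):
        # Print every event so masks such as IN_MODIFY vs IN_CLOSE_NOWRITE can be compared.
        print(event)

wm = pyinotify.WatchManager()
wm.add_watch('/tmp/test', pyinotify.ALL_EVENTS)
notifier = pyinotify.Notifier(wm, PrintEvents())
notifier.loop()  # blocks until interrupted

Editing /tmp/test from another shell (for example with echo hello >> /tmp/test) then shows which event masks are actually delivered.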

data science - Google Cloud Datalab - how to create a VM for training to be used by multiple users

I am new to Datalab. Our team of data scientists asked us to have 5 individual VMs, one for each of them, and another VM for training which should be accessible by any of them. I have searched a lot, and in Google's documentation it is written: “Cloud Datalab instances are single-user environments, therefore each member of your team needs their own instance.” So, my question is, what is the best practice for having a training Datalab machine, and how can I give multiple users access to it?...Read more

Google Cloud Datalab Minimum System Requirements

Is it possible to create a Google Cloud Datalab instance with f1-micro, a 20GB boot disk, and a 10GB persistent disk? Or is the minimum requirement the default of n1-standard-1 and a 200GB Standard Persistent Disk? I tried to create a datalab instance with the following command:
datalab create --image-name practice --disk-size-gb 10 --idle-timeout "30m" --machine-type f1-micro practice
Although the VM is created, datalab gets stuck at waiting for Datalab to be available at localhost. It works when I go with the default command of datalab create practi...Read more

google cloud platform - Data Prep preprocessing to ML Engine

Say I have code on App Engine reading Gmail attachments; after parsing, the data goes to Cloud Datastore, through Data Prep recipes and steps, is stored back into Datastore, and is then predicted on by an ML Engine TensorFlow model. Reference: Is this all achievable through Dataflow? EDIT 1: Is it possible to export the Data Prep steps and use them as preprocessing before an ML Engine TensorFlow model?...Read more

Looking to obtain file information using Google Cloud Functions

I am attempting to gather some data on files that I am receiving; I am looking for file size, number of records, and the date the file was received. These are .CSV files. At this point I have a function that will pick up a file when it lands in the bucket and move it to a new bucket. I am assuming that there are some sort of commands that can get this metadata from the file I am already grabbing. Ultimately I would like to write these results to a file, a table, or an e-mail every time a file arrives. A note: this is my very first G...Read more
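A minimal sketch of how a background Cloud Function triggered by an object-finalize event on the bucket could collect that metadata, assuming the Python runtime and the google-cloud-storage client; the function name and the line-counting approach are illustrative.

from google.cloud import storage

def on_file_arrival(event, context):
    """Triggered by google.storage.object.finalize on the bucket."""
    bucket_name = event['bucket']          # bucket the file landed in
    file_name = event['name']              # object name
    size_bytes = int(event['size'])        # size reported in the event payload (a string)
    received = event['timeCreated']        # upload timestamp

    # Record count is not part of the object metadata; the CSV has to be read to count rows.
    # download_as_text() is available in recent google-cloud-storage releases.
    blob = storage.Client().bucket(bucket_name).blob(file_name)
    record_count = blob.download_as_text().count('\n')

    print(f"{file_name}: {size_bytes} bytes, ~{record_count} rows, received {received}")

The print output appears in the function's logs; writing the same values to a table or sending an e-mail would hang off the same handler.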

Selecting a Google Cloud tool for executing a demanding Python script

Where should I execute a Python script that processes ~7 GB of data that is available on GCS? The output will be written to GCS as well. The script was debugged in a Datalab notebook with a small dataset. I would like to scale up the processing. Should I allocate a big machine? I have no idea what size (resources) of machine is needed. Many thanks, Eila. Just in case: Dataflow can't work for that kind of data processing...Read more
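A minimal sketch of running the job on a single Compute Engine VM with the google-cloud-storage client, assuming the transformation can work line by line; the bucket and object names, and the transformation itself, are placeholders.

from google.cloud import storage

client = storage.Client()
src = client.bucket('my-input-bucket').blob('raw/data.csv')
dst = client.bucket('my-output-bucket').blob('processed/data.csv')

# Stage the ~7 GB input on local disk instead of holding it all in memory,
# so the VM mainly needs enough disk rather than enormous RAM.
src.download_to_filename('/tmp/data.csv')

with open('/tmp/data.csv') as fin, open('/tmp/out.csv', 'w') as fout:
    for line in fin:
        fout.write(line.upper())  # placeholder for the real per-record processing

dst.upload_from_filename('/tmp/out.csv')

With this layout the machine size is driven mostly by the processing step itself rather than by the file size.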

google cloud platform - datalab SSH key ERROR: (gcloud.compute.ssh) [/usr/bin/ssh]

Recently when I try to connect to my Datalab VM from Cloud Shell, I see the following error even though I am Project Owner:
Connecting to datalab-vm1-1.
This will create an SSH tunnel and may prompt you to create an rsa key pair. To manage these keys, see https://cloud.google.com/compute/docs/instances/adding-removing-ssh-keys
Waiting for Datalab to be reachable at http://localhost:8081/
Permission denied (publickey).
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
Connection broken
Attempting to reconnect...
Any ide...Read more

google cloud platform - why am I able to SSH from command line, but not from `datalab connect`?

I've been playing with Google Datalab and it's hard to get a connection to the notebook. I can create/launch an instance successfully, but usually the notebook is unavailable.
$ datalab create [instance]
Connecting to [instance].
This will create an SSH tunnel and may prompt you to create an rsa key pair. To manage these keys, see https://cloud.google.com/compute/docs/instances/adding-removing-ssh-keys
Waiting for Datalab to be reachable at http://localhost:8081/
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255].
Connection broken
At...Read more

Cannot create image with packer in Google Cloud due to service account error

I have created a service account in Gcloud and installed Gcloud on my Mac. Whenever I run my Packer template, it complains about an account, and I have no idea where it is coming from.
Packer template:
{
  "builders": [
    {
      "type": "googlecompute",
      "account_file": "/Users/Joe/Downloads/account.json",
      "project_id": "rare-truck-123456",
      "source_image": "centos-7-v20180129",
      "zone": "us-west1-a",
      "ssh_username": "centos"
    }
  ]
}
Error:
➜ packer git:(master) ✗ packer build release_google_image.json
googlecompute output wil...Read more

google cloud platform - Custom 301 redirects for static site using GCP Storage

I am considering whether to host my static website on an AWS S3 bucket or on Google Cloud Platform Storage, and, as I already use GCP and it is generally cheaper, I would really like to utilize that option. My issue is that I often need to create custom 301 redirects for my site, like:
https://example.com/page -> https://anotherexample.com/another-page
S3 seems to handle this well, but I'm not finding any documentation on custom redirects for GCP. Is this possible with GCP Storage buckets yet?...Read more