jupyter notebook - Manage Google Dataproc preemptible workers' persistent disk size


I'm using Jupyter on a cluster created with Google Dataproc, and it's working well.

I tried to change the cluster "size" (machine type, boot disk size, number of workers, ...) to fit my needs, and that's working pretty well.

The main issue is that I don't see how to change the persistent disk size of the preemptible workers.

I'm using this command:

gcloud dataproc clusters create jupyter --project <my_project> \
  --initialization-actions gs://dataproc-initialization-actions/jupyter/jupyter.sh \
  --num-preemptible-workers 0 \
  --master-boot-disk-size 25 \
  --worker-boot-disk-size 10 \
  --worker-machine-type n1-standard-1

I hoped the "--worker-boot-disk-size 10" option would also be applied to the preemptible workers, but it was not.

So, is there a way to change the preemptible workers' boot disk size?

Furthermore, will Google charge me for the preemptible workers' persistent disk usage?

The beta Dataproc gcloud channel offers --preemptible-worker-boot-disk-size, which sounds like exactly what you want.

For example:

gcloud beta dataproc clusters create ... --preemptible-worker-boot-disk-size 500GB

It was announced here: https://cloud.google.com/dataproc/release-notes/service#july_1_2016

As of the September 16, 2016 release, --preemptible-worker-boot-disk-size can be used without creating any preemptible VMs: https://cloud.google.com/dataproc/docs/release-notes/service#september_16_2016
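As a sketch, the command from the question could be adapted along these lines (the cluster name, project placeholder, and sizes are taken from the question; the preemptible disk size value here is only illustrative):

gcloud beta dataproc clusters create jupyter --project <my_project> \
  --initialization-actions gs://dataproc-initialization-actions/jupyter/jupyter.sh \
  --num-preemptible-workers 0 \
  --master-boot-disk-size 25 \
  --worker-boot-disk-size 10 \
  --worker-machine-type n1-standard-1 \
  --preemptible-worker-boot-disk-size 10   # illustrative value; preset even with 0 preemptible workers

Note that this goes through the gcloud beta channel, and per the September 16, 2016 release note the preemptible boot disk size can be preset even when --num-preemptible-workers is 0, so preemptible workers added later pick it up.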

