jupyter notebook - Manage Google Dataproc preemptible-workers persistent disk size
I'm using Jupyter on a cluster created with Google Dataproc, and it's working well.
I tried to change the cluster "size" (machine type, boot disk size, number of workers, ...) to fit my needs, and it's working pretty well.
The main issue is that I don't know how to change the persistent disk size of the preemptible workers.
I'm using this command:
gcloud dataproc clusters create jupyter --project <my_project> --initialization-actions gs://dataproc-initialization-actions/jupyter/jupyter.sh --num-preemptible-workers 0 --master-boot-disk-size 25 --worker-boot-disk-size 10 --worker-machine-type n1-standard-1
I hoped the "--worker-boot-disk-size 10" option would also be applied to the preemptible workers, but it was not.
So, is there a way to change the preemptible workers' boot disk size?
Furthermore, will Google charge me for the preemptible workers' persistent disk usage?
The beta Dataproc gcloud channel offers --preemptible-worker-boot-disk-size, which sounds like the thing you want.
For example:
gcloud beta dataproc clusters create ... --preemptible-worker-boot-disk-size 500GB
Announced here: https://cloud.google.com/dataproc/release-notes/service#july_1_2016
As of the September 16, 2016 release, --preemptible-worker-boot-disk-size can be used without creating any preemptible VMs: https://cloud.google.com/dataproc/docs/release-notes/service#september_16_2016
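As a sketch, applying that flag to the command from the question might look like the line below; the project placeholder and disk sizes are illustrative, and since --num-preemptible-workers is 0 the flag just pre-configures the disk size used when preemptible workers are added later:
gcloud beta dataproc clusters create jupyter --project <my_project> --initialization-actions gs://dataproc-initialization-actions/jupyter/jupyter.sh --num-preemptible-workers 0 --master-boot-disk-size 25 --worker-boot-disk-size 10 --worker-machine-type n1-standard-1 --preemptible-worker-boot-disk-size 10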