GCP Compute Instance
n1-highmem-16
The n1-highmem-16 is a high-memory instance from GCP Compute Engine's general-purpose N1 series, with 16 vCPUs and 104 GB of memory, powered by Intel processors. It is well suited to web servers, mid-size databases, and backend applications. On-demand pricing starts at $0.9464/hr in us-central1.
Pricing by Region
| Region | On-Demand ($/hr) | Spot ($/hr) | Reserved 1Y |
|---|---|---|---|
| us-central1 | $0.946 | $0.346 | — |
| us-east1 | $0.946 | $0.346 | — |
| us-west1 | $0.946 | $0.346 | — |
| me-west1 | $1.041 | $0.293 | — |
| europe-west1 | $1.041 | $0.296 | — |
| europe-west4 | $1.042 | $0.413 | — |
| northamerica-northeast1 | $1.042 | $0.495 | — |
| northamerica-northeast2 | $1.042 | $0.347 | — |
| europe-north1 | $1.042 | $0.344 | — |
| us-east4 | $1.066 | $0.232 | — |
| us-west4 | $1.066 | $0.321 | — |
| asia-east1 | $1.096 | $0.205 | — |
| asia-south1 | $1.137 | $0.239 | — |
| asia-south2 | $1.137 | $0.217 | — |
| us-west2 | $1.137 | $0.439 | — |
| us-west3 | $1.137 | $0.346 | — |
| europe-central2 | $1.145 | $0.225 | — |
| asia-southeast1 | $1.167 | $0.202 | — |
| asia-northeast1 | $1.213 | $0.348 | — |
| asia-northeast2 | $1.213 | $0.432 | — |
| asia-northeast3 | $1.213 | $0.412 | — |
| europe-west2 | $1.219 | $0.251 | — |
| europe-west3 | $1.219 | $0.321 | — |
| europe-west6 | $1.240 | $0.356 | — |
| asia-southeast2 | $1.273 | $0.223 | — |
| asia-east2 | $1.324 | $0.317 | — |
| australia-southeast1 | $1.343 | $0.389 | — |
| australia-southeast2 | $1.343 | $0.193 | — |
| southamerica-east1 | $1.502 | $0.260 | — |
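Regional price spreads like the one above are easy to compare programmatically. A minimal sketch using a few rows transcribed from the table (the dict below is a hand-copied subset, not an API response):

```python
# (on-demand, spot) hourly prices in USD, copied from the pricing table above.
prices = {
    "us-central1": (0.946, 0.346),
    "asia-southeast1": (1.167, 0.202),
    "australia-southeast2": (1.343, 0.193),
    "southamerica-east1": (1.502, 0.260),
}

# Cheapest region by on-demand rate.
cheapest_od = min(prices, key=lambda r: prices[r][0])
# Cheapest region by spot rate (often a different region entirely).
cheapest_spot = min(prices, key=lambda r: prices[r][1])

print(cheapest_od)    # us-central1
print(cheapest_spot)  # australia-southeast2
```

Note that the cheapest on-demand region and the cheapest spot region need not coincide: spot prices track spare capacity per region, not the on-demand list price.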
Frequently Asked Questions
- What is the n1-highmem-16?
- The n1-highmem-16 is a high-memory instance from GCP Compute Engine's general-purpose N1 series, with 16 vCPUs and 104 GB of memory, powered by Intel processors. It is well suited to web servers, mid-size databases, and backend applications. On-demand pricing starts at $0.9464/hr in us-central1.
- What is the cheapest region for n1-highmem-16?
- us-central1, at $0.946/hr on-demand (us-east1 and us-west1 match this rate).
- How much does n1-highmem-16 cost per month?
- Starting at $690.89/mo on-demand in us-central1, or from $140.78/mo with spot pricing in the cheapest spot region (australia-southeast2), assuming roughly 730 billable hours per month.
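The monthly figures follow from the hourly rates under the common cloud-billing convention of ~730 hours per month (24 × 365 / 12). A quick sketch; small differences from the published figures come from the table's rates being rounded to three or four decimals:

```python
HOURS_PER_MONTH = 730  # 24 * 365 / 12, the usual cloud-billing convention

on_demand_hr = 0.9464  # us-central1 on-demand rate from the page above
spot_hr = 0.193        # cheapest spot rate in the table (australia-southeast2)

print(f"${on_demand_hr * HOURS_PER_MONTH:.2f}/mo")  # $690.87/mo
print(f"${spot_hr * HOURS_PER_MONTH:.2f}/mo")       # $140.89/mo
```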