inf1.xlarge

The inf1.xlarge instance is in the Machine Learning ASIC Instances family with 4 vCPUs, 8 GiB of memory, and up to 25 Gbps of network bandwidth, starting at $0.228 per hour.

Pricing

On Demand          $0.228/hr
Spot               $0.046/hr
1-Year Reserved    $0.144/hr
3-Year Reserved    $0.11/hr


Family Sizes

Size           vCPUs   Memory (GiB)
inf1.xlarge    4       8
inf1.2xlarge   8       16
inf1.6xlarge   24      48
inf1.24xlarge  96      192
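The family sizes above scale memory linearly with vCPU count. A quick sanity check of that ratio, using the figures from the table:

```python
# Each inf1 size in the table provides exactly 2 GiB of memory per vCPU.
sizes = {
    "inf1.xlarge": (4, 8),
    "inf1.2xlarge": (8, 16),
    "inf1.6xlarge": (24, 48),
    "inf1.24xlarge": (96, 192),
}

for name, (vcpus, mem_gib) in sizes.items():
    assert mem_gib / vcpus == 2.0, f"{name} breaks the 2 GiB/vCPU ratio"
print("all inf1 sizes provide 2 GiB of memory per vCPU")
```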

Instance Variants

Variant
inf1
inf2


Instance Details

Compute                      Value
vCPUs                        4
Memory (GiB)                 8
Memory per vCPU (GiB)        2
Physical Processor           Intel Xeon Platinum 8275CL (Cascade Lake)
Clock Speed (GHz)            N/A
CPU Architecture             x86_64
GPU                          1
GPU Average Wattage          0 W
GPU Architecture             AWS Inferentia
Video Memory (GiB)           8
GPU Compute Capability       0
FPGA                         0
ffmpeg FPS                   26
CoreMark Iterations/Second   65773.246732
NUMA Architecture                Value
Uses NUMA Architecture           No
NUMA Node Count                  N/A
Max NUMA Distance                N/A
Cores per NUMA Node (Avg)        N/A
Threads per NUMA Node (Avg)      N/A
Memory per NUMA Node (Avg MB)    N/A
L3 Cache per NUMA Node (Avg MB)  N/A
L3 Cache Shared                  N/A
Networking                  Value
Network Performance (Gbps)  up to 25
Enhanced Networking         true
IPv6                        true
Placement Group
Trunking Compatible         true
Branch Interface            38
Storage                                Value
EBS Optimized                          true
Max Bandwidth (Mbps) on EBS            4750
Max Throughput (MB/s) on EBS           593.75
Max I/O Operations/Second (IOPS)       20000
Baseline Bandwidth (Mbps) on EBS       1190
Baseline Throughput (MB/s) on EBS      148.75
Baseline I/O Operations/Second (IOPS)  4000
Devices                                0
Swap Partition                         false
NVMe Drive                             false
Disk Space (GiB)                       0
SSD                                    false
Initialize Storage                     false
Instance Store Read IOPS               N/A
Instance Store Write IOPS              N/A
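The EBS throughput figures above follow directly from the bandwidth figures: megabits per second divided by 8 gives megabytes per second. A minimal check using the numbers from this table:

```python
# Convert EBS bandwidth (megabits/s) to throughput (megabytes/s).
def mbps_to_mb_per_s(mbps: float) -> float:
    return mbps / 8  # 8 bits per byte

# Figures from the storage table above.
assert mbps_to_mb_per_s(4750) == 593.75   # max bandwidth -> max throughput
assert mbps_to_mb_per_s(1190) == 148.75   # baseline bandwidth -> baseline throughput
print("EBS throughput figures are consistent with the bandwidth figures")
```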
Amazon                    Value
Generation                current
Instance Type             inf1.xlarge
Bare Metal                false
Family                    Machine Learning ASIC Instances
Name                      INF1 Extra Large
Elastic Map Reduce (EMR)  false