GeForce 9800 GTX vs GeForce GTX 660M
Find out whether it is worth upgrading your current GPU setup by comparing the GeForce 9800 GTX and GeForce GTX 660M. Here you can take a closer look at graphics card specs such as core clock speed, memory type and size, and display connectors. Price, overall benchmark results, and gaming performance are usually the deciding factors when choosing between the GeForce 9800 GTX and GeForce GTX 660M. Make sure the graphics card has compatible dimensions and will fit properly in your new or current computer case. These cards may also have different system power recommendations, so take that into account and upgrade your PSU if necessary.
GeForce GTX 660M is a Laptop Graphics Card
Note: the GeForce GTX 660M is used only in laptops. It has a lower GPU clock speed than its desktop counterpart, which results in lower power consumption but also roughly 10-30% lower gaming performance.
Main Specs
 | GeForce 9800 GTX | GeForce GTX 660M |
Power consumption (TDP) | 140 Watt | 50 Watt |
Interface | PCIe 2.0 x16 | MXM-B (3.0) |
Supplementary power connectors | 2x 6-pin | |
Memory type | GDDR3 | GDDR5 |
Maximum RAM amount | 512 MB | 2 GB |
Display Connectors | 2x DVI, 1x S-Video | No outputs |
- The GeForce 9800 GTX has 180% higher power consumption than the GeForce GTX 660M (140 W vs 50 W TDP; see the sketch below for how these differences are computed).
- The GeForce 9800 GTX uses a PCIe 2.0 x16 interface, while the GeForce GTX 660M uses MXM-B (3.0).
- The GeForce GTX 660M has 1.5 GB more memory than the GeForce 9800 GTX (2 GB vs 512 MB).
- The GeForce 9800 GTX is a desktop card, while the GeForce GTX 660M is a laptop chip.
- The GeForce 9800 GTX is built on the Tesla architecture, while the GeForce GTX 660M uses Kepler.
- The core clock of the GeForce 9800 GTX is 853 MHz higher than that of the GeForce GTX 660M (1688 MHz vs 835 MHz).
- The GeForce 9800 GTX is manufactured on a 65 nm process, while the GeForce GTX 660M uses a 28 nm process.
- The memory clock of the GeForce GTX 660M is 900 MHz higher than that of the GeForce 9800 GTX (2000 MHz vs 1100 MHz).
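The percentage and absolute differences listed above follow directly from the values in the Main Specs table. A minimal sketch in Python (the helper name percent_more and the variable names are illustrative, not from the source):

```python
# Percentage / absolute differences between the two cards, using the
# spec values from the Main Specs table above.

def percent_more(a, b):
    """How many percent larger a is compared to b."""
    return (a - b) / b * 100

tdp_9800gtx, tdp_660m = 140, 50        # Watt
vram_9800gtx, vram_660m = 0.5, 2.0     # GB (512 MB vs 2 GB)
core_9800gtx, core_660m = 1688, 835    # MHz
mem_9800gtx, mem_660m = 1100, 2000     # MHz

print(f"TDP: 9800 GTX draws {percent_more(tdp_9800gtx, tdp_660m):.0f}% more power")  # 180%
print(f"VRAM: GTX 660M has {vram_660m - vram_9800gtx:.1f} GB more memory")           # 1.5 GB
print(f"Core clock: 9800 GTX is {core_9800gtx - core_660m} MHz higher")              # 853 MHz
print(f"Memory clock: GTX 660M is {mem_660m - mem_9800gtx} MHz higher")              # 900 MHz
```

Running it reproduces the 180%, 1.5 GB, 853 MHz and 900 MHz figures quoted in the list above.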
Game Benchmarks
Assassin's Creed Odyssey
Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 720p | 1−2 | 16−18 |
medium / 1080p | 0−1 | 8−9 |
high / 1080p | 0−1 | 6−7 |
ultra / 1080p | − | 4−5 |
QHD / 1440p | 0−1 | 0−1 |
The average gaming FPS of the GeForce GTX 660M in Assassin's Creed Odyssey is about 1600% higher than that of the GeForce 9800 GTX.

Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 720p | 0−1 | 24−27 |
medium / 1080p | − | 12−14 |
high / 1080p | − | 12−14 |
ultra / 1080p | − | 10−11 |
QHD / 1440p | 0−1 | 0−1 |

Call of Duty: Warzone
Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 768p | 50−55 | 45−50 |
high / 1080p | 45−50 | 45−50 |
QHD / 1440p | 0−1 | 0−1 |
The average gaming FPS of the GeForce 9800 GTX in Call of Duty: Warzone is about 6% higher than that of the GeForce GTX 660M.

Counter-Strike: Global Offensive
Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 768p | 60−65 | 130−140 |
medium / 768p | 27−30 | 110−120 |
high / 768p | 16−18 | 80−85 |
ultra / 1080p | 7−8 | 50−55 |
QHD / 1440p | − | 27−30 |
4K / 2160p | − | 27−30 |
The average gaming FPS of the GeForce GTX 660M in Counter-Strike: Global Offensive is about 242% higher than that of the GeForce 9800 GTX.

Cyberpunk 2077
Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 768p | 70−75 | 55−60 |
medium / 1080p | 45−50 | 45−50 |
ultra / 1080p | 0−1 | 18−20 |
The average gaming FPS of the GeForce 9800 GTX in Cyberpunk 2077 is about 15% higher than that of the GeForce GTX 660M.

Dota 2
Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 768p | 45−50 | 80−85 |
medium / 768p | 10−11 | 55−60 |
ultra / 1080p | 0−1 | 30−35 |
The average gaming FPS of the GeForce GTX 660M in Dota 2 is about 141% higher than that of the GeForce 9800 GTX.

Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 720p | 0−1 | 18−20 |
medium / 1080p | − | 8−9 |
high / 1080p | − | 8−9 |
ultra / 1080p | − | 7−8 |
4K / 2160p | − | 3−4 |

Fortnite
Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 720p | 21−24 | 60−65 |
medium / 1080p | 0−1 | 21−24 |
high / 1080p | − | 14−16 |
ultra / 1080p | − | 10−11 |
The average gaming FPS of the GeForce GTX 660M in Fortnite is about 181% higher than that of the GeForce 9800 GTX.

Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 720p | 0−1 | 24−27 |
medium / 1080p | 0−1 | 14−16 |
high / 1080p | 0−1 | 12−14 |
ultra / 1080p | − | 10−12 |
QHD / 1440p | 0−1 | 1−2 |

Grand Theft Auto V
Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 768p | 18−20 | 50−55 |
medium / 720p | 12−14 | − |
medium / 768p | − | 45−50 |
high / 1080p | 0−1 | 12−14 |
ultra / 1080p | − | 6−7 |
QHD / 1440p | 0−1 | 0−1 |
The average gaming FPS of the GeForce GTX 660M in Grand Theft Auto V is about 173% higher than that of the GeForce 9800 GTX.

Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 720p | 0−1 | 12−14 |
medium / 1080p | − | 6−7 |
high / 1080p | − | 4−5 |
ultra / 1080p | − | 3−4 |
4K / 2160p | − | 0−1 |

Minecraft
Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 768p | 75−80 | 95−100 |
medium / 1080p | − | 90−95 |
high / 1080p | 27−30 | 90−95 |
ultra / 1080p | − | 80−85 |
The average gaming FPS of the GeForce GTX 660M in Minecraft is about 79% higher than that of the GeForce 9800 GTX.

PLAYERUNKNOWN'S BATTLEGROUNDS
Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 720p | 8−9 | 30−35 |
medium / 1080p | − | 18−20 |
high / 1080p | − | 16−18 |
ultra / 1080p | − | 14−16 |
The average gaming FPS of the GeForce GTX 660M in PLAYERUNKNOWN'S BATTLEGROUNDS is about 300% higher than that of the GeForce 9800 GTX.

Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 720p | 0−1 | 12−14 |
medium / 1080p | − | 10−12 |
ultra / 1080p | − | 7−8 |
QHD / 1440p | − | 0−1 |

Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 768p | 0−1 | 24−27 |
medium / 768p | − | 16−18 |
high / 1080p | − | 9−10 |
ultra / 1080p | − | 6−7 |
4K / 2160p | − | 6−7 |

World of Tanks
Preset / Resolution | GeForce 9800 GTX | GeForce GTX 660M |
low / 768p | 45−50 | 85−90 |
medium / 768p | 14−16 | 45−50 |
high / 768p | 12−14 | 35−40 |
ultra / 1080p | 0−1 | 18−20 |
The average gaming FPS of the GeForce GTX 660M in World of Tanks is about 128% higher than that of the GeForce 9800 GTX (a sketch of how such averages can be estimated follows below).
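The per-game percentage summaries compare average FPS across the tested presets. The exact averaging method is not stated here; assuming the midpoint of each FPS range is used as the per-setting average, a rough estimate for Counter-Strike: Global Offensive (using only the presets where both cards have data) looks like this:

```python
# Rough check of a "% more FPS" figure, assuming the midpoint of each
# listed FPS range stands in for the per-setting average. This is an
# illustrative sketch, not the exact methodology behind the tables.

def midpoint(fps_range):
    lo, hi = fps_range
    return (lo + hi) / 2

# Counter-Strike: Global Offensive rows (low/medium/high 768p, ultra 1080p)
gtx_9800 = [(60, 65), (27, 30), (16, 18), (7, 8)]     # GeForce 9800 GTX
gtx_660m = [(130, 140), (110, 120), (80, 85), (50, 55)]  # GeForce GTX 660M

avg_9800 = sum(midpoint(r) for r in gtx_9800) / len(gtx_9800)
avg_660m = sum(midpoint(r) for r in gtx_660m) / len(gtx_660m)

print(f"GTX 660M is {(avg_660m - avg_9800) / avg_9800 * 100:.0f}% faster on average")
```

This yields roughly 230%, in the same ballpark as the 242% quoted above; the published figure presumably averages over a slightly different set of runs.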
Full Specs
 | GeForce 9800 GTX | GeForce GTX 660M |
Architecture | Tesla | Kepler |
Code name | G92 | N13E-GE |
Type | Desktop | Laptop |
Release date | 28 March 2008 | 22 March 2012 |
Pipelines | 128 | 384 |
Core clock speed | 1688 MHz | 835 MHz |
Boost Clock | − | 950 MHz |
Transistor count | 754 million | 1,270 million |
Manufacturing process technology | 65 nm | 28 nm |
Texture fill rate | 43.2 billion/sec | 30.4 billion/sec |
Floating-point performance | 432.1 GFLOPS | 729.6 GFLOPS |
Length | 10.5" (26.7 cm) | − |
Memory bus width | 256-bit | 128-bit |
Memory clock speed | 1100 MHz | 2000 MHz |
Memory bandwidth | 70.4 GB/s | 64.0 GB/s |
Shared memory | - | |
DirectX | 11.1 (feature level 10_0) | 12 |
Shader Model | 4.0 | 5.1 |
OpenGL | 2.1 | 4.5 |
OpenCL | 1.1 | 1.1 |
Vulkan | N/A | 1.1.126 |
CUDA | + | + |
CUDA cores | 128 | 384 |
Bus support | PCI-E 2.0 | PCI Express 2.0, PCI Express 3.0 |
Height | 4.376" (11.1 cm) | |
SLI options | + | + |
Multi monitor support | + | |
HDMI | Via Adapter | + |
HDCP | + | |
Maximum VGA resolution | 2048x1536 | 2048x1536 |
Audio input for HDMI | S/PDIF | |
Bitcoin / BTC (SHA256) | 36 Mh/s | |
Laptop size | − | large |
Optimus | − | + |
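Several of the headline numbers in the Full Specs table can be reproduced from the other listed values. A minimal sketch, assuming 2 FLOPs per CUDA core per clock (one fused multiply-add), the 950 MHz boost clock for the GTX 660M, and a 2x effective data rate over the listed memory clocks:

```python
# Deriving the floating-point and memory-bandwidth figures from the
# other specs in the table, under the assumptions stated above.

def gflops(cores, clock_mhz):
    return cores * clock_mhz * 2 / 1000              # 2 FLOPs per core per clock

def bandwidth_gbs(bus_bits, mem_clock_mhz):
    return bus_bits / 8 * mem_clock_mhz * 2 / 1000   # 2x effective data rate

# GeForce 9800 GTX: 128 cores at the listed 1688 MHz clock, 256-bit bus at 1100 MHz
print(gflops(128, 1688))          # ~432.1 GFLOPS
print(bandwidth_gbs(256, 1100))   # ~70.4 GB/s

# GeForce GTX 660M: 384 cores at the 950 MHz boost clock, 128-bit bus at 2000 MHz
print(gflops(384, 950))           # ~729.6 GFLOPS
print(bandwidth_gbs(128, 2000))   # ~64.0 GB/s
```

Both results match the 432.1 / 729.6 GFLOPS and 70.4 / 64.0 GB/s figures listed in the table.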