Kepler lineup

Discuss anything you like on this forum.
Refracty
Licensed Customer
Posts: 1599
Joined: Wed Dec 01, 2010 6:42 pm
Location: 3D-Visualisierung Köln

matej wrote:Then an official warning would be in order: don't rush to the store for a 680 just for the sake of Octane, because it won't work for another X months
+1
Timmaigh
Licensed Customer
Posts: 168
Joined: Mon Nov 01, 2010 9:52 pm

Just gave the GTX 680 review on AnandTech a thorough read, and I am finally starting to understand what is going on.

First of all, the card is compared to GF114, as we already know. Apparently the hotclock from Fermi was replaced with 2x as many CUDA cores. That means that if Fermi had no hotclock, like Kepler, its core count would not be 384 but 768. GTX 680 then obviously doubles this number once more, so theoretically it should be 2x as powerful as GF114.
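The hotclock arithmetic can be sketched out numerically. This is a crude back-of-the-envelope proxy using nominal shader clocks (the ~822/1645 MHz GF114 and ~1006 MHz GK104 figures are the stock reference clocks, assumed here for illustration), not a benchmark:

```python
def shader_throughput(cores, shader_clock_mhz):
    """Very crude proxy: raw arithmetic throughput scales with cores * clock."""
    return cores * shader_clock_mhz

# GF114 (GTX 560 Ti): 384 cores, ~822 MHz core clock, shaders at a 2x hotclock
gf114 = shader_throughput(384, 1645)

# The same chip expressed "Kepler-style": no hotclock, so 2x the cores
# at the base clock comes out roughly equal
gf114_kepler_style = shader_throughput(768, 822)

# GK104 (GTX 680): 1536 cores at ~1006 MHz, no hotclock
gk104 = shader_throughput(1536, 1006)

print(gk104 / gf114)  # roughly 2.4x the raw shader throughput of GF114
```

On paper, then, GK104 has a bit more than double GF114's raw shader throughput, which matches the "2x as powerful, theoretically" reading above.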

Point no. 2: we used to suppose that the GF104/GF114 weakness in compute relative to GF110 came down to the asymmetric ratio of warp schedulers (which feed the shaders with work) to blocks of CUDA cores per cluster, or whatever it is called. More specifically, it was 2 warp schedulers to 3 blocks, so one of those blocks was only fed on an irregular basis. Now GK104, if I understood correctly, is built on the same principle, only now it's 4 warp schedulers to 6 CUDA-core blocks. Same ratio, though.
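A trivial sketch of that ratio comparison, taking the block counts as the post describes them:

```python
from fractions import Fraction

# GF104/GF114: 2 warp schedulers feeding 3 blocks of CUDA cores per cluster
gf114_ratio = Fraction(2, 3)

# GK104: 4 warp schedulers feeding 6 blocks per cluster
gk104_ratio = Fraction(4, 6)

# Same ratio, so in principle the same potential for
# under-fed blocks in compute workloads.
print(gf114_ratio == gk104_ratio)  # True
```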

Finally, in the quest for maximum efficiency as a gaming chip, Nvidia stripped out another GPU feature important to GPGPU: a kind of hardware scheduler. Some sort of "static scheduling" takes its place, whatever that means. I am still hopeful this won't matter that much for Octane; we shall see.

Bottom line: I won't buy Kepler, at least not until there is an Octane benchmark and some info on "big Kepler". If you look at those CUDA core numbers, it's basically like 768 Fermi cores, but if it can only use 2/3 of them most of the time, that's like using 512 Fermi cores. In other words, a GTX 580 at a higher frequency. Not enough for me.
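That bottom-line estimate can be written out as a rough calculation. It combines the thread's own assumptions (hotclock equivalence plus the 2/3 scheduler-utilization guess), so it is speculation in code form, not measured data:

```python
# GTX 680: 1536 Kepler cores with no hotclock. Without the hotclock,
# two Kepler cores do roughly the work of one hotclocked Fermi core:
fermi_equivalent_cores = 1536 / 2          # 768 "Fermi-class" cores

# If the 2-schedulers-per-3-blocks layout means only ~2/3 of the
# cores stay busy in compute workloads:
effective_cores = fermi_equivalent_cores * 2 / 3

# GTX 580 (GF110) has 512 fully-fed cores, hence the
# "a GTX 580 at a higher frequency" conclusion.
print(effective_cores)  # 512.0
```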
Intel Core i7 980x @ 3,78GHz - Gigabyte X58A UD7 rev 1.0 - 24GB DDR3 RAM - Gainward GTX590 3GB @ 700/1400/3900 Mhz- 2x Intel X25-M G2 80GB SSD - WD Caviar Black 2TB - WD Caviar Green 2TB - Fractal Design Define R2 - Win7 64bit - Octane 2.57
nrygpu
Licensed Customer
Posts: 66
Joined: Fri Feb 17, 2012 4:14 pm

The GTX 680 is only $499, so compared to the cost of the GTX 580 it is a great deal. Of course, if it does not work with Octane yet, that is a major drawback for users on this forum. Mine are sitting in their boxes from EVGA, patiently waiting for the day I can run some Octane renders with them! :-) Can't wait!!! They should be at least as fast as the GTX 580. Why would NVIDIA work hard for three years to release a GPU that is only as good as the last model? That would not make any sense at all.
Timmaigh
Licensed Customer
Posts: 168
Joined: Mon Nov 01, 2010 9:52 pm

nrygpu wrote:The GTX 680 is only $499, so compared to the cost of the GTX 580 it is a great deal. Of course, if it does not work with Octane yet, that is a major drawback for users on this forum. Mine are sitting in their boxes from EVGA, patiently waiting for the day I can run some Octane renders with them! :-) Can't wait!!! They should be at least as fast as the GTX 580. Why would NVIDIA work hard for three years to release a GPU that is only as good as the last model? That would not make any sense at all.
Well, obviously buying a GTX 580 now would not make sense; I was thinking more of not selling the ones you already have.
Intel Core i7 980x @ 3,78GHz - Gigabyte X58A UD7 rev 1.0 - 24GB DDR3 RAM - Gainward GTX590 3GB @ 700/1400/3900 Mhz- 2x Intel X25-M G2 80GB SSD - WD Caviar Black 2TB - WD Caviar Green 2TB - Fractal Design Define R2 - Win7 64bit - Octane 2.57
nrygpu
Licensed Customer
Posts: 66
Joined: Fri Feb 17, 2012 4:14 pm

Yes, I would not sell your GTX 580 cards until the GTX 680s work with Octane. I will let you know once they do, but as of right now they do not render at all due to the CUDA version that Octane is built with. It sounds like this should be fixed in the next release.
pfrancke
Licensed Customer
Posts: 47
Joined: Sun May 30, 2010 2:45 am
Location: West Virginia

Currently I'm running a GTX 295, and with the 680s becoming available soon, I am now in the market for a new video card. The cheapest 3GB GTX 580s I've found are selling for $449, $459, $479. I think I need to wait at least 3 more weeks to see if prices drop further, and to see if I should get a 2GB GTX 680 instead! Shopping every day...
Windows 7, i7-3930K @3.2, 32GB ram, two 3GB GeForce GTX 580
tehfailsafe
Licensed Customer
Posts: 229
Joined: Sun Nov 07, 2010 9:27 pm

nrygpu wrote:Why would NVIDIA work hard for three years to release a GPU that is only as good as the last model? That would not make any sense at all.

Exactly. Which leads me to think that the GK104 we're seeing as the 680 wasn't originally intended to be their top-of-the-line card, and was probably going to be the next iteration of the x60 line, a 660. But after AMD released their newest top-of-the-line cards, Nvidia realized their GK104 "660" already beats them (in gaming FPS benchmarks, etc.), so why sell the GK104 at $250 as a 660 and the GK110 at $500 as a 680 at launch, when they could instead launch the GK104 as the 680 at $500 and release the GK110 a few months later as a 685 flagship at an even higher price?

And in the end they get two separate product launches where their card is the top of the current market, which means more money.
windows 7 64 bit| GTX580 1.5Gb x2 | Intel 2600k @ 4.9 | 16gb ddr3 | 3ds max 2012
Shlish
Licensed Customer
Posts: 3
Joined: Mon Nov 21, 2011 4:49 am

I can confirm that Octane will not work with the new GTX 680s; I just tested a couple in my machine. Really nice performance in 3ds Max, zero output in Octane. We will just have to wait for an Octane update.
Timmaigh
Licensed Customer
Posts: 168
Joined: Mon Nov 01, 2010 9:52 pm

The dual-GPU GTX 690 is out... and apparently it costs 999 USD :shock:
If true, Nvidia has completely lost the plot here. I bought my GTX 590 exactly one year ago for 610 euros, and now its successor should cost 1000? And it's not technically even the successor, since it's probably not the high-end chip. How much will that one cost then, 2000? Mightily disappointing...

BTW, when will we see a build optimised for Kepler cards, so we can finally see whether they are worth buying?
Intel Core i7 980x @ 3,78GHz - Gigabyte X58A UD7 rev 1.0 - 24GB DDR3 RAM - Gainward GTX590 3GB @ 700/1400/3900 Mhz- 2x Intel X25-M G2 80GB SSD - WD Caviar Black 2TB - WD Caviar Green 2TB - Fractal Design Define R2 - Win7 64bit - Octane 2.57
matej
Licensed Customer
Posts: 2083
Joined: Fri Jun 25, 2010 7:54 pm
Location: Slovenia

But it haz thousands moar of CUDA cores!! I believe 3-thousand?... That's like six 580's 8-)
SW: Octane 3.05 | Linux Mint 18.1 64bit | Blender 2.78 HW: EVGA GTX 1070 | i5 2500K | 16GB RAM Drivers: 375.26
cgmo.net
