
New GTX 680
Forum rules
For new users: this forum is moderated. Your first post will appear only after it has been reviewed by a moderator, so it will not show up immediately.
This is necessary to avoid this forum being flooded by spam.
lixai wrote: t_3, you can download the 3ds Max trial version if you want, it comes with iRay.
http://usa.autodesk.com/3ds-max/trial/

CUDA error on device 0, cannot use this device for rendering - and that's it.

SamAnona wrote: Can't wait for some Octane benchmarks! Btw, how come this thread was moved to a non-public section of the forums?

It was just moved to Public Forums » Resources and Sharing because it would fit better. I wasn't aware that this is not really public - moved it back, thanks...
„The obvious is that which is never seen until someone expresses it simply ‟
1x i7 2600K @5.0 (Asrock Z77), 16GB, 2x Asus GTX Titan 6GB @1200/3100/6200
2x i7 2600K @4.5 (P8Z68 -V P), 12GB, 1x EVGA GTX 580 3GB @0900/2200/4400
As a citizen, I wish to distance myself from this foreign company... from behind the New Zealand wall of silence.

Seriously though, I am actually hoping they send someone out to do the business management role.
Or perhaps I should just move to Auckland. I can get paid for my time then.

Nice benchmarks. I see a GTX 680 is about 3x faster than a GTX 460, so the GTX 660 ought to be 2x faster.
So graphics card performance doubles in 2 years...
In 2014, people with 8x GTX 880 will be rendering the same scene I wait 24 mins for in 30 secs.
I guess speeds will top out like they did for CPUs... maybe after another couple of generations... possibly down to 7.5 secs, which is 3 mins of rendering per second of animation (at 24 fps). If you are super serious, with 3 such rigs, 1 min.

Imagine that though, 7.5 secs! Hardly enough time for 2 slurps of coffee.
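Back-of-the-envelope for the projection above, as a quick sketch (the function and its defaults are made up for illustration; it assumes per-GPU speed doubles every two years and scales linearly across cards — under those assumptions the 8-card scenario two years out lands at 90 secs rather than 30, so 30 secs would need more than simple doubling):

```python
def projected_seconds(base_seconds, years, gpus=1, doubling_period=2.0):
    """Project a render time forward, assuming per-GPU speed doubles
    every `doubling_period` years and scales linearly with GPU count."""
    speedup = (2 ** (years / doubling_period)) * gpus
    return base_seconds / speedup

# Today's 24-minute render (1440 s), two years out, on 8 cards:
print(projected_seconds(1440, years=2, gpus=8))  # 90.0
```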


Last edited by pixelrush on Fri Mar 23, 2012 8:19 pm, edited 1 time in total.
i7-3820 @4.3Ghz | 24gb | Win7pro-64
GTS 250 display + 2 x GTX 780 cuda| driver 331.65
Octane v1.55
michaelkdaw wrote: Thanks for posting the numbers. They provide a bit more optimism after seeing the OpenCL results.
You haven't had any problems running it at a 165 MHz overclock?
I hope you get a chance to use it in Octane soon...

Because there isn't anything else I could do with the card, I tried +300, resulting in another 5% more; that means +15%, or half a GTX 560 of extra power over the stock clock.

At +400 it dropped out...
„The obvious is that which is never seen until someone expresses it simply ‟
1x i7 2600K @5.0 (Asrock Z77), 16GB, 2x Asus GTX Titan 6GB @1200/3100/6200
2x i7 2600K @4.5 (P8Z68 -V P), 12GB, 1x EVGA GTX 580 3GB @0900/2200/4400
- michaelkdaw
- Posts: 14
- Joined: Sun Feb 05, 2012 1:50 am
An extra 300 MHz is a lot. That's encouraging.
It's funny... I almost never play games anymore, but I've never felt more of a need to get the latest graphics card hardware.
But not till I hear a bit from Refractive regarding this issue.
- Kevin Sanderson
- Posts: 40
- Joined: Thu Jan 28, 2010 10:34 pm
Seeing the DirectX11 compute shader fluid simulation results, maybe it will scream when everything is updated.
http://www.anandtech.com/show/5699/nvid ... -review/17
some news: found this http://developer.nvidia.com/sites/defau ... /glass.exe (thanks to the 3dcenter.org forum)
it renders a scene with 3 glasses using CUDA-based raytracing, so it should be at least a bit more "real world":
the test was glass.exe -d=3840x2560 -B=15x60
(meaning an image of 3840x2560 px, 15 sec warmup, 60 frames rendered, no display)
driver version 301.10
somewhat less promising, still hard to judge, as this thing seems to be very old...
Code: Select all
gtx 560 ti @ 822: 1.29272 fps / 46.4139 sec
gtx 570 @ 732: 1.91363 fps / 31.3541 sec
gtx 580 @ 772: 2.28240 fps / 26.2881 sec
gtx 680 @ std: 2.38455 fps / 25.1625 sec
oc'ed:
gtx 580 @ 850: 2.48827 fps / 24.1132 sec
gtx 680 + 300: 2.81746 fps / 21.2958 sec
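To put those fps figures in relation, a quick sketch (values copied straight from the table above; the dict keys are just the labels from that table):

```python
# fps values from the glass.exe benchmark table above
fps = {
    "gtx 560 ti @ 822": 1.29272,
    "gtx 570 @ 732":    1.91363,
    "gtx 580 @ 772":    2.28240,
    "gtx 680 @ std":    2.38455,
    "gtx 580 @ 850":    2.48827,
    "gtx 680 + 300":    2.81746,
}

def speedup(a, b):
    """Relative speed of card a over card b."""
    return fps[a] / fps[b]

# Stock 680 is only ~4.5% faster than the 580 @ 772 in this test:
print(round(speedup("gtx 680 @ std", "gtx 580 @ 772"), 3))
# The +300 overclock buys roughly 18% over the stock 680:
print(round(speedup("gtx 680 + 300", "gtx 680 @ std"), 3))
```

Which backs up the "somewhat less promising" impression: in this old CUDA raytracer the 680 barely edges out a stock 580.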
„The obvious is that which is never seen until someone expresses it simply ‟
1x i7 2600K @5.0 (Asrock Z77), 16GB, 2x Asus GTX Titan 6GB @1200/3100/6200
2x i7 2600K @4.5 (P8Z68 -V P), 12GB, 1x EVGA GTX 580 3GB @0900/2200/4400
I just got 2 GTX 680's and installed them, but for some reason they are not working with Octane Standalone or the Octane plugin for Max. The render progress window does not go anywhere and the GPUs are not being used at all. I am monitoring them with the EVGA Precision X that came with the cards. I have NVIDIA Graphics Driver 300.83 installed, which is what came with the GPUs.
In Octane Standalone it shows that both CUDA devices failed. Is this maybe because these are compute capability 3.0 devices and the 580's were 2.0???
I was super excited to try these out. Hope to get them working soon so I can report the performance results with Octane. Has anyone else successfully rendered with Octane on GTX 680's??
Thanks!
- mib2berlin
- Posts: 1194
- Joined: Wed Jan 27, 2010 7:18 pm
- Location: Germany
Has anyone else successfully rendered with Octane on GTX 680's??

Hi, it is not possible atm. Octane needs to be rebuilt against CUDA Toolkit 4.2, which is in Release Candidate status.
I think the next version (beta 2.6) is built with CUDA Toolkit 4.2.
Cheers, mib.
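In other words, it's a compute-capability mismatch: Kepler cards like the GTX 680 report compute capability 3.0, and binaries built with toolkits older than 4.2 can't target them. A tiny illustrative sketch of the idea (the mapping below is an assumed, simplified subset for illustration, not an exhaustive or authoritative table):

```python
# Highest compute capability each toolkit build can target
# (illustrative, simplified assumption -- ignores PTX JIT compatibility).
TOOLKIT_MAX_CC = {
    "4.0": (2, 1),  # Fermi
    "4.1": (2, 1),  # Fermi
    "4.2": (3, 0),  # adds Kepler (GTX 680)
}

def can_render(toolkit: str, device_cc: tuple) -> bool:
    """True if a build against `toolkit` can use a device with
    compute capability `device_cc`, under the mapping above."""
    return device_cc <= TOOLKIT_MAX_CC[toolkit]

print(can_render("4.1", (3, 0)))  # False -> "CUDA devices failed"
print(can_render("4.2", (3, 0)))  # True  -> why beta 2.6 should work
```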
Opensuse Leap 42.3/64 i5-3570K 16 GB
GTX 760 4 GB Driver: 430.31
Octane 3.08 Blender Octane