Sweeeettt...
i can't wait for t_3 to come here with his experience!!
Nildo
New gtx 680
Forum rules
For new users: this forum is moderated. Your first post will appear only after it has been reviewed by a moderator, so it will not show up immediately.
This is necessary to avoid this forum being flooded by spam.
Corei7, 6x GTX1070 (2 Inside and 4 in a CUBIX Xpander), 32GB Ram, Win10 64Bit Home, 3dsMax2017 64Bit, Octane for max 3 and LOVING IT!!!!
MSI GT72S 6QE Dominator Pro G win10Home, gtx 980m.
Hmm, I'm curious what the performance is like! I have a 3GB 580 at the moment; direct lighting in the demo runs at a good speed considering the visual quality, but path tracing / MLT just isn't usable in the timeframes I work with. I wonder what difference a 680 would make, either instead of my 580 or paired with it! It may be an incentive for me to buy Octane

sadly no "real" numbers for now. octane detects the card but doesn't render; the gpu just fails in octane without shooting a single ray... so i guess we need to wait for a new octane build

in the meanwhile some more synthetic benchmarks (sisoft sandra 2012, raw cuda sp shader performance):
Code: Select all
gtx 560 ti @ default: 1065 mpix/sec
gtx 570 @ default: 1520 mpix/sec
gtx 580 @ default: 1680 mpix/sec
gtx 580 @ 850 mhz: 1850 mpix/sec (also vram oc)
gtx 680 @ default: 2750 mpix/sec = 1.64 times a gtx 580 or 2.58 times a gtx 560 ti at default clocks
gtx 680 + 300 mhz: 3250 mpix/sec (+18% over default)

cool thing -literally- are the temps: even after running some benchmark cycles the card didn't get past 70°C with the standard fan rpm curve (maxing out at around 70%) and is quite silent at the same time. looks like these will stack up nicely

i think i will keep it, even if the wait for a new octane build may last a little...

... so what about a nightly octane build? just kidding

ps: i have two gtx 570 2.5gb for sale, drop me a pm if interested
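for reference, the relative speedups quoted with the sandra numbers can be double-checked with a quick throwaway script (just an illustration using the values from the list above, nothing to do with the benchmark itself):

```python
# throughput values (mpix/sec) copied from the sisoft sandra 2012 run above
results_mpix = {
    "gtx 560 ti": 1065,
    "gtx 570": 1520,
    "gtx 580": 1680,
    "gtx 680": 2750,
    "gtx 680 oc": 3250,  # +300 mhz overclock
}

def speedup(card: str, baseline: str) -> float:
    """Relative throughput of `card` versus `baseline`."""
    return results_mpix[card] / results_mpix[baseline]

print(f"680 vs 580:    {speedup('gtx 680', 'gtx 580'):.2f}x")    # -> 1.64x
print(f"680 vs 560 ti: {speedup('gtx 680', 'gtx 560 ti'):.2f}x") # -> 2.58x
print(f"680 oc gain:   {(speedup('gtx 680 oc', 'gtx 680') - 1) * 100:.0f}%")  # -> 18%
```

note that 3250 vs 2750 works out to about +18%, so a raw shader-throughput number is still a synthetic figure; real octane scaling will only be known once a build that supports the card exists.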

„The obvious is that which is never seen until someone expresses it simply“
1x i7 2600K @5.0 (Asrock Z77), 16GB, 2x Asus GTX Titan 6GB @1200/3100/6200
2x i7 2600K @4.5 (P8Z68 -V P), 12GB, 1x EVGA GTX 580 3GB @0900/2200/4400
- michaelkdaw
- Posts: 14
- Joined: Sun Feb 05, 2012 1:50 am
Thanks for posting the numbers. They provide a bit more optimism after seeing the OpenCL results.
You haven't had any problems running it at a 165 MHz overclock?
I hope you get a chance to use it in Octane soon...
michaelkdaw wrote: You haven't had any problems running it at a 165 MHz overclock?
at least not during the rather short benchmark cycles. but expect to see that change when pushing it with octane. though if it is really that powerful *praying* it won't be necessary to oc anyway...
„The obvious is that which is never seen until someone expresses it simply“
1x i7 2600K @5.0 (Asrock Z77), 16GB, 2x Asus GTX Titan 6GB @1200/3100/6200
2x i7 2600K @4.5 (P8Z68 -V P), 12GB, 1x EVGA GTX 580 3GB @0900/2200/4400
- Jaberwocky
- Posts: 976
- Joined: Tue Sep 07, 2010 3:03 pm
T_3
have you downloaded the latest drivers from the Nvidia web site? I think they just came out at launch. Maybe that might help get Octane moving.
If that doesn't help, then the radical change in the chip's architecture is probably responsible for Octane not working.
Perhaps Radiance and the team need to invest in a GTX 680 for their office test bed before they release the next version.
CPU:-AMD 1055T 6 core, Motherboard:-Gigabyte 990FXA-UD3 AM3+, Gigabyte GTX 460-1GB, RAM:-8GB Kingston hyper X Genesis DDR3 1600Mhz D/Ch, Hard Disk:-500GB samsung F3 , OS:-Win7 64bit
t_3 wrote: at least not during the rather short benchmark cycles. but expect to see that change when pushing it with octane...
hey, thanks for the numbers. can you try a render with iRay from Max, or RT from Vray? i think if you benchmark using those renderers the results should be comparable, as they both use brute force to compute. thanks
Core i7 860, ZOTAC GTX 580 - 3GB, 2 x GTX TITAN, 32Gb RAM
Jaberwocky wrote: T_3 have you downloaded the latest drivers from the Nvidia web site?
yep, 301.something, dated 22.03. most probably a new cuda build is necessary. would be interesting to have a small or even tiny statement from behind the new zealand wall of silence, but i wouldn't mind if they'd rather tell us about instancing and the likes first

lixai wrote: hey, thanks for the numbers, can you try to render with iRay from Max, or RT from Vray...
sorry, i have no access to it. but i guess they would need another build based on an updated cuda sdk too?
„The obvious is that which is never seen until someone expresses it simply“
1x i7 2600K @5.0 (Asrock Z77), 16GB, 2x Asus GTX Titan 6GB @1200/3100/6200
2x i7 2600K @4.5 (P8Z68 -V P), 12GB, 1x EVGA GTX 580 3GB @0900/2200/4400
t_3 you can download 3ds max trial version if you want, it comes with iRay.
http://usa.autodesk.com/3ds-max/trial/
Core i7 860, ZOTAC GTX 580 - 3GB, 2 x GTX TITAN, 32Gb RAM
lixai wrote: t_3 you can download 3ds max trial version if you want, it comes with iRay. http://usa.autodesk.com/3ds-max/trial/
i'm really sorry, but i can't do that


just kidding; i'll see what i can do...
„The obvious is that which is never seen until someone expresses it simply“
1x i7 2600K @5.0 (Asrock Z77), 16GB, 2x Asus GTX Titan 6GB @1200/3100/6200
2x i7 2600K @4.5 (P8Z68 -V P), 12GB, 1x EVGA GTX 580 3GB @0900/2200/4400