Octane4 and the nearing PowerVR path tracing ASIC revolution
Thanks for the news! These are really interesting.
- TeemuSoilamo
- Posts: 8
- Joined: Mon Jan 16, 2017 2:13 pm
Is the joint presentation with PowerVR at Siggraph still on? Did getting dropped by Apple diminish the prospect of them bringing the ray tracing chip to market at all?
From this recent talk, it sounds like you guys are still working with them:
https://www.youtube.com/watch?v=n9oILsWlW1U#t=25m3s
TeemuSoilamo wrote:Is the joint presentation with PowerVR at Siggraph still on? Did getting dropped by Apple diminish the prospect of them bringing the ray tracing chip to market at all?
From this recent talk, it sounds like you guys are still working with them:
https://www.youtube.com/watch?v=n9oILsWlW1U#t=25m3s

We are still working with them. I have no specific insight into what is going on with Apple.
I can confirm that the ASICs work as advertised in our own testing, and we have been sharing these results everywhere we can to push for this to come to market as soon as possible.
- TeemuSoilamo
- Posts: 8
- Joined: Mon Jan 16, 2017 2:13 pm
Goldorak wrote:We are still working with them. I have no specific insight into what is going on with Apple.
I can confirm that the ASICs work as advertised in our own testing, and we have been sharing these results everywhere we can to push for this to come to market as soon as possible.

Glad to hear that! Hopefully some of the forums where you will be sharing this work will be public ones.
- kacperspala
- Posts: 50
- Joined: Thu Jul 29, 2010 2:06 pm
Any idea about the price tag of these ASICs? I'm not sure whether to upgrade to a 2080 Ti / Volta or wait for Octane 4 / PVR...
Ryzen 2700X
RTX 2080 Ti
RTX 2080 Ti
kacperspala wrote:Any idea about the price tag of these ASICs? I'm not sure whether to upgrade to a 2080 Ti / Volta or wait for Octane 4 / PVR...

No idea about price - but in theory, if used in a high-volume GPU run, it would not add much to the cost. That said, I don't know when PVR would or could push this to a wider market. All we have done is get a test build of V4 working on their prototype 2 Watt and 10 Watt chips to confirm performance.
- TeemuSoilamo
- Posts: 8
- Joined: Mon Jan 16, 2017 2:13 pm
So, no joint presentation at SIGGRAPH? Unfortunately, it seems like PVR has all but abandoned this project.
Perhaps a more viable way toward real-time ray tracing is via AI denoising, like NVIDIA is doing:

https://venturebeat.com/2017/07/31/nvid ... tists-can/
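(For anyone unfamiliar with the idea: the AI denoising approach mentioned above boils down to rendering only a few samples per pixel and letting a neural network, trained on pairs of noisy and converged frames, predict the clean image, usually helped by cheap auxiliary buffers such as albedo and surface normals. The PyTorch sketch below is only a toy illustration of that recipe -- the layer sizes, buffer choices and names are made up, and it is not OTOY's or NVIDIA's actual denoiser.)

```python
# Illustrative sketch only: the general recipe behind AI denoising for path tracing.
# A renderer produces a noisy low-sample image plus cheap auxiliary buffers
# (albedo, normals); a small trained CNN predicts the clean frame from them.
import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """Minimal convolutional denoiser: 9 input channels -> 3 output channels."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(9, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
        )

    def forward(self, noisy_rgb, albedo, normals):
        # Concatenate the noisy radiance with the auxiliary feature buffers.
        x = torch.cat([noisy_rgb, albedo, normals], dim=1)
        # Predict a residual correction on top of the noisy input.
        return noisy_rgb + self.net(x)

# Stand-in tensors for a low-spp render and its feature buffers (batch, C, H, W).
noisy_rgb = torch.rand(1, 3, 256, 256)
albedo    = torch.rand(1, 3, 256, 256)
normals   = torch.rand(1, 3, 256, 256)

# In practice the weights would be trained on (noisy, reference) image pairs.
denoiser = TinyDenoiser()
with torch.no_grad():
    denoised = denoiser(noisy_rgb, albedo, normals)
print(denoised.shape)  # torch.Size([1, 3, 256, 256])
```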
Both our Unity and NVIDIA talks at SIGGRAPH this week featured real-world results of our latest work on RT ASIC rendering and Octane AI denoising.
If you missed them and are interested in either topic, I'd check out the FB group posts made since April.
- TeemuSoilamo
- Posts: 8
- Joined: Mon Jan 16, 2017 2:13 pm
Goldorak wrote:Both our Unity and NVIDIA talks at SIGGRAPH this week featured real-world results of our latest work on RT ASIC rendering and Octane AI denoising.
If you missed them and are interested in either topic, I'd check out the FB group posts made since April.

Thanks!

I especially like this quote from http://on-demand.gputechconf.com/siggra ... aming.html:

"Between AI acceleration ASICs and ray tracing ASICs, mixed w/ traditional GPU and CPU compute power, there's no doubt in my mind that in the next 3-5 years you'll see a single SoC that's able to deliver nearly unbiased, noise-free rendering on the cloud or even locally." [10:38]

I really hope you're urging NVIDIA and AMD to work on this--it makes too much sense not to do it after a certain value threshold has been crossed vs rasterization, which now seems to be the case. David Kirk, NVIDIA's ex-chief scientist, was totally opposed to algorithm-specific hardware like ray tracing engines, if memory serves. Maybe that's why he's an 'ex-chief scientist'?
