ICCP 2024

7/22 (Mon) ~ 7/24 (Wed), 2024 at SG1 auditorium in EPFL in Lausanne, Switzerland


The International Conference on Computational Photography is the premier venue for research broadly related to computational photography. Computational photography is a vibrant area at the intersection of optics, imaging, sensors, signal processing, computer vision, and computer graphics. It seeks to create new photographic and imaging functionalities and experiences that go beyond the capabilities of traditional cameras and standalone image processing. This cross-disciplinary field further seeks a better understanding of imaging models, perception, and limits through holistic analysis.

We look forward to this year's exciting sponsorship and demonstration opportunities, featuring a variety of ways to connect with participants in person. Sony will participate as a Platinum Level sponsor.

Technology Demonstrations at Sony Demo Table

We will have technology demonstrations at the Sony demo table (SG 094.11). Visit us to explore our latest technologies firsthand and engage with our team.

Location: SG 094.11
Sony Demonstration Hours:
- Monday, July 22 | 15:00 - 17:30
- Tuesday, July 23 | 15:00 - 17:30

Applications of RGB-EVS Hybrid Sensors for Imaging

< Demonstrator > Saeed Rad, Saad Himmi

Traditional frame-based cameras often suffer from motion blur and loss of inter-frame information due to long exposure times and limited frame rates. To address this challenge, we utilize a hybrid sensor capable of simultaneously capturing RGB frames and events. We demonstrate how the high temporal resolution of events mitigates motion blur and improves intermediate frame prediction.
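As a rough illustration of the idea (not the demonstrated system's actual pipeline), the standard event-camera generation model says that each event marks a fixed change in log intensity, so accumulating event polarities lets a reference frame be brightness-warped to any instant between frames. The function names and the contrast sensitivity `c` below are illustrative assumptions:

```python
import numpy as np

def integrate_events(events, shape, t0, t1):
    """Sum signed event polarities per pixel over the interval (t0, t1].

    events: iterable of (x, y, t, polarity) with polarity in {+1, -1}.
    """
    E = np.zeros(shape)
    for x, y, t, p in events:
        if t0 < t <= t1:
            E[y, x] += p
    return E

def predict_frame(frame, events, t0, t1, c=0.2):
    """Predict the intensity image at time t1 from the frame at time t0,
    using the event generation model: log L(t1) - log L(t0) = c * E,
    where E is the per-pixel sum of event polarities and c is the
    (assumed) contrast sensitivity of the sensor."""
    E = integrate_events(events, frame.shape, t0, t1)
    return frame * np.exp(c * E)
```

Running this forward from the previous frame and backward from the next frame, then blending, is one common way events support intermediate frame prediction.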

NeISF: Neural Incident Stokes Field for Geometry and Material Estimation

< Demonstrator > Takeshi Uemori

Multi-view inverse rendering is the problem of estimating scene parameters such as shapes, materials, or illumination from a sequence of images captured from different viewpoints. Many approaches, however, assume a single light bounce and thus fail to recover challenging scenarios such as inter-reflections. On the other hand, simply extending those methods to consider multi-bounced light requires additional assumptions to resolve the ambiguity. To address this problem, we propose Neural Incident Stokes Fields (NeISF), a multi-view inverse rendering framework that reduces ambiguities using polarization cues. The primary motivation for using polarization cues is that the polarization state accumulates over multiple light bounces, providing rich information about geometry and material. Based on this insight, the proposed incident Stokes field efficiently models the accumulated polarization effect with the aid of an original physically-based differentiable polarimetric renderer. Experimental results show that our method outperforms existing works in both synthetic and real scenarios.
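The polarimetric formalism underlying such renderers is standard optics: light is represented by a 4-component Stokes vector (I, Q, U, V), each interaction with a surface or optical element is a 4x4 Mueller matrix, and multi-bounce transport is a product of Mueller matrices applied to the source Stokes vector. A minimal sketch (the ideal linear-polarizer Mueller matrix is a textbook formula; the function names are illustrative):

```python
import numpy as np

def polarizer_mueller(theta):
    """Mueller matrix of an ideal linear polarizer at angle theta (radians)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return 0.5 * np.array([
        [1.0,   c,     s,   0.0],
        [c,   c * c, c * s, 0.0],
        [s,   c * s, s * s, 0.0],
        [0.0, 0.0,   0.0,   0.0],
    ])

# Unpolarized, unit-intensity light as a Stokes vector (I, Q, U, V).
s_in = np.array([1.0, 0.0, 0.0, 0.0])

# Two successive interactions accumulate as a matrix product,
# mirroring how polarization state builds up over multiple bounces.
s_out = polarizer_mueller(np.pi / 4) @ (polarizer_mueller(0.0) @ s_in)
# s_out = [0.25, 0.0, 0.25, 0.0]: half the light passes the first
# polarizer, and half of that passes the second (Malus's law at 45 deg).
```

NeISF replaces explicit per-bounce products like this with a learned field of incident Stokes vectors, avoiding costly multi-bounce path tracing during optimization.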

< Link >
- NeISF

Multi-Aperture Computed Tomography Imaging Spectrometer (MACTIS)

< Demonstrator > Alexander Gatto, Shan Lin

The computed tomography imaging spectrometer (CTIS) is a snapshot-capable hyperspectral camera. A diffractive optical element creates multiple projections of the hyperspectral data cube side by side on the image sensor, and a reconstruction algorithm computes the hyperspectral image from this spatio-spectrally multiplexed signal. The problem is similar to the one solved by reconstruction algorithms for computed tomography scanners. We present how such a system can be realized with a parallelized approach: several apertures are placed next to each other, and each aperture creates only one projection using a grating prism.
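Because the multiplexed sensor signal is linear in the hyperspectral cube, reconstruction can be posed as a linear inverse problem, much as in CT. Below is a toy 1-D sketch under simplifying assumptions (each aperture shifts each spectral band by a fixed pixel offset; the shift values and function names are illustrative, not the actual MACTIS design):

```python
import numpy as np

def forward_1d(cube, shifts):
    """Simulate the sensor signal for a 1-D scene.

    cube: array of shape (nx, nb) — nx spatial samples, nb spectral bands.
    shifts: per-aperture list of per-band pixel offsets; each aperture
    superimposes all bands, shifted, onto its own sensor line.
    """
    nx, nb = cube.shape
    sensor_len = nx + max(max(s) for s in shifts)
    lines = []
    for s in shifts:                       # one projection per aperture
        line = np.zeros(sensor_len)
        for b in range(nb):
            line[s[b]:s[b] + nx] += cube[:, b]
        lines.append(line)
    return np.concatenate(lines)

def sensing_matrix(nx, nb, shifts):
    """Build the linear operator column by column from basis cubes."""
    cols = []
    for x in range(nx):
        for b in range(nb):
            basis = np.zeros((nx, nb))
            basis[x, b] = 1.0
            cols.append(forward_1d(basis, shifts))
    return np.stack(cols, axis=1)

def reconstruct(measurement, nx, nb, shifts):
    """Recover the cube by least squares, as in CT-style reconstruction."""
    A = sensing_matrix(nx, nb, shifts)
    flat, *_ = np.linalg.lstsq(A, measurement, rcond=None)
    return flat.reshape(nx, nb)
```

In a real system the operator is far larger and is inverted with iterative methods rather than a dense least-squares solve, but the structure of the problem is the same.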

Recruiting Information

We look forward to working with highly motivated individuals to fill the world with emotion and to pioneer future innovation through dreams and curiosity. If interested, please visit our career site and/or the Sony demo table at SG 094.11 to learn more about Sony Group.

Career Site:
https://sonyglobal.wd1.myworkdayjobs.com/en-US/SonyJapanCareers