PBVS 2016 - 12th IEEE Workshop on Perception Beyond the Visible Spectrum (PBVS)
Topics/Call for Papers
The objective of this workshop is to highlight cutting-edge advances and state-of-the-art work in the rapidly growing field of PBVS (previously OTCBVS) along its three main axes: Algorithms, Sensor Processing, and Applications. This field involves deep theoretical research in sub-areas of image processing, machine vision, pattern recognition, machine learning, robotics, and augmented reality within and beyond the visible spectrum. It also provides a suitable framework for building solid, advanced vision-based systems.
The computer vision community has traditionally focused on the development of vision algorithms for object detection, tracking, and classification using visible-range sensors in daytime and office-like environments. In the last decade, infrared (IR), thermal, and other non-visible imaging sensors were used only in specialized areas such as medicine and defense. That lower level of interest in infrared imagery was due in part to the high cost of non-visible-range sensors, low resolution, poor image quality, a lack of widely available data sets, and little consideration of the potential advantages of the non-visible part of the spectrum. These historical objections are becoming less relevant as IR imaging technology advances rapidly and sensor costs drop dramatically. Image sensing devices with high dynamic range and high IR sensitivity have started to appear in a growing number of applications, ranging from defense and automotive domains to home and office security. In addition, mobile hyperspectral and mm-wave sensors are coming into existence.
To develop robust and accurate vision-based systems that operate in and beyond the visible spectrum, existing methods and algorithms originally developed for the visible range must not only be improved and adapted, but entirely new algorithms that exploit the potential advantages of non-visible ranges are also required. The fusion of visible and non-visible modalities, such as radar and IR images, thermal and visible-spectrum images, or acoustic images, is another dimension to explore for higher performance of vision-based systems. Non-visible light is widely employed in night-vision systems, and many detection and recognition systems on the market today rely on physiological phenomena captured at IR and thermal wavelengths. Using artificially controlled illumination is a practical way to eliminate challenging ambient-light effects.
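As a concrete illustration of the simplest form of such fusion, the following is a minimal sketch of pixel-level blending of a co-registered visible/thermal image pair (Python with OpenCV; the file names and blend weights are illustrative assumptions, and the weighted average stands in for more advanced wavelet- or learning-based fusion schemes):

# Minimal sketch: pixel-level fusion of a co-registered visible/thermal pair.
# File names and the 0.6/0.4 blend weights are illustrative placeholders.
import cv2

visible = cv2.imread("visible.png", cv2.IMREAD_GRAYSCALE)  # visible-band frame
thermal = cv2.imread("thermal.png", cv2.IMREAD_GRAYSCALE)  # LWIR/thermal frame

# Thermal sensors usually have lower resolution; match the visible frame size.
thermal = cv2.resize(thermal, (visible.shape[1], visible.shape[0]))

# Normalize both bands to a common [0, 255] range before blending.
visible = cv2.normalize(visible, None, 0, 255, cv2.NORM_MINMAX)
thermal = cv2.normalize(thermal, None, 0, 255, cv2.NORM_MINMAX)

# Simple weighted average; wavelet or learning-based fusion replaces this step.
fused = cv2.addWeighted(visible, 0.6, thermal, 0.4, 0)

cv2.imwrite("fused.png", fused)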
This series of Perception Beyond the Visible Spectrum workshops creates connections between different communities in the machine vision world, ranging from public research institutes to private, defense, and federal laboratories. It brings together academic pioneers, industrial and defense researchers, and engineers in the fields of computer vision, image analysis, pattern recognition, machine learning, signal processing, sensors, and human-computer interaction.
Topics of Interest:
Sensing/Imaging Technologies
IR/EO imaging systems
Underwater sensing
Hyperspectral/Satellite imaging
Spectroscopy/Microscopy imaging
LIDAR/LDV sensing
Compressive sensing
RADAR/SAR imaging
RGBD sensing
Applications and Systems
Surveillance and reconnaissance systems
Autonomous vehicles
Autonomous ships
Autonomous grasping
Vision-aided navigation
Night/Shadow vision
Sensing for agriculture and food safety
Vision-based autonomous multi-copters
Theory and Algorithms
Imagery/Video exploitation
Object/Target tracking and recognition
Feature extraction and matching
Activity recognition
Deep learning for perception
Multimodal/Multi-sensor data fusion
Multimodal registration
Video + text fusion