Imatest has nice documentation on digital image quality.
Can GigE Vision deliver on its promise?, a white paper from Sony, exposes the problems of GigE Vision.
A high speed tri-vision system for automotive applications, a nice comparison of the latency and jitter of different camera interfaces. Its criticism of GigE Vision cites the Sony white paper.
A high resolution smart camera with GigE Vision extension for surveillance applications also cites the Sony paper, but uses it to support the claim that GigE Vision is the right choice. How can one argue GigE Vision is good by citing a paper that concludes GigE Vision is bad? This paper does not even attempt to rebut the Sony paper's criticism of GigE Vision.
GigE Vision – CPU Load and Latency, a white paper from Basler, claims GigE Vision offers low latency and low jitter, but its argument is not convincing.
Latency and determinism in GigE Vision systems.
The Fifth International Workshop on Automatic Performance Tuning (iWAPT) 2010. Deadline on April 19th, 2010.
WikiCFP: A Wiki for Calls For Papers, with handy RSS feed.
GTX 400 Graphics Architecture Whitepaper, the graphics aspects of Fermi.
The proceedings of ISSCC 2010 are released. Highlights are “session 5 — processors” and “session 18 — Power Efficient Media Processing”. The highlight of the highlights is “A graphics and vision unified processor with 0.89µW/fps pose estimation engine for augmented reality” from KAIST, a reconfigurable, unified architecture for vision, graphics, and control.
Eye-Tracking Research & Applications (ETRA) 2010.
IEEE Signal Processing Magazine has a special issue on Signal Processing on Platforms with Multiple Cores.
IEEE Control Systems Magazine has a special issue on R.E. Kalman and his filter.
Nallatech In-Socket FPGA Front-Side Bus Accelerator, a tightly coupled accelerator using the front-side bus (FSB). An FPGA is used as the example, but the approach also applies to GPUs.
Remapping the world in 3D, how Google Earth works.
A Comparative Study of Mobile-Based Landmark Recognition Techniques, really comprehensive.
Accurate Predictive Interconnect Modeling for System-Level Design, in TVLSI’10, with delay, power, and area model.
Robust Bioinspired Architecture for Optical-Flow Computation, in TVLSI’10, built for real time.
A recent issue of IEEE Transactions on Pattern Analysis and Machine Intelligence has some interesting papers:
An old web page, updated until 2007, on CMOS Digital Image Sensors.
Book Flipping Scanning, in UIST’09, by Yoshihiro Watanabe, provides more details on the 200-page-per-minute demo. The texture mapping in the book flipping scanning is the same as in the graphics pipeline of GPUs. Not a coincidence.
AnaFocus provides solutions for Camera System on Chip and Vision System on Chip. The Eye-RIS Cam can reach 8000 fps for 144 × 144 images.
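A back-of-the-envelope check of the Eye-RIS Cam figures above (144 × 144 at 8000 fps) shows why such high-speed vision chips process on-chip rather than stream raw frames; the 8-bit-per-pixel depth here is an assumption, not from the datasheet:

```python
# Rough throughput for the quoted Eye-RIS Cam figures.
# 8 bits/pixel is an assumed depth for illustration.
width, height, fps = 144, 144, 8000
bits_per_pixel = 8  # assumption

pixels_per_second = width * height * fps
bits_per_second = pixels_per_second * bits_per_pixel

print(f"{pixels_per_second / 1e6:.1f} Mpixel/s")  # ~165.9 Mpixel/s
print(f"{bits_per_second / 1e9:.2f} Gbit/s raw")  # ~1.33 Gbit/s
```

Even at only 8 bits per pixel, the raw rate already exceeds the roughly 1 Gbit/s payload of a Gigabit Ethernet link, which connects back to the interface-bandwidth discussion above.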
Eutecus, with a multi-core visual analytics engine (MVE), an SoC vision architecture implemented on an FPGA.
Synfora and AutoESL, two start-ups for system-level FPGA design.
Customizable Domain-Specific Computing (CDSC) in UCLA is doing related works.
CNNA 2010, International Workshop on Cellular Nanoscale Networks and Their Applications.
Vertical interactions across ten parallel, stacked representations in the mammalian retina, in Nature 2000, from the Werblin Lab of UC Berkeley. It shows how the retina constructs multiple spatio-temporal channels in the mammalian vision system. That is different from the Hierarchical Temporal Memory (HTM) model.
Some papers on high speed camera systems:
- 955-fps Real-time Shape Measurement of a Moving/Deforming Object using High-speed Vision for Numerous-point Analysis, by Yoshihiro Watanabe, in ICRA’07.
- Integration of time-sequential range images for reconstruction of a high-resolution 3D shape, by Yoshihiro Watanabe, in ICPR’08.
- A High-Speed Object Tracker from Off-the-Shelf Components, from RTblob, in the First IEEE Workshop on Computer Vision for Humanoid Robots in Real Environments at ICCV 2009. Gigabit Ethernet is used.
- CMOS+FPGA vision system for visual feedback of mechanical systems, in ICRA’06. Camera Link is used.
- A new high speed CMOS camera for real-time tracking applications, from TU Graz, in ICRA’04. USB 2.0 is used.
- Open and Reconfigurable System on Chip Architecture with Hardware and Software Preprocessing Capabilities Used for Remote Image Acquisition, in 2008. Gigabit Ethernet is used.
Some links on visual servoing:
A unified approach to visual tracking and servoing, by Ezio Malis and Selim Benhimane. ESM SDK is a real-time visual tracking tool written in strict ANSI C. They are unifying vision and control. It is possible to further unify graphics with vision and control. Augmented reality (AR) is a target application for a unified vision, graphics, and control architecture.
A glimpse into CVPR 2010.
Parallel Computing Research at Illinois: The UPCRC Agenda, a view from Illinois.
A Design Pattern Language for Engineering (Parallel) Software: Merging the PLPP and OPL projects, from ParLab patterns, in ParaPLoP 2010.
SVS340 from svs-vistek, with Camera Link interface and more than 1000 fps throughput. It uses KODAK KAI-0340 image sensor. Application notes are available.
National Semiconductor provides SerDes solutions and development boards (CLINK3V28BT-85, AES-EXP-HPSER-G, etc.) for their SerDes chips. The AES-EXP-HPSER-G can directly connect to the AES-SP3ADSP-DVCI-G, a Spartan-3 + DaVinci development board, with working examples.
Fedora Electronic Lab (FEL), backed by academia and industry. Open-source EDA projects such as Open Circuit Design, gEDA, Alliance, Veripool, etc. are still active.
IPT_ATI_PROJECT, accelerating the Image Processing Toolbox of MATLAB with ATI and NVIDIA GPUs.
Camera Link to ML50x, and to FMC. Image sensor to FMC.