David Drascic

  • Consulting in usability issues for software, hardware and web site design
  • Applying knowledge and principles of ergonomics creatively to fit the task to the worker, and not the other way around
  • Performing scientific research with an engineering, psychology, and computing background
  • Conducting applied research and usability studies for industry
  • Ph.D. candidate in Human Factors / Ergonomics
    ETC-Lab: Ergonomics in Teleoperation and Control Lab
    Mechanical and Industrial Engineering
    University of Toronto.
    For my dissertation, I'm investigating depth perception, stereoscopic displays, Virtual Reality, Augmented Reality, and teleoperation.

  • Contents

    • Research
      • The Ergonomics of Teleoperation
      • Comparing Monoscopic and Stereoscopic Video
      • Augmenting Reality: Advanced Stereoscopic Displays
      • Inventing ARGOS, One of the first Augmented Reality Systems
    • Papers
    E-mail: david.drascic@utoronto.ca

    My Curriculum Vitae (CV) is on-line.


    My field is Human Factors, which is the study of people in the workplace (every workplace), with the goal of redesigning tasks and equipment to better suit the range of individuals performing them. The field is very broad (which is why I like it), and includes many aspects of design engineering, perceptual psychology, anthropometrics, cognitive psychology, kinesiology, social psychology, and medicine.

    My research has focussed on the benefits and usability issues of stereoscopic displays, particularly with regard to teleoperation and augmented reality displays. With my colleagues Paul Milgram and Julius Grodski, I've investigated the usefulness of stereoscopic video for bomb disposal teleoperation, developed one of the first working Augmented Reality systems to improve the human-machine interface for this task, and have conducted various investigations into the usability and perceptual issues of Augmented Reality.

    Comparing Monoscopic and Stereoscopic Video
    My graduate work started with the goal of improving the human-machine interface for bomb-disposal teleoperation, and subsequently followed two paths: improving the display for telerobotics by using stereoscopic video, and improving the user interface through the application of Augmented Reality.

    My Master's thesis examined the costs and benefits of using stereoscopic displays for teleoperation. As a quick summary of my findings, I would say:

    • For tasks that require precise positioning of objects in 3-D space, such as teleoperation, stereoscopic displays are a great idea. They are easier to learn than monoscopic displays, and allow faster performance with fewer errors. [HFS Conference 1991]
    • For other tasks they may not be worth it. It really depends on the task and on the display being considered. [Master's Thesis, DND Workshop 1993]
    • Tasks that are highly repetitious won't benefit as much from stereo displays in the long run as will those that are always unique, such as bomb disposal and hazardous materials cleanup.
    • Displays in which the camera views are restricted to a single position, and those in which a significant proportion of the system's motion is perpendicular to the display, benefit most from stereoscopic displays.
    • Tasks that involve only computer graphics without any stereoscopic video can often be accomplished better by redesigning the task than by using stereoscopic displays. Stereoscopic perception works best in fully rendered and richly textured environments like the real world. Few computer-based displays can afford the desired degree of realism.

    Augmenting Reality
    Augmented Reality combines a view of the real world with computer enhancements, or augmentations, so as to make the task at hand easier to do. Some people do this by using see-through head-mounted displays, so that you see computer graphics floating around in the real world. Because we are interested in teleoperation, our view of the real world is provided by a stereoscopic video system, and our operators view the remote site (along with the stereoscopic graphic augmentations) on a monitor.

    Unlike Virtual Reality, which seeks to create an entirely artificial world, the goal of Augmented Reality is to present to the user the real world (possibly at a remote location, such as underwater or space, or of a different scale, such as micro-surgery) that is enhanced (or augmented) with computer graphics. The applications of this technology include teleoperation (using a robot to do a job at a distance using remote control), medical and architectural imaging, and many other areas.

    In the ETC-Lab, we combine stereoscopic video with carefully calibrated stereoscopic computer graphics. At the moment, I am investigating the perceptual aspects of Augmented Reality displays. In particular, I am examining the effects of accommodation-vergence conflict and accommodation mismatch on the perception of depth in stereoscopic displays.
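    To give a feel for the accommodation-vergence conflict, here is a small sketch of the underlying viewing geometry. This is generic textbook geometry, not a model of the ETC-Lab apparatus, and the 65 mm eye separation is a typical assumed value: the eyes focus (accommodate) at the screen, but converge at the virtual object's apparent distance, and the mismatch can be expressed in diopters.

```python
# Illustrative accommodation-vergence conflict for a stereoscopic display.
# Generic viewing geometry; the 65 mm eye separation is a typical assumed
# value, not a measurement from any particular apparatus.

def vergence_distance(screen_dist, disparity, eye_sep=0.065):
    """Distance (m) at which the eyes' lines of sight cross for a given
    on-screen disparity (m); positive disparity = uncrossed (behind screen)."""
    return eye_sep * screen_dist / (eye_sep - disparity)

def conflict_diopters(screen_dist, disparity, eye_sep=0.065):
    """Accommodation stays at the screen while vergence goes to the virtual
    object; the conflict is the difference between the two, in diopters."""
    z = vergence_distance(screen_dist, disparity, eye_sep)
    return abs(1.0 / z - 1.0 / screen_dist)

# A point drawn with 10 mm of uncrossed disparity on a screen 0.6 m away
# appears about 0.71 m from the viewer, a conflict of roughly 0.26 diopters.
print(vergence_distance(0.6, 0.010), conflict_diopters(0.6, 0.010))
```

    A disparity of zero gives no conflict (the object sits in the screen plane), which is why stereoscopic displays are most comfortable when the imagery stays near the screen surface.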

    Inventing ARGOS
    Working with Prof. Paul Milgram and Dr. Julius Grodski, I pioneered the ETC-Lab's work in Augmented Reality, doing the research, engineering, computer programming, and electronics work that culminated in our patented technology, which we call ARGOS (Augmented Reality through Graphic Overlays on Stereo-video). One of the first useful examples of Augmented Reality was the virtual pointer, which consisted of a calibrated stereoscopic graphic pointer superimposed on a painstakingly designed stereoscopic video system. The pointer can be moved freely around the remote view, used to measure the distances and locations of objects in the real world, and can pass this information on to the computers controlling the robots. [SPIE Stereoscopic Displays 1991, IROS 1993]
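    The distance measurement behind a virtual pointer of this kind can be illustrated with textbook stereo triangulation. This is a generic sketch for an idealized parallel-camera rig, not the patented ARGOS calibration: a feature matched in both camera images has a horizontal disparity, and depth follows from similar triangles.

```python
# Illustrative stereo triangulation for an idealized parallel-camera rig.
# All parameters are generic textbook values, not the ARGOS calibration.

def triangulate(x_left, x_right, y, focal_len, baseline):
    """Recover (X, Y, Z) in camera coordinates from a point matched in
    both views: focal_len in pixels, baseline (camera separation) in metres."""
    disparity = x_left - x_right          # pixels; larger means closer
    if disparity <= 0:
        raise ValueError("point at or beyond infinity")
    z = focal_len * baseline / disparity  # depth, by similar triangles
    x = x_left * z / focal_len            # back-project to 3-D
    y3 = y * z / focal_len
    return (x, y3, z)

# Example: 800-pixel focal length, 10 cm baseline, 20 pixels of disparity
# places the point 4 m from the cameras.
print(triangulate(120.0, 100.0, 40.0, 800.0, 0.10))
```

    Because depth varies inversely with disparity, small matching errors matter far more for distant objects than for near ones, which is one reason careful calibration of the video system was so important.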

    Most of my effort since developing this technology has been devoted to the usability issues of stereoscopic displays, and the perceptual aspects of Augmented Reality displays. In particular, I am trying to measure the perceptual distortions created by stereoscopic displays.


    This is a list of some of my papers that are on-line. The rest are either on their way, or have been superseded by later papers. A separate Bibliography contains a complete list of all of my publications, with full bibliographic references.

    1. Perceptual Effects in Aligning Virtual and Real Objects in Augmented Reality Displays
      (HFES 1997)

    2. Perceptual Issues in Augmented Reality
      (SPIE SD&A 1996)

    3. Telerobotic Control with Stereoscopic Augmented Reality
      (SPIE SD&A 1996)

    4. Merging Real and Virtual Worlds
      (IMAGINA 1995)

    5. Stereoscopic Vision and Augmented Reality
      (Scientific Computing and Automation 1993)

    6. Applications of Augmented Reality for Human-Robot Communication
      (IROS 1993)

    7. An Evaluation of Four 6 Degree-of-Freedom Input Techniques
      (ACM InterCHI 1993)

    8. Defence Teleoperation and Stereoscopic Video
      (SPIE SD&A 1993)

    9. Virtual Telerobotic Control
      (DND Advance Technologies Workshop 1993)

    10. ARGOS: A Display System for Augmenting Reality
      (ACM SIGGRAPH Technical Video Review 1993)

    11. Stereoscopic Video-Graphic Coordinate Specification System
      (USA Patent 5,175,616, 1992)

    12. Skill Acquisition and Task Performance in Teleoperation Using Monoscopic and Stereoscopic Video Remote Viewing (won the Best Student Paper award at HFS 1991)

    13. An Investigation of Monoscopic and Stereoscopic Video for Teleoperation
      (MASc thesis)

    14. Positioning Accuracy of a Virtual Stereographic Pointer in a Real Stereoscopic Video World
      (SPIE SD&A 1991)

    15. Learning Effects in Telemanipulation with Monoscopic versus Stereoscopic Remote Viewing
      (IEEE SMC Conference 1989)

    16. A short segment of a talk I gave to the IICS (International Interactive Communications Society) was broadcast around the world on CBC Radio's As It Happens.

    Last Update: Thursday 6 February 2003
    This page is www.drascic.net