Todd C. Doehring, Ph.D.
(Graduate, Mechanical and Bioengineering Depts., University of Pittsburgh)
CEO, ABEMIS LLC
tcd@abemis.com or tcdoeh@gmail.com
phone: 215.385.4568
Summary: Mechanical Engineer, Bioengineer, Programmer and Artist/Musician.
Interests and Capabilities: Motorized/automated robotic microscopy, 3-D imaging, 3-D modeling/printing, micro-testing systems, non-linear elasticity/viscoelasticity, Finite Element Analysis, Virtual Reality Environments/Systems.
Following are brief descriptions of ongoing projects.
Project 1 : Automated Microscopy Systems
Problem: Automated microscopy systems such as those produced by Olympus, Zeiss, Leica/Aperio, or Nikon Inc. are extremely useful for large-scale histology, pathology, basic research, and material or implant inspection, to name just a few applications. However, these systems are expensive (e.g. $80-250k+), a price that is out of reach for smaller rural hospitals, clinics, small businesses, small bioengineering/biology laboratories (like mine), and certainly for enthusiasts. In addition, the imaging and control software is typically closed and proprietary, which restricts development of novel applications.
Solution: I have developed affordable automated microscopy systems that combine newly available USB3.0 interfaces, custom stepper motors, innovative mechanical designs, and high-resolution imaging. Shown below are the new MezoScope V.3 and AutoScope V.2 systems. Both have full 3-axis automation with software for mosaic (tiled) imaging and tele-medicine applications. The software for imaging and motor control is provided (and developed) using the .NET framework (C#).
Brackets and parts are built using new metal/ABS 3-D printing technology, significantly reducing (or even eliminating) expensive machining and tooling costs.
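As a rough illustration of how the tiled (mosaic) acquisition in the .NET control software can work, here is a minimal C# sketch that steps the stage across a grid and captures an image at each position. The stage and camera interfaces, method names, and parameters are hypothetical placeholders for this sketch, not the actual ABEMIS API.

using System.IO;

// Hypothetical hardware abstractions for illustration only; the shipped
// software exposes its own stage and camera interfaces.
public interface IStage { void MoveTo(double xMm, double yMm); }
public interface ICamera { System.Drawing.Bitmap Capture(); }

public sealed class MosaicAcquirer
{
    private readonly IStage stage;     // X-Y stepper control
    private readonly ICamera camera;   // USB3.0 camera

    public MosaicAcquirer(IStage stage, ICamera camera)
    {
        this.stage = stage;
        this.camera = camera;
    }

    // Scan a rows x cols grid, stepping by the field of view minus an
    // overlap margin so adjacent tiles can be stitched later.
    public void AcquireMosaic(double originXmm, double originYmm,
                              int rows, int cols,
                              double fieldOfViewMm, double overlapMm,
                              string outputFolder)
    {
        double step = fieldOfViewMm - overlapMm;
        for (int r = 0; r < rows; r++)
        {
            for (int c = 0; c < cols; c++)
            {
                stage.MoveTo(originXmm + c * step, originYmm + r * step);
                using (var tile = camera.Capture())
                {
                    tile.Save(Path.Combine(outputFolder, $"tile_r{r}_c{c}.png"));
                }
            }
        }
    }
}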
- - -
The new MezoScope V.3 system:
Update: The V.3 system has recently been successfully installed at TechCyte, Inc. (Salt Lake City, UT) with the new high-precision stepper motors/controllers and USB3.0 camera. We are achieving up to 100x oil immersion with fast imaging.
Here is a recent large-scale blood smear example image (thumbnail below).
(Note: this is a large image, a ~4+ Mb jpg reduced 60%.)
MezoScope V.3 Features:
- Full X-Y-Z automation using inexpensive, accurate micro-stepper motor control
- Switchable manual and full-auto imaging. In manual mode, just like a standard scope!*
- 2x to 100x oil immersion capable with custom illumination including phase contrast and polarization
- Customizable software for novel application development
- Fast, newly available USB3.0 control and imaging modules up to 14Mpix at 30fps
* Note: the currently available automated microscopy systems do not have manual control. Our system/design enables both manual and automated imaging, allowing the user to manually and rapidly identify regions or objects (as they would with a standard scope!) and then, with a simple button press, automatically acquire large-scale tiles/mosaics of the area of interest.
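To make the "press a button to tile the area of interest" workflow concrete, the sketch below shows one simple way to compute the tile grid that covers a manually selected region. The field-of-view and overlap values are assumptions for illustration, not specifications of the MezoScope software.

public static class MosaicPlanner
{
    // Given a manually selected region of interest (in mm) and the camera
    // field of view, compute how many overlapping tiles are needed.
    public static (int rows, int cols) TileGridFor(
        double roiWidthMm, double roiHeightMm,
        double fieldOfViewMm = 1.0,   // assumed FOV at the chosen objective
        double overlapMm = 0.1)       // assumed stitching overlap
    {
        double step = fieldOfViewMm - overlapMm;                  // effective stride
        int cols = (int)System.Math.Ceiling(roiWidthMm / step);
        int rows = (int)System.Math.Ceiling(roiHeightMm / step);
        return (System.Math.Max(rows, 1), System.Math.Max(cols, 1));
    }
}

With these example values, a 5 mm x 5 mm region would require a 6 x 6 grid, which can then be handed to the acquisition loop sketched earlier.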
See the online GALLERY for many more example images, and the ZOOM gallery for gigabyte sample images.
The AutoScope V.2 System:
The new AutoScope system is a compact, fully automated microscopy system for slide/pathology analyses, material inspection, and a wide range of microscopy applications. Custom imaging/optics are available as well as custom imaging software (.NET).
Features:
- Compact, customizable X-Y-Z gantry, controlled using micro-stepping (custom) motors and precision lead screws (see the motion sketch after this list)
- Shown above is the histology/pathology (slide mount) configuration, but since we develop and build our systems using 3-D printing processes, a wide variety of custom specimen-handling fixtures, gantries, and supports can be rapidly built
- Fast USB3.0 imaging and control, with provided software
- Our open architecture/design allows imaging of large and irregular specimens, unlike 'slide scanners' such as the Aperio system(s), which are restricted to slides only
- Tele-inspection and tele-medicine applications are under development with web-based GUI and control
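As a companion to the gantry description above (the motion sketch referenced in the list), here is a hedged C# sketch of the basic lead-screw arithmetic for converting a commanded move into microsteps. The pitch, step angle, and microstepping values are typical example numbers, not the actual AutoScope hardware specifications.

public static class AxisMath
{
    // Convert a desired linear move (mm) into a microstep count for one axis.
    public static long StepsForMove(double distanceMm,
                                    double leadScrewPitchMmPerRev = 2.0,  // assumed pitch
                                    int fullStepsPerRev = 200,            // typical 1.8-degree motor
                                    int microstepsPerFullStep = 16)       // assumed driver setting
    {
        double stepsPerMm = fullStepsPerRev * microstepsPerFullStep / leadScrewPitchMmPerRev;
        return (long)System.Math.Round(distanceMm * stepsPerMm);
    }
}

With these example values, one millimeter of travel corresponds to 1,600 microsteps, i.e. a theoretical resolution of 0.625 microns per microstep.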
Further development, research, and investment/partnership:
Our MezoScope and AutoScope systems are currently in third-generation production. Systems are available now, and custom/research configurations are encouraged! Our goal is to provide systems with wide-ranging hardware/software capabilities and options, including customizable optics, lighting, software (OpenCV, etc.), and tele-medicine applications. We are actively looking for investors and researchers who are interested in partnership and/or further development (e.g. SBIR). I believe that our systems have great potential to fill a major gap between low-cost microscopes and the very expensive, restrictive high-end automated systems currently available. In addition, our systems have other major advantages: they have few moving parts, making them very durable and portable compared to standard automated microscopes, and we offer multiple lighting designs enabling applications in part, circuit, and material inspection. The frame/gantry is CAD-designed and 3-D printed using carbon-reinforced ABS, so it is very light, strong, and inexpensive to produce in custom configurations or low volumes compared to standard manufacturing processes (e.g. injection molding).
Please email or phone for any inquiries, including investment or partnership, or for any other questions at all: tcd@abemis.com.
Project 2 : Micro 3-D Imaging and Bio-Scaffolds
In development are 3-D microscopy imaging and 3-D modeling tools for imaging/analysis of tissue-engineered scaffolds at micro (cellular) and 'mezoscopic' (extra-cellular) scales. Shown below is an example 'slice' of a 3-D confocal image (left) and the corresponding 3-D reconstruction (right; note that the reconstruction is rotated 180 deg.).
From the images (left) and the 3-D reconstruction (right), a variety of topological and biomechanical measures can be assessed, such as cell/matrix contact surface area, tortuosity, and other parameters.
This example uses confocal images, but images from any source can be used, such as the new MezoScope, a fluoroscope, microCT, or others. I use adaptive thresholding and multi-domain automated meshing (with sliver removal) to obtain high-fidelity meshes that retain local topologies. Below are related projects/works. New scaffolds based on 3-D printing technologies (also below) are in development as well.
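For readers curious about the adaptive thresholding step mentioned above, here is a minimal C# sketch of a local-mean threshold applied to a grayscale slice. The window size and offset are illustrative defaults, and the actual segmentation pipeline may differ (a production version would use an integral image or OpenCV rather than this naive loop).

public static class AdaptiveThreshold
{
    // Mark a pixel as foreground if it is brighter than the mean of its
    // local neighborhood by at least 'offset' gray levels.
    public static bool[,] Apply(byte[,] gray, int halfWindow = 15, int offset = 5)
    {
        int h = gray.GetLength(0), w = gray.GetLength(1);
        var mask = new bool[h, w];
        for (int y = 0; y < h; y++)
        {
            for (int x = 0; x < w; x++)
            {
                long sum = 0;
                int count = 0;
                for (int dy = -halfWindow; dy <= halfWindow; dy++)
                {
                    for (int dx = -halfWindow; dx <= halfWindow; dx++)
                    {
                        int yy = y + dy, xx = x + dx;
                        if (yy >= 0 && yy < h && xx >= 0 && xx < w)
                        {
                            sum += gray[yy, xx];
                            count++;
                        }
                    }
                }
                double localMean = (double)sum / count;
                mask[y, x] = gray[y, x] > localMean + offset;
            }
        }
        return mask;
    }
}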
Project 3 : Multi-Domain Automated FE Meshing
Automated mesh generation for complex objects, with a robust sliver removal algorithm.
Problem: Finite Element (FE) analysis is the most powerful and widely used method for physical analysis and design, whether in the automotive, aeronautic, architecture, or bioengineering industries, or almost any other. Software for FE is a multi-billion dollar industry in itself, with companies such as ANSYS, ABAQUS, NASTRAN, COMSOL, Solidworks, Materialise, Simpleware, and many others providing high-quality analysis software. However, even with all of these 'major players', the problem of "sliver-free mesh generation" remains a significant challenge, particularly for complex shapes with multiple domains such as the vertebral body shown below.
Mesh generation is the first step for any FEA. The goal is to convert a solid part design or 3-D image into a 'mesh' that is accurate to the original geometries. Note that these are not surface meshes, but are volumetric, solid meshes. Shown below are some example meshes. A Google search (HERE) shows many meshes/analyses.
The following meshes were generated using my new software (VolMesh).
I have developed a system that directly generates robust tetrahedral meshes from multi-domain volumetric images (such as you get from CT/MRI, or any other 3-D images such as confocal microscopy) with locally optimized mesh density and other controllable parameters. The goal is to provide automated multi-domain mesh generation with high boundary fidelity, and to minimize the major problem of 'slivers'.
Slivers: Despite many theoretical and programming improvements in mesh generation, 'slivers' remain a challenging problem. Slivers are nearly flat tetrahedra that are an unavoidable result of the Delaunay algorithm for tetrahedral mesh generation (examples here). There has been much research attempting to reduce or remove slivers from FE meshes, with mixed results. An excellent thesis on this topic is the work of Xiangyang Li (PDF here), who describes slivers as below:
A New Approach: Instead of using current theoretical methods, I have developed a 'brute-force' method for sliver removal that works even for multi-domain meshes. Briefly, the method begins by analyzing all slivers in the mesh and their nearest face-neighbors. An intelligent algorithm then heuristically categorizes the 'type' of each sliver (either simple or complex), generates 'most convex neighbor manifolds', and reconstructs the local mesh, removing the sliver(s). Slivers at multi-domain boundaries (which occur often) are specially handled to preserve the integrity of the boundary.
In part, this method is made possible by recent advances in computing power, fast/inexpensive memory, and GPU-enabled computation. Even 3 years ago, my method was computationally prohibitive, but solutions are now achievable even for large multi-domain meshes on a 64-bit PC (8Gb RAM) in reasonable time (e.g. < 20 min for a 1M+ element mesh). Still, there is a lot more work to do. For example, the algorithm handles most slivers independently, so it is parallelizable, with potentially large speed improvements.
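To make the sliver problem concrete, below is a hedged C# sketch of one common way to flag sliver tetrahedra: a dimensionless quality ratio of volume to mean edge length cubed, which stays near zero for slivers even when all edges are well sized. The metric and threshold here are illustrative choices, not necessarily those used in VolMesh.

public static class TetQuality
{
    // Each vertex is a double[3] {x, y, z}.
    static double[] Sub(double[] p, double[] q) =>
        new[] { p[0] - q[0], p[1] - q[1], p[2] - q[2] };

    static double[] Cross(double[] u, double[] v) =>
        new[] { u[1] * v[2] - u[2] * v[1],
                u[2] * v[0] - u[0] * v[2],
                u[0] * v[1] - u[1] * v[0] };

    static double Dot(double[] u, double[] v) => u[0] * v[0] + u[1] * v[1] + u[2] * v[2];

    static double Length(double[] u) => System.Math.Sqrt(Dot(u, u));

    // Tetrahedron volume: |((b-a) x (c-a)) . (d-a)| / 6
    public static double Volume(double[] a, double[] b, double[] c, double[] d) =>
        System.Math.Abs(Dot(Cross(Sub(b, a), Sub(c, a)), Sub(d, a))) / 6.0;

    // Dimensionless quality: volume normalized by mean edge length cubed.
    // A regular tetrahedron scores about 0.118; slivers score near zero.
    public static double Quality(double[] a, double[] b, double[] c, double[] d)
    {
        double meanEdge = (Length(Sub(a, b)) + Length(Sub(a, c)) + Length(Sub(a, d)) +
                           Length(Sub(b, c)) + Length(Sub(b, d)) + Length(Sub(c, d))) / 6.0;
        return Volume(a, b, c, d) / (meanEdge * meanEdge * meanEdge);
    }

    // Illustrative cutoff only; a real mesher would tune this (and would
    // also check dihedral angles) before attempting removal.
    public static bool IsSliver(double[] a, double[] b, double[] c, double[] d,
                                double threshold = 0.02) =>
        Quality(a, b, c, d) < threshold;
}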
I invite and welcome all inquiries! Please email or phone.
Project 4 : "Meshagons!"
The first-ever 3-D printed, force-optimized Finite Element tetrahedral mesh structures.
I have always been attracted to the artistic beauty of FE meshes (as images and potential sculptures). Recently, I had a thought... what if I could actually physically create these mesh structures? After much work developing code to generate controllable meshes that are manifold and suitable for printing, I now have software that can, for the first time, convert most 3-D objects into manifold, printable mesh structures. These structures are not only artistic but also optimally strong and light, with applications to automotive, aerospace, and related engineering, bioengineering, and architecture, to name a few.
Shown here are the first-ever 3-D printed meshes of FE structures. The meshes can now be printed in a variety of materials, including ABS, titanium, and stainless steel. There is much more work to be done regarding optimization and general coding. The meshing code currently requires a powerful computer, such as my 8-processor, 16Gb RAM system; with this system a meshagon such as those below can be created in approximately four hours of computing time. Current work is focused on improving the code (speed improvements) and enabling much larger-scale meshes. Please email me for further information (tcdoeh@gmail.com).
Featured in 3Dprinter, 3dprintplan, Fabaloo.com, and other sites!
See my new Sketchfab page for Interactive 3-D viewing of selected models.
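For a sense of how a tetrahedral FE mesh becomes a printable strut structure, the sketch below collects the unique edges of a tet mesh; each edge can then be swept into a beam/strut for printing. This is only the first step and assumes a simple 4-node-per-tet index list; the real Meshagon code must also guarantee a manifold, watertight surface, which is not shown here.

using System.Collections.Generic;

public static class MeshEdges
{
    // Each tet is an int[4] of global node indices.
    public static HashSet<(int, int)> UniqueEdges(IEnumerable<int[]> tets)
    {
        var edges = new HashSet<(int, int)>();
        // The 6 edges of a tetrahedron as local vertex index pairs.
        int[,] local = { { 0, 1 }, { 0, 2 }, { 0, 3 }, { 1, 2 }, { 1, 3 }, { 2, 3 } };
        foreach (var t in tets)
        {
            for (int e = 0; e < 6; e++)
            {
                int i = t[local[e, 0]], j = t[local[e, 1]];
                // Canonical (low, high) ordering stores each shared edge once.
                edges.Add(i < j ? (i, j) : (j, i));
            }
        }
        return edges;
    }
}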
New: Jewelry Meshagons in rustic or polished bronze, gold, and silver are lovely. Custom, one-of-a-kind designs with wide ranging sizes and shapes are available. I can work with you to create your own unique piece.
Contact me via phone or email for any inquiries.
Project 5 : VR-Ultra Ultrasound simulation/training
3-D tools enable training and tele-diagnosis for ultrasound imaging.
Primary applications are endovaginal and osteoreconstruction procedures, but further applications are possible.
Problem: Training for endoscopic ultrasound procedures is very difficult, particularly for smaller hospitals in rural areas. Although such training is government mandated, persons willing to participate as training subjects are rare, particularly for endovaginal or esophageal procedures (due to obvious discomfort).
Solution: Virtual Reality (VR) is ideally suited to provide basic hands-on training without the need for a live subject. Our system uses a Phantom OMNI haptics device (formerly Sensable, Inc., now Geomagic, Inc.) to provide force feedback. This haptic device provides a realistic 'feel' for the procedure. The system is not intended as an 'end solution', but rather as a bridge to teach the fundamentals of wand placement, develop required hand-eye coordination, and improve understanding of the ultrasound images. An important aspect of this system is that the technology can be broadly applied to many VR training and development applications not only for tele-medicine but also for 3-D voxel modeling/training/design in general.
The trainee is guided through various tasks/tests, and because our fully VR system provides both the trainee's motion and force data (as well as feedback), the trainee's performance can be stored and compared with professional technician performance. This is a significant advantage over standard training procedures, which cannot 'track' performance and rely only on subjective evaluations. In addition, the system is designed for tele-training, so the professional trainer does not have to be 'on site'.
Competition: A similar training system is available from MedaPhor Inc. (UK based); however, it relies on preloaded 'movies', with the wand acting only as a trigger. Our system is very different in that we generate the simulated ultrasound in real-time directly from 3-D voxel MRI data. Thus, we can rapidly introduce various anatomies, disease states, and phantoms for a comprehensive training experience.
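As a simplified illustration of generating a simulated ultrasound image directly from voxel data, the C# sketch below samples a 3-D volume along a fan of rays in the plane of the wand. It is a stand-in only: real-time B-mode synthesis also has to model attenuation, speckle, and reflection, and the parameter names here are assumptions rather than the actual simulator interface.

public static class UltrasoundSlice
{
    // volume[z, y, x] holds voxel intensities (e.g. from MRI). origin is the
    // wand tip; axial and lateral are orthonormal vectors spanning the
    // imaging plane, all in voxel coordinates.
    public static byte[,] Render(byte[,,] volume,
                                 double[] origin, double[] axial, double[] lateral,
                                 int beams = 128, int samplesPerBeam = 256,
                                 double fanHalfAngleRad = 0.6, double depthVoxels = 200.0)
    {
        int nz = volume.GetLength(0), ny = volume.GetLength(1), nx = volume.GetLength(2);
        var image = new byte[samplesPerBeam, beams];

        for (int b = 0; b < beams; b++)
        {
            // Spread the beams evenly across the fan angle.
            double theta = -fanHalfAngleRad + 2.0 * fanHalfAngleRad * b / (beams - 1);
            double cos = System.Math.Cos(theta), sin = System.Math.Sin(theta);

            for (int s = 0; s < samplesPerBeam; s++)
            {
                double r = depthVoxels * s / (samplesPerBeam - 1);   // distance along the beam
                double x = origin[0] + r * (cos * axial[0] + sin * lateral[0]);
                double y = origin[1] + r * (cos * axial[1] + sin * lateral[1]);
                double z = origin[2] + r * (cos * axial[2] + sin * lateral[2]);

                int xi = (int)System.Math.Round(x);
                int yi = (int)System.Math.Round(y);
                int zi = (int)System.Math.Round(z);

                // Nearest-neighbor lookup; rays leaving the volume read as black.
                image[s, b] = (xi >= 0 && xi < nx && yi >= 0 && yi < ny && zi >= 0 && zi < nz)
                    ? volume[zi, yi, xi]
                    : (byte)0;
            }
        }
        return image;
    }
}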
Current development includes extension to esophageal and other ultrasound procedures.
Note that this project/software system is licensed to Net Simulation, Inc., an Istanbul, Turkey based company, for further development and distribution in the Turkey and Eastern Europe regions. Licensing for the US and other regions is available.
Applications to tele-diagnosis, where a professional doctor or technician can remotely view, review, and examine ultrasound sessions in real time, are under development.
Final thoughts: These are some of my current projects, including new affordable automated microscopy systems, micro-testing systems, and improved methods/algorithms for mesh generation and 3-D printing of force-optimized Meshagon structures.
I believe that there are great opportunities for these projects -- not only for investments/grants, but also for further research/development, enabling exciting new interactive 3-D training tools and teaching curricula (more updates to come, e.g. Oculus Rift and the newly FREE Unreal Engine, with applications to engineering/architecture, implant and tissue bioengineering, scaffolds, etc.).
For more information, input, or interest, feel free to email me at tcdoeh@gmail.com (personal) or tcd@abemis.com,
or give me a call at 215-385-4568. Please leave a message and I will get back to you ASAP.
Thank you for reading,
Todd C. Doehring