Implicit and Learned Models

Robots must learn from experience and data to operate efficiently in unmodeled, unknown, and previously unseen domains. There are many methods for learning implicit models of the world, capturing everything from 3D scene reconstruction to collision-risk quantification to task constraints inferred from human demonstrations. There are endless opportunities for integrating these models within existing algorithmic frameworks or new neurosymbolic approaches, generalizing planning capabilities to problems previously considered intractable.
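One common implicit model is a signed distance function (SDF), which represents geometry as a function mapping a query point to its distance from the nearest surface; planners query it to check collisions. A minimal analytic sketch is below (a learned model, such as a neural SDF, would replace the hand-written function with a trained network; the names and margin value here are illustrative, not from any specific paper):

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 0.0), radius=0.5):
    """Signed distance to a sphere: negative inside, zero on the
    surface, positive outside."""
    return math.dist(p, center) - radius

def in_collision(p, sdf, margin=0.05):
    """A point is treated as colliding if it lies within `margin`
    of the surface (or inside it)."""
    return sdf(p) < margin

# A motion planner would evaluate such queries along candidate paths:
print(in_collision((0.0, 0.0, 0.0), sphere_sdf))  # inside the sphere
print(in_collision((2.0, 0.0, 0.0), sphere_sdf))  # well clear of it
```

A stochastic variant (as in the 2024 ICRA paper below) would additionally return an uncertainty estimate with each distance query, letting the planner bound collision risk under sensing noise.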


2025

  1. arXiv
    Parallel Heuristic Search as Inference for Actor-Critic Reinforcement Learning Models
    Under Review
  2. arXiv
    Look as You Leap: Planning Simultaneous Motion and Perception for High-DoF Robots
    Under Review
  3. arXiv
    Parallel Simulation of Contact and Actuation for Soft Growing Robots
    Under Review
  4. arXiv
    Variational Shape Inference for Grasp Diffusion on SE(3)
    Under Review
  5. Workshop
    Faster Behavior Cloning with Hardware-Accelerated Motion Planning
    In IEEE ICRA 2025 Workshop: RoboARCH, Robotics Acceleration with Computing Hardware and Systems

2024

  1. Abstract
    Perception-aware Planning for Robotics: Challenges and Opportunities
    In 40th Anniversary of the IEEE Conference on Robotics and Automation (ICRA@40)
  2. Stochastic Implicit Neural Signed Distance Functions for Safe Motion Planning under Sensing Uncertainty
    In IEEE International Conference on Robotics and Automation

2023

  1. Object Reconfiguration with Simulation-Derived Feasible Actions
    In IEEE International Conference on Robotics and Automation

2021

  1. Learning Sampling Distributions Using Local 3D Workspace Decompositions for Motion Planning in High Dimensions
    In IEEE International Conference on Robotics and Automation