Abstract: Piecewise Deterministic Markov Processes (PDMPs) describe deterministic dynamical systems whose parameters undergo random jumps, making them versatile tools for modeling complex stochastic phenomena. Yet simulating their trajectories can be computationally demanding, especially when the events of interest are rare. For a broad class of inference problems, an optimal sampling strategy can be characterized in terms of a generalized "committor function". We introduce a new adaptive importance sampling method designed to efficiently generate rare PDMP trajectories. The approach unfolds in two stages. First, in an offline phase, the PDMP is coarse-grained into a simpler graph-based process, enabling explicit computation of key quantities and yielding a low-cost approximation of the committor function. Then, in an online phase, trajectories are sampled from a proposal distribution guided by this approximation and iteratively refined via cross-entropy minimization. We provide asymptotic guarantees for the method and demonstrate its effectiveness by estimating the failure probability of a complex industrial system.
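To make the two-stage strategy concrete, the following Python sketch illustrates the idea on a toy surrogate, under strong simplifying assumptions that are not the paper's construction: the PDMP is replaced by a small continuous-time Markov chain on a graph (standing in for the coarse-grained process), the generalized committor reduces to an absorption probability obtained by solving a linear system, and the importance-sampling family is a one-parameter committor-guided tilting fitted by a grid-search cross-entropy step. All states, rates, and parameter names (`Q`, `h`, `theta`, ...) are illustrative.

```python
# Minimal sketch: offline committor on a coarse-grained graph process,
# then committor-guided importance sampling refined by cross-entropy.
import numpy as np

rng = np.random.default_rng(0)
RECOVER, FAIL, START, EPS = 0, 4, 1, 1e-6

# Jump rates of the (hypothetical) coarse-grained graph process;
# states 0 (recovery) and 4 (failure) are absorbing.
Q = np.array([
    [0.0, 0.0, 0.0,  0.0,  0.0 ],
    [1.0, 0.0, 0.05, 0.0,  0.0 ],
    [0.0, 1.0, 0.0,  0.05, 0.0 ],
    [0.0, 0.0, 1.0,  0.0,  0.05],
    [0.0, 0.0, 0.0,  0.0,  0.0 ],
])

def committor(Q):
    """P(hit FAIL before RECOVER) from each state: solve L h = 0 on the
    transient states with h(RECOVER) = 0, h(FAIL) = 1, L the generator."""
    n = Q.shape[0]
    L = Q - np.diag(Q.sum(axis=1))
    h = np.zeros(n); h[FAIL] = 1.0
    t = [i for i in range(n) if i not in (RECOVER, FAIL)]
    h[t] = np.linalg.solve(L[np.ix_(t, t)], -L[np.ix_(t, [FAIL])].ravel())
    return h

h = committor(Q)  # offline phase: explicit low-cost committor approximation

def jump_probs(i, theta):
    """Committor-guided proposal: theta = 0 is the nominal embedded chain,
    theta = 1 approximates the zero-variance Doob h-transform."""
    w = Q[i] * (h + EPS) ** theta
    return w / w.sum()

def sample_path(theta, max_jumps=200):
    """One embedded-chain trajectory; returns (hit FAIL?, log-weight, path)."""
    i, logw, path = START, 0.0, []
    for _ in range(max_jumps):
        if i in (RECOVER, FAIL):
            return i == FAIL, logw, path
        p0, p = jump_probs(i, 0.0), jump_probs(i, theta)
        j = rng.choice(len(p), p=p)
        logw += np.log(p0[j]) - np.log(p[j])  # importance-sampling log-weight
        path.append((i, j)); i = j
    return False, logw, path

theta, grid = 0.3, np.linspace(0.0, 2.0, 41)
for it in range(4):
    sims = [sample_path(theta) for _ in range(2000)]
    W = np.array([np.exp(lw) if hit else 0.0 for hit, lw, _ in sims])
    print(f"iter {it}: theta={theta:.2f}, p_hat={W.mean():.3e}")
    # Online phase: cross-entropy step, i.e. maximize the weighted
    # log-likelihood of the failing trajectories over the proposal family.
    fails = [(w, path) for w, (hit, _, path) in zip(W, sims) if hit]
    if fails:
        score = lambda th: sum(w * sum(np.log(jump_probs(i, th)[j])
                                       for i, j in path) for w, path in fails)
        theta = grid[np.argmax([score(th) for th in grid])]
print(f"exact committor value at start: {h[START]:.3e}")
```

In this toy setting the cross-entropy iterations drive `theta` toward 1, i.e. toward the h-transform induced by the committor, which is the zero-variance change of measure for this absorption probability; the printed estimates can be checked against the exact value `h[START]` computed offline. The paper's setting is considerably more general, and this sketch only mirrors the offline/online structure described above.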