All Papers
34,598 papers found • Page 103 of 692
DisPose: Disentangling Pose Guidance for Controllable Human Image Animation
Hongxiang Li, Yaowei Li, Yuhang Yang et al.
DiSRT-In-Bed: Diffusion-Based Sim-to-Real Transfer Framework for In-Bed Human Mesh Recovery
Jing Gao, Ce Zheng, Laszlo Jeni et al.
Disrupting Model Merging: A Parameter-Level Defense Without Sacrificing Accuracy
Junhao Wei, Yu Zhe, Jun Sakuma
Dissecting Adversarial Robustness of Multimodal LM Agents
Chen Wu, Rishi Shah, Jing Yu Koh et al.
Dissecting and Mitigating Diffusion Bias via Mechanistic Interpretability
Yingdong Shi, Changming Li, Yifan Wang et al.
Dissecting Generalized Category Discovery: Multiplex Consensus under Self-Deconstruction
Luyao Tang, Kunze Huang, Yuxuan Yuan et al.
Dissecting Submission Limit in Desk-Rejections: A Mathematical Analysis of Fairness in AI Conference Policies
Yuefan Cao, Xiaoyu Li, Yingyu Liang et al.
Diss-l-ECT: Dissecting Graph Data with Local Euler Characteristic Transforms
Julius von Rohrscheidt, Bastian Rieck
DiST-4D: Disentangled Spatiotemporal Diffusion with Metric Depth for 4D Driving Scene Generation
Jiazhe Guo, Yikang Ding, Xiwu Chen et al.
Distance Adaptive Beam Search for Provably Accurate Graph-Based Nearest Neighbor Search
Yousef Al-Jazzazi, Haya Diwan, Jinrui Gou et al.
Distance-Based Tree-Sliced Wasserstein Distance
Viet-Hoang Tran, Minh-Khoi Nguyen-Nhat, Trang Pham et al.
Distance-informed Neural Processes
Aishwarya Venkataramanan, Joachim Denzler
Distances Between Top-Truncated Elections of Different Sizes
Piotr Faliszewski, Jitka Mertlová, Pierre Nunn et al.
Distances for Markov chains from sample streams
Sergio Calo, Anders Jonsson, Gergely Neu et al.
DISTA-Net: Dynamic Closely-Spaced Infrared Small Target Unmixing
Shengdong Han, Shangdong Yang, Yuxuan Li et al.
DISTIL: Data-Free Inversion of Suspicious Trojan Inputs via Latent Diffusion
Hossein Mirzaei, Zeinab Taghavi, Sepehr Rezaee et al.
Distil-E2D: Distilling Image-to-Depth Priors for Event-Based Monocular Depth Estimation
Jie Long Lee, Gim Hee Lee
Distillation of Discrete Diffusion through Dimensional Correlations
Satoshi Hayakawa, Yuhta Takida, Masaaki Imaizumi et al.
Distillation Robustifies Unlearning
Bruce W. Lee, Addie Foote, Alex Infanger et al.
Distillation Scaling Laws
Dan Busbridge, Amitis Shidani, Floris Weers et al.
DistillDrive: End-to-End Multi-Mode Autonomous Driving Distillation by Isomorphic Hetero-Source Planning Model
Rui Yu, Xianghang Zhang, Runkai Zhao et al.
Distilled Decoding 1: One-step Sampling of Image Auto-regressive Models with Flow Matching
Enshu Liu, Xuefei Ning, Yu Wang et al.
Distilled Decoding 2: One-step Sampling of Image Auto-regressive Models with Conditional Score Distillation
Enshu Liu, Qian Chen, Xuefei Ning et al.
Distilled Prompt Learning for Incomplete Multimodal Survival Prediction
Yingxue Xu, Fengtao Zhou, Chenyu Zhao et al.
DistillHGNN: A Knowledge Distillation Approach for High-Speed Hypergraph Neural Networks
Saman Forouzandeh, Parham Moradi Dowlatabadi, Mahdi Jalili
Distilling Dataset into Neural Field
Donghyeok Shin, HeeSun Bae, Gyuwon Sim et al.
Distilling Diffusion Models to Efficient 3D LiDAR Scene Completion
Shengyuan Zhang, An Zhao, Ling Yang et al.
Distilling Knowledge from Heterogeneous Architectures for Semantic Segmentation
Yanglin Huang, Kai Hu, Yuan Zhang et al.
Distilling LLM Agent into Small Models with Retrieval and Code Tools
Minki Kang, Jongwon Jeong, Seanie Lee et al.
Distilling LLM Prior to Flow Model for Generalizable Agent’s Imagination in Object Goal Navigation
Badi Li, Ren-Jie Lu, Yu Zhou et al.
Distilling Long-tailed Datasets
Zhenghao Zhao, Haoxuan Wang, Yuzhang Shang et al.
Distilling Monocular Foundation Model for Fine-grained Depth Completion
Yingping Liang, Yutao Hu, Wenqi Shao et al.
Distilling Multi-modal Large Language Models for Autonomous Driving
Deepti Hegde, Rajeev Yasarla, Hong Cai et al.
Distilling Parallel Gradients for Fast ODE Solvers of Diffusion Models
Beier Zhu, Ruoyu Wang, Tong Zhao et al.
Distilling Reinforcement Learning Algorithms for In-Context Model-Based Planning
Jaehyeon Son, Soochan Lee, Gunhee Kim
Distilling Spatially-Heterogeneous Distortion Perception for Blind Image Quality Assessment
Xudong Li, Wenjie Nie, Yan Zhang et al.
Distilling Spectral Graph for Object-Context Aware Open-Vocabulary Semantic Segmentation
Chanyoung Kim, Dayun Ju, Woojung Han et al.
Distilling Structural Representations into Protein Sequence Models
Jeffrey Ouyang-Zhang, Chengyue Gong, Yue Zhao et al.
Distilling Structured Rationale from Large Language Models to Small Language Models for Abstractive Summarization
Linyong Wang, Lianwei Wu, Shaoqi Song et al.
Distilling the Knowledge in Data Pruning
Emanuel Ben Baruch, Adam Botach, Igor Kviatkovsky et al.
DistiLLM-2: A Contrastive Approach Boosts the Distillation of LLMs
Jongwoo Ko, Tianyi Chen, Sungnyun Kim et al.
DisTime: Distribution-based Time Representation for Video Large Language Models
Yingsen Zeng, Zepeng Huang, Yujie Zhong et al.
DistinctAD: Distinctive Audio Description Generation in Contexts
Bo Fang, Wenhao Wu, Qiangqiang Wu et al.
Distinguishing Cause from Effect with Causal Velocity Models
Johnny Xi, Hugh Dance, Peter Orbanz et al.
Distinguish Then Exploit: Source-free Open Set Domain Adaptation via Weight Barcode Estimation and Sparse Label Assignment
Weiming Liu, Jun Dan, Fan Wang et al.
Dist Loss: Enhancing Regression in Few-Shot Region through Distribution Distance Constraint
Guangkun Nie, Gongzheng Tang, Shenda Hong
Distortion of AI Alignment: Does Preference Optimization Optimize for Preferences?
Paul Gölz, Nika Haghtalab, Kunhe Yang
Distraction is All You Need for Multimodal Large Language Model Jailbreaking
Zuopeng Yang, Jiluan Fan, Anli Yan et al.
Distributed Conformal Prediction via Message Passing
Haifeng Wen, Hong Xing, Osvaldo Simeone
Distributed Differentially Private Data Analytics via Secure Sketching
Jakob Burkhardt, Hannah Keller, Claudio Orlandi et al.