人大经济论坛 (Renmin University Economics Forum) — posts tagged "transform"

Tag: transform

Related posts

| Title | Board | Author | Posted | Replies / Views | Last post |
| --- | --- | --- | --- | --- | --- |
| ICCV: VidTr Video Transformer Without Convolutions (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-11 | 0 / 102 | Mujahida 2025-8-11 15:27:44 |
| ICCV: Visual Saliency Transformer (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-11 | 0 / 97 | Mujahida 2025-8-11 15:10:55 |
| ICCV: Visual Transformers Where Do Transformers Really Belong in Vision Models (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-11 | 0 / 91 | Mujahida 2025-8-11 14:54:04 |
| ICCV: Vision-Language Transformer and Query Generation for Referring Segmentation (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-11 | 0 / 91 | Mujahida 2025-8-11 14:48:23 |
| ICCV: WB-DETR Transformer-Based Detector Without Backbone (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-11 | 0 / 104 | Mujahida 2025-8-11 14:14:38 |
| ICCV: Vision Transformers for Dense Prediction (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-11 | 0 / 83 | Mujahida 2025-8-11 13:46:31 |
| ICCV: ViViT A Video Vision Transformer (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-11 | 0 / 94 | Mujahida 2025-8-11 13:40:55 |
| ICCV: Voxel Transformer for 3D Object Detection (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-11 | 0 / 131 | Mujahida 2025-8-11 13:32:30 |
| 700 core words to memorize for the new CET-4 sprint (attachment) | 经管文库(原现金交易版) | Luce2030 | 2025-8-11 | 0 / 96 | Luce2030 2025-8-11 09:01:59 |
| ICML: Catformer Designing Stable Transformers via Sensitivity Analysis (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-11 | 0 / 72 | Mujahida 2025-8-11 07:53:19 |
| Computer Vision: SE(3)-Transformers 3D Roto-Translation Equivariant Attention Networks (attachment) | 经管文库(原现金交易版) | 2023Hua | 2025-8-11 | 0 / 149 | 2023Hua 2025-8-11 07:48:25 |
| ICML: ConViT Improving Vision Transformers with Soft Convolutional Inductive ... (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-11 | 0 / 73 | Mujahida 2025-8-11 07:43:51 |
| Computer Vision: Self-Supervised Graph Transformer on Large-Scale Molecular Data (attachment) | 经管文库(原现金交易版) | Barda-2025 | 2025-8-10 | 0 / 70 | Barda-2025 2025-8-10 19:17:53 |
| Computer Vision: Self-Learning Transformations for Improving Gaze and Head Redirection (attachment) | 经管文库(原现金交易版) | Barda-2025 | 2025-8-10 | 0 / 73 | Barda-2025 2025-8-10 19:04:21 |
| ICML: CATE Computation-aware Neural Architecture Encoding with Transformers (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-10 | 0 / 135 | Mujahida 2025-8-10 18:02:56 |
| ICML: Differentiable Spatial Planning using Transformers (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-10 | 0 / 87 | Mujahida 2025-8-10 17:15:33 |
| ICML: Linear Transformers Are Secretly Fast Weight Programmers (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-10 | 0 / 73 | Mujahida 2025-8-10 13:20:34 |
| ICML: LieTransformer Equivariant Self-Attention for Lie Groups (attachment) | 经管文库(原现金交易版) | Mujahida | 2025-8-10 | 0 / 108 | Mujahida 2025-8-10 09:06:02 |
| Woodbury Transformations for Deep Generative Flows (attachment) | 经管文库(原现金交易版) | Kaka-2030 | 2025-8-8 | 0 / 66 | Kaka-2030 2025-8-8 11:50:38 |
| NSFC project completion dates and output accounting | 数据交流中心 | ewfwedwd | 2025-8-5 | 0 / 6102 | ewfwedwd 2025-8-5 18:26:50 |