Original poster: nandehutu2022

[Electrical Engineering and Systems Science] Identifying Corresponding Patches in SAR and Optical Images with a Pseudo-Siamese CNN

Posted by nandehutu2022 on 2022-3-9 09:31:20 (from mobile)

English title:
《Identifying Corresponding Patches in SAR and Optical Images with a Pseudo-Siamese CNN》
---
Authors:
Lloyd H. Hughes, Michael Schmitt, Lichao Mou, Yuanyuan Wang, and Xiao Xiang Zhu
---
Latest submission year:
2018
---
Classification:

Primary category: Electrical Engineering and Systems Science
Secondary category: Image and Video Processing
Description: Theory, algorithms, and architectures for the formation, capture, processing, communication, analysis, and display of images, video, and multidimensional signals in a wide variety of applications. Topics of interest include: mathematical, statistical, and perceptual image and video modeling and representation; linear and nonlinear filtering, de-blurring, enhancement, restoration, and reconstruction from degraded, low-resolution or tomographic data; lossless and lossy compression and coding; segmentation, alignment, and recognition; image rendering, visualization, and printing; computational imaging, including ultrasound, tomographic and magnetic resonance imaging; and image and video analysis, synthesis, storage, search and retrieval.
--
Primary category: Computer Science
Secondary category: Computer Vision and Pattern Recognition
Description: Covers image processing, computer vision, pattern recognition, and scene understanding. Roughly includes material in ACM Subject Classes I.2.10, I.4, and I.5.
--

---
English abstract:
  In this letter, we propose a pseudo-siamese convolutional neural network (CNN) architecture that enables to solve the task of identifying corresponding patches in very-high-resolution (VHR) optical and synthetic aperture radar (SAR) remote sensing imagery. Using eight convolutional layers each in two parallel network streams, a fully connected layer for the fusion of the features learned in each stream, and a loss function based on binary cross-entropy, we achieve a one-hot indication if two patches correspond or not. The network is trained and tested on an automatically generated dataset that is based on a deterministic alignment of SAR and optical imagery via previously reconstructed and subsequently co-registered 3D point clouds. The satellite images, from which the patches comprising our dataset are extracted, show a complex urban scene containing many elevated objects (i.e. buildings), thus providing one of the most difficult experimental environments. The achieved results show that the network is able to predict corresponding patches with high accuracy, thus indicating great potential for further development towards a generalized multi-sensor key-point matching procedure. Index Terms-synthetic aperture radar (SAR), optical imagery, data fusion, deep learning, convolutional neural networks (CNN), image matching, deep matching
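
The architecture described in the abstract maps naturally onto a short piece of deep-learning code. Below is a minimal PyTorch sketch of a pseudo-siamese patch-matching network. The paper specifies eight convolutional layers per stream, a fully connected fusion layer, and a binary cross-entropy loss; everything else here (the 64x64 single-channel patches, layer widths, kernel sizes, pooling placement, and the names PseudoSiameseNet and make_stream) is an illustrative assumption, not the authors' exact configuration. A single-logit BCEWithLogitsLoss is used as an equivalent of the two-class one-hot output mentioned in the abstract.

```python
# Minimal sketch of a pseudo-siamese CNN for SAR/optical patch matching.
# Layer widths, kernel sizes, pooling placement, and patch size are
# assumptions for illustration; the paper's exact setup may differ.
import torch
import torch.nn as nn

def make_stream() -> nn.Sequential:
    """One of the two non-weight-sharing streams: eight conv layers."""
    layers = []
    in_ch = 1  # single-channel input patch (assumption)
    for i, out_ch in enumerate([32, 32, 64, 64, 128, 128, 128, 128]):
        layers += [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
                   nn.ReLU(inplace=True)]
        if i % 2 == 1:  # downsample after every second conv (assumption)
            layers.append(nn.MaxPool2d(2))
        in_ch = out_ch
    return nn.Sequential(*layers)

class PseudoSiameseNet(nn.Module):
    def __init__(self, patch_size: int = 64):
        super().__init__()
        # "Pseudo"-siamese: the SAR and optical streams have identical
        # topology but do NOT share weights, since the two modalities
        # have very different image statistics.
        self.sar_stream = make_stream()
        self.opt_stream = make_stream()
        feat = 128 * (patch_size // 16) ** 2  # 4 pooling stages -> /16
        # Fusion: concatenate both feature vectors, then fully connected
        # layers ending in a single correspondence logit.
        self.fusion = nn.Sequential(
            nn.Linear(2 * feat, 512),
            nn.ReLU(inplace=True),
            nn.Linear(512, 1),
        )

    def forward(self, sar_patch, opt_patch):
        f_sar = self.sar_stream(sar_patch).flatten(1)
        f_opt = self.opt_stream(opt_patch).flatten(1)
        return self.fusion(torch.cat([f_sar, f_opt], dim=1))

# Training uses binary cross-entropy on the correspondence label.
net = PseudoSiameseNet()
criterion = nn.BCEWithLogitsLoss()
sar = torch.randn(4, 1, 64, 64)   # dummy SAR patches
opt = torch.randn(4, 1, 64, 64)   # dummy optical patches
labels = torch.tensor([[1.], [0.], [1.], [0.]])  # 1 = corresponding pair
loss = criterion(net(sar, opt), labels)
loss.backward()
```

The separate streams are the key design choice: unlike a true siamese network, the two branches do not share weights, reflecting the very different imaging geometry and statistics of SAR versus optical data.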
---
PDF link:
https://arxiv.org/pdf/1801.08467

Keywords: CNN SAR Architecture Segmentation Presentation neural corresponding optical contains learning
