Materials Map

Discover the materials research landscape. Find experts, partners, networks.

  • About
  • Privacy Policy
  • Legal Notice
  • Contact

The Materials Map is an open tool for improving networking and interdisciplinary exchange within materials research. It enables cross-database searches for cooperation and network partners and helps users explore the research landscape.

The dashboard provides detailed information about the selected scientist, e.g. their publications. It can be filtered and visualizes the relationships to co-authors in several diagrams. In addition, a link is provided for finding contact information.


Materials Map under construction

The Materials Map is still under development. In its current state it is based on a single data source and is therefore incomplete and contains duplicates. We are working on incorporating additional open data sources such as ORCID to improve the quality and timeliness of our data. We will update the Materials Map as soon as possible and kindly ask for your patience.

To Graph

1,080 Topics available

To Map

977 Locations available

693,932 People

Showing results for the 693,932 people selected by your search filters.

Page 1 of 27,758
(Columns inferred from the dashboard view below: publications, co-authors, citations, and year of most recent activity.)

Name                           Location  Publications  Co-authors  Citations  Year
Naji, M.                                 2             13          3          2025
Motta, Antonella                         8             52          159        2025
Aletan, Dirar                            1             1           0          2025
Mohamed, Tarek                           1             7           2          2025
Ertürk, Emre                             2             3           0          2025
Taccardi, Nicola                         9             81          75         2025
Kononenko, Denys                         1             8           2          2025
Petrov, R. H.                  Madrid    46            125         1k         2025
Alshaaer, Mazen                Brussels  17            31          172        2025
Bih, L.                                  15            44          145        2025
Casati, R.                               31            86          661        2025
Muller, Hermance                         1             11          0          2025
Kočí, Jan                      Prague    28            34          209        2025
Šuljagić, Marija                         10            33          43         2025
Kalteremidou, Kalliopi-Artemi  Brussels  14            22          158        2025
Azam, Siraj                              1             3           2          2025
Ospanova, Alyiya                         1             6           0          2025
Blanpain, Bart                           568           653         13k        2025
Ali, M. A.                               7             75          187        2025
Popa, V.                                 5             12          45         2025
Rančić, M.                               2             13          0          2025
Ollier, Nadège                           28            75          239        2025
Azevedo, Nuno Monteiro                   4             8           25         2025
Landes, Michael                          1             9           2          2025
Rignanese, Gian-Marco                    15            98          805        2025

Banerjee, Pragyan

  • Google
  • Publications: 1
  • Co-authors: 6
  • Citations: 2

Cooperation score: 37%

Topics

Publications (1/1 displayed)

  • 2023: Image inpainting in acoustic microscopy (2 citations)

Places of action

Chart of shared publications

  • Prasad, Dilip K.: 1 of 1 shared
  • Habib, Anowarul: 1 of 10 shared
  • Yadav, Nitin: 1 of 1 shared
  • Agarwal, Krishna: 1 of 1 shared
  • Melandsø, Frank: 1 of 6 shared
  • Mishra, Sibasish: 1 of 1 shared

Chart of publication period

  • 2023
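The "X / Y shared" figures give, for each co-author, how many of their own publications are shared with the selected researcher. Materials Map does not document how it aggregates these numbers into its cooperation score, so the following is only a hedged sketch of working with the chart data as listed (names and counts taken from the chart above):

```python
# Sketch only: the actual cooperation-score formula used by Materials Map
# is not documented. This computes, per co-author, the fraction of their
# publications that are shared with the selected researcher.
shared = {
    "Prasad, Dilip K.": (1, 1),   # (shared, total) as shown in the chart
    "Habib, Anowarul": (1, 10),
    "Yadav, Nitin": (1, 1),
    "Agarwal, Krishna": (1, 1),
    "Melandsø, Frank": (1, 6),
    "Mishra, Sibasish": (1, 1),
}

fractions = {name: s / t for name, (s, t) in shared.items()}

# Rank co-authors by how strongly their output overlaps with the
# selected researcher (highest overlap first).
ranking = sorted(fractions, key=fractions.get, reverse=True)
```

Under this reading, co-authors whose entire output is shared rank first, while Habib, Anowarul (1 of 10 shared) ranks last.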

Co-Authors (by relevance)

  • Prasad, Dilip K.
  • Habib, Anowarul
  • Yadav, Nitin
  • Agarwal, Krishna
  • Melandsø, Frank
  • Mishra, Sibasish

Article

Image inpainting in acoustic microscopy

  • Prasad, Dilip K.
  • Habib, Anowarul
  • Yadav, Nitin
  • Banerjee, Pragyan
  • Agarwal, Krishna
  • Melandsø, Frank
  • Mishra, Sibasish
Abstract

Scanning acoustic microscopy (SAM) is a non-ionizing and label-free imaging modality used to visualize the surface and internal structures of industrial objects and biological specimens. The image of the sample under investigation is created using high-frequency acoustic waves. The frequency of the excitation signals, the signal-to-noise ratio, and the pixel size all play a role in acoustic image resolution. We propose a deep-learning-enabled image inpainting method for acoustic microscopy in this paper. The method is based on training various generative adversarial networks (GANs) to inpaint holes in the original image and generate a 4× image from it. Five different GAN models are used in this approach: AOTGAN, DeepFillv2, Edge-Connect, DMFN, and Hypergraphs image inpainting. The trained models' performance is assessed by calculating the peak signal-to-noise ratio (PSNR) and the structural similarity index measure (SSIM) between network-predicted and ground-truth images. The Hypergraphs image inpainting model provided an average SSIM of 0.93 for 2× and up to 0.93 for the final 4×, and a PSNR of 32.33 for 2× and up to 32.20 for the final 4×. The developed SAM and GAN frameworks can be used in a variety of industrial applications, including bio-imaging.
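The abstract evaluates inpainting quality with PSNR and SSIM between network-predicted and ground-truth images. As a minimal sketch of what those two metrics compute (this is not the paper's code, and the SSIM below is a simplified single-window variant that omits the sliding window of the full metric):

```python
import numpy as np

def psnr(gt: np.ndarray, pred: np.ndarray, data_range: float = 255.0) -> float:
    """Peak signal-to-noise ratio (dB) between ground-truth and predicted images."""
    mse = np.mean((gt.astype(np.float64) - pred.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(data_range ** 2 / mse)

def ssim_global(gt: np.ndarray, pred: np.ndarray, data_range: float = 255.0) -> float:
    """Simplified SSIM over the whole image (no sliding window)."""
    x = gt.astype(np.float64)
    y = pred.astype(np.float64)
    c1 = (0.01 * data_range) ** 2  # stabilizing constants from the SSIM definition
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

For identical images PSNR is infinite and SSIM is 1.0; published evaluations like the one in the abstract normally use the windowed SSIM (e.g. `skimage.metrics.structural_similarity`) rather than this global simplification.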

Topics
  • impedance spectroscopy
  • surface
  • scanning auger microscopy