Posts by Collection

portfolio

publications

  • Mass Enhanced Node Embeddings for Drug Repurposing
  • Michail Chatzianastasis, Giannis Nikolentzos, Michalis Vazirgiannis
    Published: ICML 2022 Workshop on Computational Biology and 12th EETN Conference on Artificial Intelligence (SETN 2022)

    We propose a node embedding algorithm for the problem of drug repurposing. The proposed algorithm learns node representations that capture the influence of nodes in the biological network by learning a mass term for each node along with its embedding. Read more
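
A minimal sketch of the mass-plus-embedding idea above, assuming a gravity-style score in which a learnable per-node mass weights the embedding distance; the class name, dimensions, and scoring form are illustrative assumptions, not the published model:

```python
# Toy sketch (assumption, not the published model): each node gets an
# embedding z_i and a learnable scalar mass m_i; the score of a candidate
# drug-disease pair (i, j) grows with the target's mass and shrinks with
# the embedding distance.
import torch
import torch.nn as nn

class MassNodeEmbedding(nn.Module):
    def __init__(self, num_nodes: int, dim: int = 64):
        super().__init__()
        self.emb = nn.Embedding(num_nodes, dim)   # node position z_i
        self.mass = nn.Embedding(num_nodes, 1)    # node "mass" m_i

    def score(self, src: torch.Tensor, dst: torch.Tensor) -> torch.Tensor:
        z_i, z_j = self.emb(src), self.emb(dst)
        m_j = self.mass(dst).squeeze(-1)
        dist = torch.linalg.norm(z_i - z_j, dim=-1)
        # gravity-style score: influential (high-mass) targets score higher,
        # distant ones lower
        return m_j - torch.log(dist ** 2 + 1e-8)

model = MassNodeEmbedding(num_nodes=1000)
scores = model.score(torch.tensor([0, 1]), torch.tensor([2, 3]))
```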

  • Weisfeiler and Leman go Hyperbolic: Learning Distance Preserving Node Representations
  • Giannis Nikolentzos, Michail Chatzianastasis, Michalis Vazirgiannis
    Published: AISTATS 2023

    In this paper, we define a distance function between nodes which is based on the hierarchy produced by the WL algorithm, and propose a model that learns representations which preserve those distances between nodes. Since the emerging hierarchy corresponds to a tree, to learn these representations, we capitalize on recent advances in the field of hyperbolic neural networks. Read more
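
As a rough, illustrative reading of the distance described above (an assumption, not the paper's exact construction), one can run WL colour refinement and let the distance between two nodes grow with how early their colours diverge; the resulting metric is tree-like, which is why hyperbolic space is a natural target for the learned representations:

```python
# Toy sketch (assumption): WL colour refinement on an adjacency list, with a
# tree-like distance that counts how early two nodes' colours diverge.
def wl_colors(adj, iterations=3):
    """Return a list of colour dicts, one per WL refinement round."""
    colors = {v: 0 for v in adj}              # round 0: all nodes identical
    history = [dict(colors)]
    for _ in range(iterations):
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        relabel = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: relabel[signatures[v]] for v in adj}
        history.append(dict(colors))
    return history

def wl_tree_distance(history, u, v):
    """Distance grows with how early u and v receive different WL colours."""
    depth = len(history)
    for t, colors in enumerate(history):
        if colors[u] != colors[v]:
            return 2 * (depth - t)            # colours diverged at round t
    return 0                                  # never distinguished by WL

adj = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}   # path graph 1-0-2-3
hist = wl_colors(adj)
print(wl_tree_distance(hist, 0, 1))           # nodes 0 and 1 diverge in round 1
```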

  • Graph Ordering Attention Networks
  • Michail Chatzianastasis, Johannes Lutzeyer, George Dasoulas, Michalis Vazirgiannis
    Published: AAAI 2023

    We introduce the Graph Ordering Attention (GOAT) layer, a novel GNN component that learns local node orderings via an attention mechanism and processes the ordered representations using a recurrent neural network aggregator. Read more
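
A minimal sketch of the mechanism described above, assuming a single node with its neighbour embeddings already gathered: attention scores induce an ordering and an LSTM consumes the neighbours in that order, so the aggregation is sensitive to the learned ordering (simplified, not the exact GOAT layer):

```python
# Simplified sketch (assumption, not the published GOAT layer): order a node's
# neighbours by attention score, then aggregate them with an LSTM.
import torch
import torch.nn as nn

class OrderedNeighborAggregator(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.attn = nn.Linear(2 * dim, 1)          # scores (node, neighbour) pairs
        self.rnn = nn.LSTM(dim, dim, batch_first=True)

    def forward(self, h_v: torch.Tensor, h_neigh: torch.Tensor) -> torch.Tensor:
        # h_v: (dim,), h_neigh: (num_neighbours, dim)
        pairs = torch.cat([h_v.expand_as(h_neigh), h_neigh], dim=-1)
        scores = self.attn(pairs).squeeze(-1)      # (num_neighbours,)
        order = torch.argsort(scores, descending=True)
        ordered = h_neigh[order].unsqueeze(0)      # (1, num_neighbours, dim)
        _, (h_last, _) = self.rnn(ordered)         # summary of the ordered sequence
        return h_last.squeeze(0).squeeze(0)        # (dim,)

layer = OrderedNeighborAggregator(dim=32)
out = layer(torch.randn(32), torch.randn(5, 32))
```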

  • Prot2Text: Multimodal Protein’s Function Generation with GNNs and Transformers
  • Hadi Abdine, Michail Chatzianastasis, Costas Bouyioukos, Michalis Vazirgiannis
    Published: AAAI 2024, Spotlight at DGM4H NeurIPS 2023 and AI4Science NeurIPS 2023

    We propose Prot2Text, which predicts a protein's function in free text, moving beyond conventional binary or categorical classifications. By combining Graph Neural Networks (GNNs) and Large Language Models (LLMs) in an encoder-decoder framework, our model effectively integrates diverse data types, including protein sequences, structures, and textual annotations. Read more
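
A minimal skeleton of the encoder-decoder pattern described above, assuming a hand-rolled mean-over-neighbours graph encoder and a vanilla Transformer decoder standing in for the pretrained protein and language models that Prot2Text actually combines; all names and dimensions are illustrative:

```python
# Minimal skeleton (assumption, not the released Prot2Text architecture):
# a graph encoder produces residue-level states that a Transformer decoder
# cross-attends to while generating description tokens.
import torch
import torch.nn as nn

class GraphToTextSketch(nn.Module):
    def __init__(self, node_feat_dim: int, vocab_size: int, d_model: int = 256):
        super().__init__()
        self.node_proj = nn.Linear(node_feat_dim, d_model)     # graph "encoder"
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=4)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, node_feats: torch.Tensor, adj: torch.Tensor,
                tokens: torch.Tensor) -> torch.Tensor:
        # one round of mean-over-neighbours message passing (stand-in for a GNN)
        h = self.node_proj(node_feats)
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        memory = (adj @ h) / deg                               # (B, N, d_model)
        tgt = self.tok_emb(tokens)                             # (B, T, d_model)
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        out = self.decoder(tgt, memory, tgt_mask=mask)         # cross-attends to graph
        return self.lm_head(out)                               # (B, T, vocab)

model = GraphToTextSketch(node_feat_dim=21, vocab_size=1000)
logits = model(torch.randn(2, 50, 21), torch.ones(2, 50, 50),
               torch.randint(0, 1000, (2, 12)))
```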

talks

teaching