    Muhammed Bilgin

    Research Scientist

    Machine Learning and Computational Science
I am a member of the Istanbul University computational sciences group (ciml.group).

My interests: NLP, neural networks, GNNs, and tensors.

    • Istanbul, Turkey
    • Email
    • Github
    • Google Scholar
    • ORCID

    21-09-2022 Daily Reading Log

    Updated: September 21, 2022

Because of my work life, I could not find time to read, write, and produce for a long while. During this period, besides coping with the tiredness of work, I took some time to listen to myself; that is why I decided to make my first blog post in a long time about carving out time to create and produce. Below I share the content I read and took notes on during this period, without date information. I did not have the chance to read much, but I can say that what I did read contributed a lot to my knowledge.

    • Einops: Clear and Reliable Tensor Manipulations with Einstein-like Notation
    • Pretrained Transformers as Universal Computation Engines
    • Branch Specialization
    • The Unreasonable Effectiveness of Recurrent Neural Networks
    • Hybrid Modeling
    • Maximum Likelihood Estimation
    • Academic freedom, academic integrity, and ethical review in NLP
    • REAL ML: Recognizing, Exploring, and Articulating Limitations of Machine Learning Research
    • Maximum likelihood estimation for the regression parameters
    • Organize and Document your Machine Learning (or any) Research Project with Notion
    • Estimating Custom Maximum Likelihood Models in Python (and Matlab)
    • Maximum Likelihood Estimation
    • Generalized Language Models
    • Accelerating NLP through research awards and more
    • Reducing Toxicity in Language Models
    • XOR-TyDi: Cross-lingual Open-Retrieval Question Answering
    © 2023 Muhammed Bilgin. Powered by Jekyll & Minimal Mistakes.