ZHAO Tianyu 趙天雨
E-mail: zhaoty.ting[at]gmail.com
Updated on 2026/Jan/23
I am a research engineer working on language models at Sakana AI.
Before joining Sakana AI, I was a researcher at rinna, where I developed multiple Japanese LLMs.
Work Experience
Research Engineer @ Sakana AI
2024.05 - present
| Language Models
Researcher @ rinna Co., Ltd.
2020.10 - 2024.04
| Alignment, Dialogue, and LLMs
Research SDE Intern @ Microsoft Development, rinna Team
2019.10 - 2019.12
| Pre-trained models for Japanese dialogues
Education
Ph.D. @ Kyoto University
2017.10 - 2020.09
| Intelligence Science and Technology
| Supervisor: Tatsuya Kawahara
M.Eng. @ Kyoto University
2015.10 - 2017.09
| Intelligence Science and Technology
| Supervisor: Tatsuya Kawahara
B.Sc. @ Peking University
2011.09 - 2015.07
| Computer Science and Technology
| Supervisor: Yunfang Wu
Selected Publications
| TyZ and Llion Jones
| paper code (coming soon) arxiv
RePo: language models with context Re-Positioning
| Huayang Li, TyZ, and Richard Sproat
| paper code blog arxiv
Reinforcement learning teachers of test time scaling
| Edoardo Cetin, TyZ, and Yujin Tang
| paper code blog NeurIPS 2025
Sudoku-Bench: Evaluating creative reasoning with Sudoku variants
| Jeffrey Seely, Yuki Imajuku, TyZ, Edoardo Cetin, and Llion Jones
| paper code blog dataset arxiv
Large language models to diffusion finetuning
| Edoardo Cetin, TyZ, and Yujin Tang
| paper code ICML 2025
An evolved universal transformer memory
| Edoardo Cetin, Qi Sun, TyZ, and Yujin Tang
| paper code blog ICLR 2025
Release of pre-trained models for the Japanese language
| Kei Sawada, TyZ, Makoto Shing, Kentaro Mitsui, Akio Kaga, Yukiya Hono, Toshiaki Wakatsuki, and Koh Mitsuda
| paper LREC-COLING 2024
Talks
Nekomata: State-of-the-Art Japanese LLM based on Qwen
2024/01/30
| AI Forward: Alibaba Cloud AI & Big Data Summit 2024 @ Singapore
| Link
The State of the Art in Japanese LLMs (日本語LLMの最先端)
2023/06/29
| Weights and Biases Tokyo Meetup #5 @ Tokyo
| Link