Website:
http://www.ms.k.u-tokyo.ac.jp/sugi/index-jp.html
Papers:
“Do We Need Zero Training Loss After Achieving Zero Training Error?” (2020) Proceedings of the International Conference on Machine Learning (ICML)
“Mitigating Overfitting in Supervised Classification from Two Unlabeled Datasets: A Consistent Risk Correction Approach” (2020) Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS)
“Positive-Unlabeled Learning with Non-Negative Risk Estimator” (2017) Advances in Neural Information Processing Systems (NIPS)
“Density-Ratio Matching under the Bregman Divergence: A Unified Framework of Density-Ratio Estimation” (2012) Annals of the Institute of Statistical Mathematics
“Dimensionality Reduction of Multimodal Labeled Data by Local Fisher Discriminant Analysis” (2007) Journal of Machine Learning Research
Media:
“The Past and Future of Artificial Intelligence Research” (Toshin Times)
“[Social Implementation and Applications of AI, Part 3] Programs That Keep Evolving: Professor Masashi Sugiyama” (Todai Shimbun ONLINE)