Researcher Directory System

NAGAI Isamu
Professor
Last Updated: 2025/03/25

Researcher Profile and Settings

Name

  • Name

    NAGAI Isamu

Affiliations

Affiliation (Master)

  • Professor

Education, Etc.

Education

  • Apr. 2009 - Mar. 2012, Hiroshima University, Graduate School of Science, Department of Mathematics, Japan
  • Apr. 2007 - Mar. 2009, Hiroshima University, Graduate School of Science, Department of Mathematics, Japan

Degree

  • Mar. 2012
  • Mar. 2009

Other Basic Information

Association Memberships

  • Japanese Society of Computational Statistics
  • Japanese Society of Applied Statistics
  • Japan Statistical Society

Research Activities

Research Areas, Etc.

Research Areas

  • Statistics

Research Interests

  • Penalized Estimation Method
  • Longitudinal data
  • Multivariate Model

Books, Papers, Etc.

Published Papers

  • Ridge parameter optimization using a modified Cp statistic in multivariate generalized ridge regression for the GMANOVA model
    1651, 1660, 2023, refereed, Joint Work
  • Implications of the usage of three-mode principal component analysis with a fixed polynomial basis
    Monden, R., Nagai, I. & Yanagihara, H.
    Smart Innov. Syst. Tec., 214, 224, 2023, refereed, Joint Work
  • Estimation algorithms for MLE of three-mode GMANOVA model with Kronecker product covariance matrix
    Horikawa, K., Nagai, I., Monden, R. & Yanagihara, H.
    Smart Innov. Syst. Tec., 203, 213, 2023, refereed, Joint Work
  • Modified Cp criterion in widely applicable models
    Yanagihara, H., Nagai, I., Fukui, K. & Hijikawa, Y.
    Smart Innov. Syst. Tec., 173, 182, 2023, refereed, Joint Work
  • Information Criterion-Based Nonhierarchical Clustering
    I. Nagai, K. Takahashi, and H. Yanagihara
    International Journal of Knowledge Engineering and Soft Data Paradigms, 6, 1, 43, 2017, refereed, Joint Work, In the analysis of actual data, it is important to determine whether there are clusters in the data. This can be done using one of several methods of cluster analysis, which can be roughly divided into hierarchical and nonhierarchical clustering methods. Nonhierarchical clustering can be applied to more types of data than can hierarchical clustering (see e.g., Saito and Yadohisa, 2006), and hence, in this paper, we focus on nonhierarchical clustering. In nonhierarchical clustering, the results heavily depend on the number of clusters, and thus it is very important to select the appropriate number of clusters. Bozdogan (1986) and Manning et al. (2009, Section 16.4.1) used formal information criteria, e.g., Akaike's information criterion (AIC), for selecting the number of clusters. In this paper, we verify through numerical experiments that such formal information criteria work poorly for selecting the number of clusters. Hence, we extend a formal AIC by adding a new penalty term, and search for an additional penalty with an acceptable selection performance through numerical experiments. (A simplified code sketch of selecting the number of clusters with an AIC-type criterion appears after this publication list.)
  • Optimization of Penalty Parameter in Penalized Nonlinear Canonical Correlation Analysis by using Cross-Validation
    I. Nagai
    Journal of Mathematics and Statistics, 11, 99, 106, 2015, refereed, Single Work, Canonical correlation analysis (CCA) is a way to find a linear relationship between a pair of random vectors. However, CCA cannot find a nonlinear relationship between them, since the method maximizes the correlation between linear combinations of the vectors. In order to find the nonlinear relationship, we convert the vectors through known conversion functions such as a kernel function, and then find the nonlinear relationship in the original vectors through the conversion function. However, this method has a critical issue in that the maximized correlation sometimes becomes 1 even if there is no relationship between the random vectors. Some authors have proposed a penalized method with a penalty parameter that avoids this issue when kernel functions are used for the conversion. In this method, however, no methods have been proposed for optimizing the penalty and the other hyperparameters in the conversion function, even though the results heavily depend on these parameters. In this study, we propose an optimization method for the penalty and other parameters, based on the simple cross-validation method. (A simplified cross-validation sketch for the penalty parameter appears after this publication list.)
  • Choosing the Number of Repetitions in the Multiple Plug-in Optimization Method for the Ridge Parameters in Multivariate Generalized Ridge Regression
    I. Nagai, K. Fukui, and H. Yanagihara
    Bulletin of Informatics and Cybernetics, 45, 25, 35, 2013, refereed, Joint Work, Multivariate generalized ridge (MGR) regression was proposed by Yanagihara, Nagai, and Satoh (2009) in order to avoid the multicollinearity problem in multivariate linear regression models. The MGR estimator is defined by using multiple nonnegative ridge parameters in an ordinary least-squares (LS) estimator. In order to optimize these ridge parameters, Yanagihara, Nagai, and Satoh (2009) and Nagai, Yanagihara, and Satoh (2012) proposed several optimization methods. We focus on the plug-in optimization method, which estimates the optimal ridge parameters that minimize the predicted mean squared error. The plug-in optimization method is a repeating method that uses the current estimates of the ridge parameters as input in order to obtain improved estimates. In the present paper, we propose two criteria for choosing the number of repetitions. We conducted several numerical studies using the proposed criteria to compare the LS and MGR estimators with ridge parameters optimized by ordinary plug-in optimization methods and by the multiple plug-in optimization method. (A simplified sketch of the iterated plug-in update appears after this publication list.)
  • Principal Components Regression by using Generalized Principal Components Analysis
    M. Fujiwara, T. Minamidani, I. Nagai, and H. Wakaki
    Journal of the Japan Statistical Society, 43, 57, 78, 2013, refereed, Joint Work, Principal components analysis (PCA) is one method for reducing the dimension of the explanatory variables, although the principal components are derived by using all the explanatory variables. Several authors have proposed a modified PCA (MPCA), which is based on using only selected explanatory variables in order to obtain the principal components (see e.g., Jolliffe (1972, 1986), Robert and Escoufier (1976), Tanaka and Mori (1997)). However, MPCA uses all of the selected explanatory variables to obtain the principal components. There may, therefore, be extra variables for some of the principal components. Hence, in the present paper, we propose a generalized PCA (GPCA) by extending the partitioning of the explanatory variables. In this paper, we estimate the unknown vector in the linear regression model based on the result of a GPCA. We also propose some improvements in the method to reduce the computational cost. (A sketch of ordinary principal components regression, the baseline for this approach, appears after this publication list.)
  • Selection of Model Selection Criteria for Multivariate Ridge Regression
    I. Nagai
    Hiroshima Mathematical Journal, 43, 73, 106, 2013, refereed, Single Work, In the present study, we consider the selection of model selection criteria for multivariate ridge regression. There are several model selection criteria for selecting the ridge parameter in multivariate ridge regression, e.g., the Cp criterion and the modified Cp (MCp) criterion. We propose the generalized Cp (GCp) criterion, which includes the Cp and MCp criteria as special cases. The GCp criterion is specified by a non-negative parameter, which is referred to as the penalty parameter. We attempt to select an optimal penalty parameter such that the predicted mean square error (PMSE) of the predictor of ridge regression after optimizing the ridge parameter is minimized. Through numerical experiments, we verify that the proposed optimization methods exhibit better performance than conventional optimization methods, i.e., optimizing only the ridge parameter by minimizing the Cp or MCp criterion. (A sketch of Cp-based ridge-parameter selection appears after this publication list.)
  • Modified Cp Criterion for Optimizing Ridge and Smooth Parameters in the MGR Estimator for the Nonparametric GMANOVA model
    I. Nagai
    Open Journal of Statistics, 1, 1, 14, 2011, refereed, Single Work, Longitudinal trends of observations can be estimated using the generalized multivariate analysis of variance (GMANOVA) model proposed by [10]. In the present paper, we consider estimating the trends nonparametrically using known basis functions. Then, as in nonparametric regression, an overfitting problem occurs. [13] showed that the GMANOVA model is equivalent to the varying coefficient model with non-longitudinal covariates. Hence, as in the case of the ordinary linear regression model, when the number of covariates becomes large, the estimator of the varying coefficients becomes unstable. In the present paper, we avoid the overfitting problem and the instability problem by applying the concepts behind penalized smoothing spline regression and multivariate generalized ridge regression. In addition, we propose two criteria to optimize the hyperparameters, namely, a smoothing parameter and the ridge parameters. Finally, we compare the ordinary least-squares estimator and the new estimator.
  • A bias-corrected Cp criterion for optimizing ridge parameters in multivariate generalized ridge regression
    H. Yanagihara, I. Nagai, and K. Satoh
    Japanese Journal of Applied Statistics, 38, 151, 172, 2009, refereed, Joint Work, In ridge regression for a univariate linear regression model, it is common for an optimal ridge parameter to be determined by minimizing an information criterion, e.g., Mallows' Cp criterion (Mallows (1973, 1995)). Since the solution to the minimization problem of the information criterion is not expressed in closed form, an additional computational task is required. On the other hand, the generalized ridge regression proposed by Hoerl and Kennard (1970) has multiple ridge parameters, but the optimal ridge parameters are obtained in closed form. In this paper, we extend generalized ridge regression to the multivariate linear regression case. Then, a Cp criterion for optimizing the ridge parameters in multivariate generalized ridge regression is considered as an estimator of a risk function based on the mean square error of prediction. By completely correcting the bias of the Cp criterion, a bias-corrected Cp criterion, named the modified Cp (MCp) criterion, is proposed. It is analytically proved that the proposed MCp has not only smaller bias but also smaller variance than the existing Cp criterion and is the uniformly minimum variance unbiased estimator of the risk function. We show that the criterion has useful properties by means of numerical experiments. (The Cp sketch after this publication list illustrates the univariate, single-ridge-parameter version of this idea.)
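
The sketches below are illustrative additions only: they show, in simplified Python, the general shape of a few of the ideas described in the abstracts above. All data, function names, and parameter choices are assumptions made for illustration; none of them reproduce the criteria or estimators actually derived in the papers.

The first sketch relates to the nonhierarchical clustering paper. It scores k-means solutions over a range of cluster numbers with an AIC-type value based on a spherical-Gaussian approximation, plus an extra penalty whose weight is a purely hypothetical stand-in for the additional penalty searched for in that paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def aic_like_criterion(X, k, extra_penalty_weight=0.0, random_state=0):
    """AIC-type score for a k-means solution with k clusters.

    The log-likelihood uses a spherical-Gaussian approximation (up to an
    additive constant), and extra_penalty_weight is only a hypothetical
    stand-in for the additional penalty term studied in the paper.
    """
    n, d = X.shape
    km = KMeans(n_clusters=k, n_init=10, random_state=random_state).fit(X)
    wss = km.inertia_                              # within-cluster sum of squares
    loglik = -0.5 * n * d * np.log(wss / (n * d))  # up to an additive constant
    n_params = k * d + 1                           # cluster means + a common variance
    aic = -2.0 * loglik + 2.0 * n_params
    return aic + extra_penalty_weight * k * np.log(n)

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=m, size=(60, 2)) for m in (0.0, 4.0, 8.0)])
scores = {k: aic_like_criterion(X, k, extra_penalty_weight=2.0) for k in range(1, 7)}
print(min(scores, key=scores.get))  # selected number of clusters
```

With extra_penalty_weight set to zero this reduces to a formal AIC of the kind the paper reports as selecting the number of clusters poorly; a positive weight discourages spurious extra clusters.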
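
The next sketch relates to the penalized nonlinear CCA paper. For brevity it uses a ridge-regularized linear CCA rather than the kernelized version treated in the paper, and it chooses the penalty parameter by K-fold cross-validation, scoring each candidate by the absolute canonical correlation achieved on held-out data. The function names and the centering convention are assumptions.

```python
import numpy as np

def _inv_sqrt(S):
    """Inverse symmetric square root via an eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def regularized_cca(X, Y, lam):
    """First pair of canonical weight vectors of a ridge-regularized linear CCA.

    X and Y are assumed to be column-centered; lam > 0 is the penalty parameter.
    """
    n = X.shape[0]
    Sxx = X.T @ X / n + lam * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + lam * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    Sxx_is, Syy_is = _inv_sqrt(Sxx), _inv_sqrt(Syy)
    U, _, Vt = np.linalg.svd(Sxx_is @ Sxy @ Syy_is)
    return Sxx_is @ U[:, 0], Syy_is @ Vt[0, :]

def cv_select_lambda(X, Y, lambdas, n_folds=5):
    """Pick the penalty that maximizes the average held-out canonical correlation."""
    n = X.shape[0]
    folds = np.array_split(np.random.default_rng(0).permutation(n), n_folds)
    scores = []
    for lam in lambdas:
        corrs = []
        for test_idx in folds:
            train_idx = np.setdiff1d(np.arange(n), test_idx)
            a, b = regularized_cca(X[train_idx], Y[train_idx], lam)
            u, v = X[test_idx] @ a, Y[test_idx] @ b
            corrs.append(abs(np.corrcoef(u, v)[0, 1]))  # held-out canonical correlation
        scores.append(np.mean(corrs))
    return lambdas[int(np.argmax(scores))]

rng = np.random.default_rng(2)
Z = rng.normal(size=(100, 1))                       # shared latent signal
X = np.hstack([Z, rng.normal(size=(100, 3))]); X -= X.mean(axis=0)
Y = np.hstack([Z + 0.1 * rng.normal(size=(100, 1)), rng.normal(size=(100, 3))]); Y -= Y.mean(axis=0)
print(cv_select_lambda(X, Y, [1e-3, 1e-2, 1e-1, 1.0]))
```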
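
The following sketch relates to the paper on choosing the number of repetitions in the multiple plug-in optimization method. It shows a single-response generalized ridge estimator whose component-wise ridge parameters are set by a Hoerl-Kennard-type plug-in value and then re-plugged-in repeatedly. The stopping rule here is a simple tolerance, not either of the criteria proposed in the paper, and the multivariate (MGR) case is not handled.

```python
import numpy as np

def iterated_plugin_gr(X, y, max_iter=20, tol=1e-8):
    """Generalized ridge estimator with iterated Hoerl-Kennard-type
    plug-in ridge parameters (univariate-response sketch only)."""
    n, p = X.shape
    U, d, Vt = np.linalg.svd(X, full_matrices=False)
    z = U.T @ y                         # response projected onto the left singular vectors
    alpha = z / d                       # canonical least-squares coefficients
    s2 = np.sum((y - X @ (Vt.T @ alpha)) ** 2) / (n - p)   # residual variance estimate

    for _ in range(max_iter):
        k = s2 / alpha ** 2             # plug-in ridge parameters, one per component
        alpha_new = d * z / (d ** 2 + k)
        if np.max(np.abs(alpha_new - alpha)) < tol:
            alpha = alpha_new
            break
        alpha = alpha_new
    return Vt.T @ alpha                 # back to the original coordinates

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 6))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=40)      # near-collinear columns
y = X @ np.array([2.0, 0.0, 1.0, 0.0, 0.0, -1.0]) + rng.normal(size=40)
print(iterated_plugin_gr(X, y))
```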
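
The next sketch relates to the generalized principal components regression paper. It shows only ordinary principal components regression, the baseline that the paper generalizes; the GPCA variant builds each component from its own subset of the explanatory variables, which is not attempted here.

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Ordinary principal components regression: regress y on the leading
    principal components of the column-centered explanatory variables."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:n_components].T             # loadings of the leading components
    Z = Xc @ W                          # component scores
    gamma, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
    beta = W @ gamma                    # coefficients on the original (centered) scale
    intercept = y.mean() - X.mean(axis=0) @ beta
    return beta, intercept
```

The number of retained components plays a role similar to the variable partition in GPCA; in practice it would itself be chosen by a model selection criterion or by cross-validation.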
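
The last sketch in this group relates to the GCp paper and the bias-corrected Cp paper above. It evaluates a plain Mallows-type Cp value on a grid of ridge parameters for a single-response ridge regression; the multivariate criteria, the exact bias correction, and the penalty-parameter selection studied in those papers are not reproduced.

```python
import numpy as np

def ridge_cp_path(X, y, lambdas):
    """Mallows-type Cp value for each candidate ridge parameter
    (single-response, single-ridge-parameter illustration only)."""
    n, p = X.shape
    beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.sum((y - X @ beta_ls) ** 2) / (n - p)   # error variance from the LS fit

    results = []
    for lam in lambdas:
        H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)  # ridge hat matrix
        rss = np.sum((y - H @ y) ** 2)
        df = np.trace(H)                                # effective degrees of freedom
        results.append((lam, rss / sigma2 - n + 2.0 * df))
    return results

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = X @ np.array([1.0, 0.5, 0.0, 0.0, -1.0]) + rng.normal(size=50)
best_lam, best_cp = min(ridge_cp_path(X, y, np.logspace(-3, 3, 25)), key=lambda t: t[1])
print(best_lam, best_cp)
```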

Conference Activities & Talks

  • Estimation methods for Three-mode GMANOVA Model with Unobserved Design Matrices
    Monden Rei
    IASC-ARS Interim 2024, Joint Work
  • Coordinate Descent Algorithm of the Group Lasso for Selecting Between-Individual Explanatory Variables in the Three-Mode GMANOVA Model
    Joint Work, refereed
  • Implications of the Usage of Three-mode Principal Component Analysis with a Fixed Polynomial Basis
    Rei Monden, Isamu Nagai, Hirokazu Yanagihara
    KES IDT 2023, Joint Work, refereed
  • Estimation Algorithms for MLE of Three-mode GMANOVA Model with Kronecker Product Covariance Matrix
    Keito Horikawa, Isamu Nagai, Rei Monden, Hirokazu Yanagihara
    KES IDT 2023, Joint Work, refereed
  • Modified Cp Criterion in Widely Applicable Models
    Hirokazu Yanagihara, Isamu Nagai, Keisuke Fukui, Yuta Hijikawa
    KES IDT 2023, Joint Work, refereed
  • Shrinkage estimator for the precision matrix and its optimization in closed form
    Isamu Nagai
    Data Science, Statistics and Visualization 2019, Single Work
  • A Bias-Corrected Cp criterion for optimizing ridge parameters in Multivariate Generalized Ridge Regression for GMANOVA model
    I. Nagai
    Hiroshima Statistics Study Group, 2010, Single Work
  • Optimizations of Ridge Parameters for Multivariate Generalized Ridge Regression
    I. Nagai
    Hiroshima Statistics Study Group, 2008, Single Work, Radiation Effects Research Foundation, Generalized ridge (GR) regression for a univariate linear model was proposed by Hoerl and Kennard (1970). In this talk, we consider GR regression for a multivariate linear model, which we call multivariate GR (MGR) regression. Many authors have given noniterative GR estimators by using optimal ridge parameters. By extending these optimizations of the ridge parameters to the multiple-response case, we derive noniterative MGR estimators. In particular, by assuming multivariate normality of the response variables, an unbiased Mallows' Cp criterion for selecting the ridge parameters in MGR regression is proposed. We compare the derived MGR estimators by conducting numerical studies. (A univariate sketch of the noniterative estimator follows this list.)
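
As a rough illustration of the noniterative estimators mentioned in the talk above, the sketch below computes a single-response generalized ridge estimator with the classical Hoerl-Kennard closed-form plug-in ridge parameters. The multivariate (MGR) extension and the unbiased Cp criterion proposed in the talk are not reproduced; the function name and data handling are assumptions for illustration.

```python
import numpy as np

def hoerl_kennard_gr(X, y):
    """Noniterative generalized ridge estimator with the classical
    Hoerl-Kennard plug-in ridge parameters, one per principal component."""
    n, p = X.shape
    U, d, Vt = np.linalg.svd(X, full_matrices=False)
    z = U.T @ y
    alpha_ls = z / d                                # canonical least-squares coefficients
    s2 = np.sum((y - X @ (Vt.T @ alpha_ls)) ** 2) / (n - p)
    k = s2 / alpha_ls ** 2                          # closed-form plug-in ridge parameters
    alpha_gr = d * z / (d ** 2 + k)                 # shrunken canonical coefficients
    return Vt.T @ alpha_gr, k
```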

Other Research Activities

Others

  • Development of analysis methods for high-dimensional three-mode data
  • Application and visualization of high-dimensional longitudinal data
  • 2019

